A question about 36 bit machines....
In some of the historical accounts I've read, it seems that before the
PDP-11 a pitch was made for a PDP-10 to support the then-nascent Unix
efforts. This was shot down by labs management, and sometime later the
PDP-11 arrived; within a decade or so the question of byte width was
effectively settled for general-purpose machines.
The question then is twofold: why a PDP-10 in the early 70s (instead of,
say, a 360 or something) and why later the aversion to word-oriented
machines? The PDP-7 was of course word oriented.
I imagine the answers have to do with cost/performance for the former;
with regard to the latter, a) the question was largely settled by the
middle of the decade, and b) by then Unix had evolved so that a port was
considered rather different from a rewrite. But I'd love to hear from
some of the players involved.
- Dan C.
On Jan 18, 2017 10:06 AM, "Steve Johnson" <scj(a)yaccman.com> wrote:
When we were considering what machine to port PDP-11 Unix to, there were
several 36-bit machines around and some folks were lobbying for them.
Dennis' comment was quite characteristically succinct: "I'll consider it
if they throw in a 10-track tape drive...". Just thinking about Unix (and
C!) on a machine where the byte size does not evenly divide the word size
is pretty painful...
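A minimal sketch of that pain, assuming 9-bit bytes packed four to a
36-bit word (the uint64_t carrier and the get_byte helper below are
hypothetical, purely for illustration): 8-bit bytes don't tile a 36-bit
word at all (36/8 = 4.5), and even the 9-bit case needs explicit shifting
and masking where an 8-bit-byte machine would just address the byte.

    #include <stdint.h>

    #define WORD_BITS 36
    #define BYTE_BITS 9    /* 36/9 == 4 bytes per word; 8 would not divide */

    /* Extract byte n (0 = leftmost) of a 36-bit word carried in a uint64_t. */
    static unsigned get_byte(uint64_t word, int n)
    {
        int shift = WORD_BITS - (n + 1) * BYTE_BITS;
        return (unsigned)((word >> shift) & ((1u << BYTE_BITS) - 1));
    }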
(Historical note: before networking, magnetic tapes were essential for
backups and moving large quantities of data. Data was stored in rows of
magnetic dots running across the tape, each row typically holding a
character plus a parity bit. Thus, there were 7-track drives for 6-bit
machines, and 9-track drives for 8-bit machines. But nothing for 9-bit
machines...)
----- Original Message -----
From: "jnc(a)mercury.lcs.mit.edu (Noel Chiappa)"
To: <tuhs(a)minnie.tuhs.org>
Cc: <jnc(a)mercury.lcs.mit.edu>
Sent: Tue, 17 Jan 2017 21:33:58 -0500 (EST)
Subject: Re: [TUHS] PDP-11, Unix, octal?
From: Doug McIlroy
Perhaps the real question is why did IBM break so completely to hex for
the 360?
Probably because the 360 had 8-bit bytes?
Unless there's something like the PDP-11 instruction format which makes
octal optimal, octal is a pain when working with 8-bit bytes; any time
you're looking at the higher bytes in a word, unless you are working
through software which will 'interpret' the bytes for you, it's a PITA.
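A small worked example of that (the 16-bit value 0xAB12 is arbitrary,
chosen only to make the point): in hex the two bytes read off directly as
AB and 12, while octal's 3-bit digit grouping straddles the byte
boundary, so the high byte (0xAB = 0253) is nowhere visible in the octal
rendering.

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        uint16_t word = 0xAB12;          /* high byte 0xAB, low byte 0x12 */

        printf("hex:   %04X\n", word);   /* prints AB12: bytes readable   */
        printf("octal: %06o\n", word);   /* prints 125422: bytes obscured */
        return 0;
    }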
The 360 instruction coding doesn't really benefit from octal (well,
instructions are in 4 classes, based on the high two bits of the first
byte, but past that, hex works better); opcodes are 8 or 16 bits, and
register numbers are 4 bits.
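To illustrate with one hedged example (the opcode value is from memory,
so treat it as an assumption): an RR-format instruction such as AR 3,4
is an 8-bit opcode followed by two 4-bit register fields, so it
assembles to 0x1A34 and both register numbers appear as single hex
digits, while the octal rendering smears them across digit boundaries.

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* Sketch of a System/360 RR-format instruction: 8-bit opcode,
           then two 4-bit register fields (AR assumed to be opcode 0x1A). */
        uint8_t  opcode = 0x1A;
        unsigned r1 = 3, r2 = 4;
        uint16_t insn = (uint16_t)((opcode << 8) | (r1 << 4) | r2);

        printf("hex:   %04X\n", insn);   /* prints 1A34: r1, r2 readable  */
        printf("octal: %06o\n", insn);   /* prints 015064: fields smeared */
        return 0;
    }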
As to why the 360 had 8-bit bytes, according to "IBM's 360 and Early 370
Systems" (Pugh, Johnson, and Palmer, pp. 148-149), there was a big fight
over whether to use 6 or 8, and they finally went with 8 because i)
statistics showed that more customer data was numbers, rather than text,
and storing decimal numbers in 6-bit bytes was inefficient (BCD does two
digits per 8-bit byte), and ii) they were looking forward to handling
text with upper- and lower-case.
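The efficiency argument in i) comes down to this (a minimal
illustration; the packing below ignores the 360's sign handling and is
not meant as its exact packed-decimal format): two 4-bit BCD digits fit
in one 8-bit byte, whereas a 6-bit byte holds a single digit and wastes
two bits.

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* Pack the digits 1 2 3 4 two per byte: 0x12, 0x34. */
        int digits[] = { 1, 2, 3, 4 };
        uint8_t packed[2];

        for (int i = 0; i < 2; i++)
            packed[i] = (uint8_t)((digits[2 * i] << 4) | digits[2 * i + 1]);

        printf("%02X %02X\n", packed[0], packed[1]);   /* prints 12 34 */
        return 0;
    }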
Noel