On Mon, Jan 16, 2017 at 9:23 PM, Doug McIlroy <doug(a)cs.dartmouth.edu> wrote:
Octal was reinforced by a decade of 6-bit bytes.
Perhaps the real question is why did IBM break so completely to hex
for the 360?
I may be able to help a little here. A few years ago I used to work with
Russ Robelen, who was the Chief Designer of the 360/50. Russ regaled us
with a number of stories from those times, and having met a few of the
personalities involved in them, I tend to believe Russ's stories, as I have
heard others of similar color.
The first important thing about the 360 was that it was supposed to be the
first ASCII machine from IBM. It's funny how history would prove it
otherwise, but IBM had invested heavily and originally planned on going
ASCII. And the key is that ASCII was originally a 7-bit character set, so
being able to store a 7-bit character was an important design idea for the
architecture.
According to Russ, Amdahl came up with some [IMO hokey] schemes (similar to
what CDC would do) that mapped into 6 bits. I understand that he even
proposed a 7-bit byte. He felt that 8 bits was wasteful of the hardware.
Russ says that Brooks would toss him out of his office and say something
on the order of "don't come back until you have a power of 2" - that he
(Brooks) did not know how to program things in multiples of 3, and things
made of 7s were even worse.