From: Doug McIlroy
Perhaps the real question is why did IBM break so completely to hex
for the 360?
Probably because the 360 had 8-bit bytes?
Unless there's something like the PDP-11 instruction format which makes
octal a natural fit, octal is a pain when working with 8-bit bytes: octal
digits are 3 bits wide, so they don't line up with byte boundaries, and
any time you're looking at the higher bytes in a word, unless you're
working through software which will 'interpret' the bytes for you, it's
a PITA.
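To make that concrete, here's a small C sketch (mine, not part of the
original exchange; the word value is just an arbitrary example). Hex
digits fall on 4-bit boundaries, so a byte is always exactly two hex
digits wherever it sits in the word; octal digits straddle the byte
boundaries, so the same byte value prints differently depending on its
position:

    #include <stdio.h>

    int main(void)
    {
        unsigned long word = 0xDEADBEEFUL;   /* arbitrary example value */

        /* Whole word: two hex digits per byte, but the octal digits
         * straddle the byte boundaries. */
        printf("word:   hex %08lx   octal %011lo\n", word, word);

        /* Individual bytes: in hex each byte reads straight out of the
         * word dump above; in octal you have to do arithmetic to see
         * that, e.g., the 0255 byte is in there. */
        for (int i = 3; i >= 0; i--) {
            unsigned int byte = (word >> (i * 8)) & 0xff;
            printf("byte %d: hex %02x         octal %03o\n", i, byte, byte);
        }
        return 0;
    }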
The 360 instruction coding doesn't really benefit from octal (well,
instructions fall into 4 classes, based on the high two bits of the first
byte, but past that, hex works better): opcodes are 8 or 16 bits, and
register numbers are 4 bits.
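For instance (my sketch, not from the original message): the RR
instruction AR 1,2 assembles to 0x1A12, and in hex every field falls on
a digit boundary (opcode 1A, R1 = 1, R2 = 2), while the octal form
015022 shows none of them:

    #include <stdio.h>

    int main(void)
    {
        unsigned int insn = 0x1A12;  /* AR 1,2: add register 2 into register 1 */

        unsigned int class  = (insn >> 14) & 0x3;  /* high 2 bits: 00 = RR   */
        unsigned int opcode = (insn >> 8) & 0xff;  /* first byte: 0x1A       */
        unsigned int r1     = (insn >> 4) & 0xf;   /* third hex digit: 1     */
        unsigned int r2     = insn & 0xf;          /* fourth hex digit: 2    */

        printf("class %u  opcode %02x  r1 %u  r2 %u  (octal %06o)\n",
               class, opcode, r1, r2, insn);
        return 0;
    }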
As to why the 360 had 8-bit bytes, according to "IBM's 360 and Early 370
Systems" (Pugh, Johnson, and Palmer, pp. 148-149), there was a big fight
over whether to use 6 or 8, and they finally went with 8 because i)
statistics showed that more customer data was numbers rather than text,
and storing decimal numbers in 6-bit bytes was inefficient (packed BCD
fits two 4-bit digits in an 8-bit byte, but only one in a 6-bit byte),
and ii) they were looking forward to handling text with upper- and
lower-case, which needs more than the 64 codes a 6-bit byte provides.
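To put numbers on the packing argument (a sketch of my own, not from the
book or the post): a BCD digit is 4 bits, so an 8-bit byte holds two
digits exactly, whereas a 6-bit byte holds one digit and wastes two bits:

    #include <stdio.h>

    int main(void)
    {
        unsigned int tens = 4, units = 2;            /* the number 42        */
        unsigned char packed = (tens << 4) | units;  /* one 8-bit byte: 0x42 */

        printf("42 packed into one 8-bit byte: 0x%02x\n", packed);
        printf("unpacked again: %u%u\n", (packed >> 4) & 0xf, packed & 0xf);
        /* In 6-bit bytes, the same two digits would need two bytes,
         * with 2 bits unused in each. */
        return 0;
    }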
Noel