I believe that the PDP-11 ISA was defined at a time when DEC was still using
random logic rather than a control store (which came pretty soon
thereafter). Given a random-logic design it's efficient to organize the ISA
encoding to maximize its regularity. It was probably also of some benefit to
compilers in a memory-constrained environment.
I'm not sure at what point in time we can say "lots of processors" had moved
to a control-store-based implementation. Certainly the IBM System/360 was
there in the mid-1960s; HP was there by the late 1960s.
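To make the "regular encoding" point below concrete, here is a minimal sketch
in C (mine, not anything from DEC or from this thread) that pulls apart a
16-bit PDP-11 double-operand instruction word. The field layout is the
standard double-operand format (4-bit opcode, then 3-bit mode/register pairs
for source and destination); the decode() helper, its output format, and the
sample instruction words are purely illustrative.

    /* Sketch: split a PDP-11 double-operand word into its octal fields. */
    #include <stdio.h>

    static void decode(unsigned short insn)
    {
        unsigned op       = (insn >> 12) & 017; /* opcode class (top 4 bits) */
        unsigned src_mode = (insn >> 9)  & 07;  /* source addressing mode    */
        unsigned src_reg  = (insn >> 6)  & 07;  /* source register           */
        unsigned dst_mode = (insn >> 3)  & 07;  /* destination mode          */
        unsigned dst_reg  =  insn        & 07;  /* destination register      */

        printf("%06o: op=%02o src=(mode %o, R%o) dst=(mode %o, R%o)\n",
               (unsigned)insn, op, src_mode, src_reg, dst_mode, dst_reg);
    }

    int main(void)
    {
        decode(060102);  /* ADD R1,R2: "06" add, "01" = R1, "02" = R2 */
        decode(010203);  /* MOV R2,R3: "01" mov, "02" = R2, "03" = R3 */
        return 0;
    }

Because the fields fall on 3-bit boundaries, each one lands on its own octal
digit (or digit pair) when the word is printed in octal, which is exactly why
060102 reads off by eye as "add R1 to R2".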
-----Original Message-----
From: TUHS <tuhs-bounces(a)minnie.tuhs.org> On Behalf Of Larry McVoy
Sent: Monday, November 29, 2021 10:18 PM
To: Clem Cole <clemc(a)ccc.com>
Cc: TUHS main list <tuhs(a)minnie.tuhs.org>; Eugene Miya <eugene(a)soe.ucsc.edu>
Subject: Re: [TUHS] A New History of Modern Computing - my thoughts
On Sun, Nov 28, 2021 at 05:12:44PM -0800, Larry McVoy wrote:
I remember Ken Witte (my TA for the PDP-11 class) trying to get me to see how
easy it was to read the octal. If I remember correctly (and I probably don't,
this was ~40 years ago), the instructions were divided into fields
(instruction, operand, operand) and it was all regular, so you could see that
this was some form of an add or whatever, that it got the values from these
registers and put the result in that register.
I've looked it up and it is pretty much as Ken described. The weird thing is
that there is no need to do it the way the PDP-11 did it: you could use
random numbers for each instruction, and lots of processors did pretty much
that. The PDP-11 didn't; it was so uniform that Ken's ability to read octal
made perfect sense. I was never that good, but with a little googling and
reading I can see how he got there.
...
--lm