From: Doug McIlroy <doug@cs.dartmouth.edu>
Core memory wiped out competing technologies (Williams tube, mercury delay line, etc) almost instantly and ruled for over twenty years.
I never lived through that era, but reading about it, I'm not sure people now
can really fathom just how big a step forward core was - how expensive, bulky,
flaky, and low-capacity the prior main memory technologies were.
In other words, there's a reason they were all dropped like hot potatoes in
favour of core - which, looked at from our DRAM-era perspective, seems
quaintly dinosaurian. Individual pieces of hardware you can actually _see_
with the naked eye, for _each_ bit? But that should give some idea of how much
worse everything before it was, that it killed them all off so quickly!
There's simply no question that, without core, computers would not have
advanced (in use, societal importance, technical depth, etc) at the speed they
did. It was one of the most consequential steps in the
development of computers to what they are today: up there with transistors,
ICs, DRAM and microprocessors.
Yet late in his life Forrester told me that the Whirlwind-connected invention he was most proud of was marginal testing.
Given the above, I'm totally gobsmacked to hear that. Marginal testing was
important, yes, but not even remotely on the same level as core.
In trying to understand why he said that, I can only suppose that he felt that
core was 'the work of many hands', which it was (see e.g. "Memories That
Shaped an Industry", pg. 212, and the things referred to there), and so he
only deserved a share of the credit for it.
Is there any other explanation? Did he go into any depth as to _why_ he felt
that way?
Noel