I did send this already (it's from 1946); did you see it?
The famous 1946 paper, "Preliminary Discussion of the Logical Design of
an Electronic Computing Instrument", by Arthur Burks, Herman H. Goldstine,
and John von Neumann, contains this sentence. I have this paper in Computer
Structures: Readings and Examples, by Bell and Newell, but it's also
online in many forms.
4. The memory organ
4.1. Ideally one would desire an indefinitely large memory capacity such
that any particular aggregate of 40 binary digits, or *word* (cf. 2.3),
would be immediately available -- i.e. in a time which is somewhat or
considerably shorter than the operation time of a fast electronic
multiplier.
[word is in italics]
On 9/8/22 17:16, Noel Chiappa wrote:
From: Jim Capp
See "The Preparation of Programs for an
Electronic Digital Computer",
by Maurice V. Wilkes, David J. Wheeler, and Stanley Gill
Blast! I looked in the index of my copy (ex the Caltech CS Dept Library :-),
but didn't find 'word' there!
Looking a little further, Turing's ACE Report, from 1946, uses the term
(section 4, pg. 25; "minor cycle, or word"). My copy, the one edited by
Carpenter and Doran, has a note #1 by them, "Turing seems to be the first
user of 'word' with this meaning." I have Brian's email; I can ask him
how they came to that determination, if you'd like.
There aren't many things older than that! I looked quickly through the "First
Draft of a Report on the EDVAC", 1945 (re-printed in "From ENIAC to UNIVAC",
by Stern), but did not see 'word' there. It does use the term "minor cycle",
though.
Other places worth checking are the IBM/Harvard Mark I, the ENIAC, and ...
I guess there's not much else! Oh, there was a relay machine at Bell, too.
The Atanasoff-Berry computer?
From: "John P. Linderman"
He claims that if you wanted to do decimal arithmetic on a binary
machine, you'd want to have 10 digits of accuracy to capture the 10-digit
log tables that were then popular.
The EDVAC draft talks about needing 8 decimal digits (Appendix A, pg. 190);
apparently von Neumann knew that that's how many digits one needed for
reasonable accuracy in differential equations. That is 27 "binary digits"
(apparently 'bit' hadn't been coined yet).
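For what it's worth, a quick check of that conversion in Python:

    >>> import math
    >>> math.ceil(8 * math.log2(10))   # binary digits needed for 8 decimal digits
    27

i.e. 8 decimal digits need just under 26.6 binary digits, so 27 is the
smallest whole number that covers them.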
Noel