From: Jim Capp
See "The Preparation of Programs for an
Electronic Digital Computer",
by Maurice V. Wilkes, David J. Wheeler, and Stanley Gill
Blast! I looked in my copy (ex the Caltech CS Dept Library :-),
but didn't find 'word' in the index!
Looking a little further, Turing's ACE Report, from 1946, uses the term
(section 4, pg. 25; "minor cycle, or word"). My copy, the one edited by
Carpenter and Doran, has a note #1 by them, "Turing seems to be the first
user of 'word' with this meaning." I have Brian's email; I can ask him
how they came to that determination, if you'd like.
There aren't many things older than that! I looked quickly through the
"First Draft of a Report on the EDVAC", 1945 (reprinted in "From ENIAC
to UNIVAC", by Stern), but did not see 'word' there. It does use the
term "minor cycle", though.
Other places worth checking are the IBM/Harvard Mark I, the ENIAC and ...
I guess there's not much else! Oh, there was a relay machine at Bell, too.
The Atanasoff-Berry computer?
From: "John P. Linderman"
He claims that if you wanted to do decimal arithmetic on a binary machine,
you'd want to have 10 digits of accuracy to capture the 10-digit log tables
that were then popular.
The EDVAC draft talks about needing 8 decimal digits (Appendix A, pg. 190);
apparently von Neumann knew that that's how many digits one needed for
reasonable accuracy in differential equations. That is 27 "binary digits"
(apparently 'bit' hadn't been coined yet).
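For what it's worth, the arithmetic checks out: 8 decimal digits take
ceil(8 * log2(10)) = ceil(26.58) = 27 binary digits, since 2^27 =
134,217,728 just exceeds 10^8. A quick sanity check in Python (my own
sketch, obviously not anything from the draft):

    import math

    # binary digits needed to distinguish 10^8 values: ceil(log2(10^8))
    bits = math.ceil(8 * math.log2(10))
    print(bits)           # 27
    print(2**27, 10**8)   # 134217728 100000000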
Noel