On 2020-May-17 16:08:26 -0400, Clem Cole <clemc(a)ccc.com> wrote:
> On Sun, May 17, 2020 at 12:38 PM Paul Winalski <paul.winalski(a)gmail.com> wrote:
> > Well, the function in question is called getchar().  And although
> > these days "byte" is synonymous with "8 bits", historically it meant
> > "the number of bits needed to store a single character".
8-bit bytes, 32/64-bit "words" and 2's complement arithmetic have been
"standard" for so long that I suspect there are a significant number of
computing professionals who have never considered that there is any
alternative.
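
A minimal sketch in standard C makes the point; nothing below is
specific to any one machine:

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        /* CHAR_BIT is guaranteed to be at least 8, not exactly 8. */
        printf("bits per byte on this system: %d\n", CHAR_BIT);

        /* getchar() returns int, not char: the return type must hold
           every possible byte value plus the out-of-band EOF marker,
           whatever the byte width. */
        int c;
        while ((c = getchar()) != EOF)
            putchar(c);
        return 0;
    }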
> Yep, I think that is the real crux of the issue.  If you grew up with
> systems that used a 5, 6, or even a 7-bit byte, you have an
> appreciation of the difference.
I've used a 36-bit system that supported 6- or 9-bit bytes. IBM Stretch
even supported programmable character sizes.
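
For flavour, here's a sketch of the sort of packing that implies, with
a 36-bit word emulated in the low bits of a uint64_t (C has no native
36-bit type).  DEC's SIXBIT code simply mapped ASCII 040-0137 down to
0-077, six characters per word:

    #include <stdint.h>
    #include <stdio.h>

    /* Pack six SIXBIT characters into one 36-bit "word". */
    static uint64_t pack_sixbit(const char *s)
    {
        uint64_t word = 0;
        for (int i = 0; i < 6; i++)
            word = (word << 6) | ((uint64_t)(s[i] - 040) & 077);
        return word;            /* six 6-bit codes = 36 bits exactly */
    }

    int main(void)
    {
        /* Twelve octal digits, two per 6-bit character. */
        printf("%012llo\n", (unsigned long long)pack_sixbit("HELLO!"));
        return 0;
    }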
> DEC was still sort of transitioning from word-oriented hardware (a
> lesson, Paul, you and I lived through being forgotten a few years
> later with Alpha);
The Alpha was byte addressed; it just didn't support byte operations on
memory (at least originally; byte and word loads and stores arrived
later with the BWX extension).  That's different to word-oriented
machines that only supported word addresses.  Supporting byte-wide
writes at arbitrary addresses adds a chunk of complexity to the
CPU/cache interface, and some early RISC designs likewise supported
only word-sized load/store operations.
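
To make that concrete, here's roughly what a single byte store turns
into when the hardware only offers aligned word loads and stores
(sketched with 32-bit words; the real Alpha sequence used LDQ_U,
INSBL, MSKBL and STQ_U on 64-bit quadwords, but the shape is the same):

    #include <stdint.h>

    /* Emulate a byte store using only aligned 32-bit word accesses,
       assuming little-endian byte order within the word.  Note that
       this is a read-modify-write and is not atomic: two CPUs updating
       adjacent bytes can lose one of the writes, which is one of the
       costs a machine pushes onto software by omitting byte stores. */
    static void store_byte(uint32_t *mem, uintptr_t byte_addr, uint8_t v)
    {
        uint32_t *word = &mem[byte_addr / 4];  /* containing aligned word */
        unsigned shift = (byte_addr % 4) * 8;  /* byte's offset in word */

        *word = (*word & ~(UINT32_C(0xFF) << shift))
              | ((uint32_t)v << shift);
    }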
--
Peter Jeremy