Matt,
I'll take a small stab at this. Like most of us, I don't know the exact
reason, but having lived through the time, I'll point out a few things.
1.) At the start (60s and 70s), I suspect economics was the high-order bit
that drove light pixels on dark backgrounds.
2.) Xerox PARC's development of the Alto was driven by its research into the
Electronic Office -- remember, Xerox made its money selling copying
>>paper<<. Black-on-white was a specific choice by the researchers as
they tried to convince their management of the idea.
3.) High-resolution monitors were costly until the late 1980s (regardless
of B&W or color).
4.) Early phosphor tubes suffered from burn-in, so turning on
display "pixels" for long times was bad. That said, TV content was constantly
changing, so it was less of an issue there, but not for terminals, where
the same dots lit the same part of the screen over and over.
5.) "Glass Terminal" designed until the later 1970s were SSI/MSI TTL, with
few if any VLSI except for maybe the WD1402A
<https://streaklinks.com/BjB7EOaVlbWFCqrlggbSUziX/https%3A%2F%2Fspectrum.ieee.org%2Fchip-hall-of-fame-western-digital-wd1402a-uart>
UART
<https://streaklinks.com/BjB7EOa5ed7SuU43lgrt-Chq/https%3A%2F%2Fspectrum.ieee.org%2Fchip-hall-of-fame-western-digital-wd1402a-uart>
6.) Memory cost per bit was still high compared to today. Remember, in
1980, when the CMU "SPICE" proposal came out for the infamous 3M system, we
priced the 1MByte of memory (alone) that it needed (using
Tektronix's volume pricing) at > $3K [BTW: this was the same year that Jake
Grimes stood on a table at the Asilomar Microprocessor Workshop and declared
memory to be "free" -- and compared to just a few years previous, it was].
Using those points as a place to start, I observe a few things. If you look
at the early "glass ttys" like the DEC VT05, and even later the Lear Siegler
ADM 3A, there is nary a microprocessor inside. It's a huge board with lots of
TTL [the ADM 3A often came as a kit - you had to solder it yourself]. The
other thing to remember is that, in those days, the NTSC (US) and PAL (Europe)
TV standards were the primary drivers for CRTs. So if you were making a
display, you had to at least buy the tube from one of a small number of tube
manufacturers [IIRC Philips in the EU was the leader, and GE, RCA, and
Raytheon fought it out in the US -- Sony would come later] (I'm also not
sure Magnavox made its own tubes).
For instance, I believe DEC bought the tube for the VT05 from Raytheon, who
made them locally (Lowell, MA, maybe?) and continued to for a while [maybe
even through the VT-100].
So remember, for a 25x80 terminal, that's 2KBytes of memory just for the
video [without "attributes"]. So that's also a big cost. IIRC, the VT05 and
ADM 3A used early Intel 1103 1Kx1 DRAMs, so the eight memory chips were the
highest-cost part of the logic board.
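To put rough numbers on that, here is my back-of-the-envelope arithmetic
(assuming a full byte per character cell; real designs that stored fewer
lines or only 7-bit codes needed fewer chips):

    # Rough sizing of a glass-TTY character buffer (illustrative numbers only).
    rows, cols = 25, 80
    cells = rows * cols                     # 2000 character cells on screen
    buffer_bytes = cells                    # one byte per cell, no attributes -> ~2 KB
    # With 1K x 1 parts like the Intel 1103, chip count scales with the bits stored.
    chips_1k_x_1 = (cells * 8 + 1023) // 1024
    print(buffer_bytes, chips_1k_x_1)       # 2000 bytes, 16 chips for a full 25x80 case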
Because of the design, I suspect the turn-on-the-beam logic for a 'dot
time' was all the designers cared about. Light-on-dark fell out of the
ease of design, and they had limited BW on the tubes. Even with that, I
believe the VT05 was in the $3-5K range in the late 1960s when it was sold
for the PDP-8 or the like. I remember in the late 1970s when the $1K
glass TTY (the cost of the ADM 3A kit) and the Perkin-Elmer "Fox" terminals
appeared.
So between tubes and logic, it took at least ten years to drive the price
down by a factor of 3-5.
My friend and former cubicle mate at Tektronix, Roger Bates, designed the
display in the Alto [side tidbit - he has the patent on the loadable
cursor, which was initially a martini glass, not an hourglass, to show wait
time]. Roger told me the monitor they used was a "special order" and was
fairly expensive. But it was a definite choice to do black on white --
they wanted to represent paper. FWIW: a great deal of the display logic
was done in microcode [the infamous BITBLT being an example] because they
were already logic-constrained. He and Thacker were using huge boards for
the processor, and it was all SSI/MSI.
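For anyone who hasn't run into it, here is a toy sketch of what a BITBLT
amounts to (my illustration in Python, not the Alto's code -- the real thing
was microcode working over 16-bit words of the frame buffer):

    # Move a w x h rectangle of 1-bit pixels from src to dst, combining with an operator.
    def bitblt(dst, src, dx, dy, sx, sy, w, h, op="copy"):
        for row in range(h):
            for col in range(w):
                s = src[sy + row][sx + col]
                d = dst[dy + row][dx + col]
                if op == "copy":
                    d = s
                elif op == "or":          # paint
                    d = d | s
                elif op == "xor":         # invert under a mask, e.g. a blinking cursor
                    d = d ^ s
                dst[dy + row][dx + col] = d

    # Usage: XOR a 2x2 "cursor" into a small 1-bpp bitmap, then XOR again to erase it.
    screen = [[0] * 8 for _ in range(4)]
    cursor = [[1, 1], [1, 1]]
    bitblt(screen, cursor, 3, 1, 0, 0, 2, 2, op="xor")   # draw
    bitblt(screen, cursor, 3, 1, 0, 0, 2, 2, op="xor")   # erase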
*I think it's safe to suggest that Xerox was where the idea/first use of
dark on light began.*
FWIW, in 1979/80, when he and I were working on Magnolia at Tektronix,
Roger had to get the tube from the Sony/Tektronix folks -- it was a special
order. Tek itself did not make one with high enough BW. Roger had
just finished designing the 3D frame buffer for Teklabs and had used a
Sony/Tek Trinitron color tube in that system - which I remember was one of
the most expensive parts of the FB. Roger used its BW cousin for Magnolia,
which was cheaper, but the tube and hard disk were the two most expensive
parts in Magnolia.
Roll the clock forward only 2-5 years. When Apollo, Masscomp, and later
Sun started to make workstations, there tended to be three types of displays
-- a low-resolution B&W, a 'paper white' high-resolution, and eventually a
color tube.
Also in the late 70s, Motorola created the 6845 video chip, which, along
with a micro such as a 6502/6800/Z80, became the de facto standard for most
terminals. That, plus eight 2102 SRAM chips, and you had a simple (white on
dark) display that worked with low-end tubes.
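Very roughly, what those parts did in hardware looks like this (a toy Python
sketch of mine -- the glyph data and screen size are made up, and a real 6845
only generates addresses and sync while a character-generator ROM and shift
register produce the dots):

    # The CRTC walks the refresh RAM row by row; for each raster scan line, the
    # character code indexes a glyph ROM, and that row of dots goes out to the beam.
    FONT = {   # hypothetical 5-dot-wide, 7-scan-line glyphs
        'H': [0b10001, 0b10001, 0b10001, 0b11111, 0b10001, 0b10001, 0b10001],
        'I': [0b11111, 0b00100, 0b00100, 0b00100, 0b00100, 0b00100, 0b11111],
        ' ': [0b00000] * 7,
    }

    def scan_frame(refresh_ram, cols, rows, scanlines=7):
        for text_row in range(rows):
            for scanline in range(scanlines):     # each text row spans several raster lines
                dots = []
                for col in range(cols):
                    ch = refresh_ram[text_row * cols + col]
                    dots.append(format(FONT.get(ch, FONT[' '])[scanline], '05b'))
                yield ''.join(dots).replace('0', ' ').replace('1', '#')

    for line in scan_frame(list("HI "), cols=3, rows=1):
        print(line)   # "lights up" the character dots, white on a dark background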
Also, the displays were still pretty expensive when IBM released the first
VGA with its PS/2 line. It took the VGA market taking off to start driving
the cost of monitors down. But anything over 12-15 inches was still pretty
expensive, and you needed VRAM to drive it, *etc.*
My point is that black-on-white did not take off with hockey-stick-style
growth until after the "workstations." FWIW: the original 1980s Mac
display was small and not extremely high resolution compared to what we
would quickly come to expect. So while people liked the Xerox idea of
black on white, it was not yet economical.
I personally did not get to start using the 'paper' paradigm until the time
of the Sun-3 and the like (~1985/6). As an engineer, I also remember getting
the default display resolution - we had more program memory, *etc.* - but
the tech writers would get a high-end black-and-white display because they
were working with text [*i.e.*, FrameMaker pages] for documents.
It was in the mid-1990s that having a solid color display with high
resolution became the default. But the cost of the silicon to drive it had
to come down, and the market for high-end displays needed to appear.
BTW, what happened? LCDs came out -- and they use silicon manufacturing
techniques. So once the process was perfected, the ability to make a high-BW
display quickly overtook the analog tube schemes.
As for the current light-on-dark, I wonder if this is just a new set of
engineers making their mark. I'm sure it's better. The cost is the same,
so now it's just marketing and a way to show off by being different - *e.g.*,
new/cool.
On Thu, Jun 15, 2023 at 4:56 PM segaloco via COFF <coff(a)tuhs.org> wrote:
Good afternoon everyone. I've been thinking about the color/contrast
landscape of computing today and have a bit of a nebulous quandary that I
wonder if anyone would have some insight on.
So terminals, they started as typewriters with extra steps, a white piece
of paper on a reel being stamped with dark ink to provide feedback from the
machine. When video terminals hit the market, the display was a black
screen with white, orange, green, or whatever other color of phosphor they
bothered to smear on the surface of the tube. Presumably this display style
was chosen as on a CRT, you're only lighting phosphor where there is
actually an image, unlike the LCD screens of today. So there was a complete
contrast shift from dark letters on white paper to light letters on an
otherwise unlit pane of glass.
Step forward to graphical systems and windows on the Alto? Light
background with dark text.
Windows on the Macintosh? Light background with dark text.
Windows on MS Windows? Light backgrounds with dark text.
Default HTML rendering in browsers? Light backgrounds with dark text.
Fast forward to today, and it seems that dark themes are all the rage,
light characters on an otherwise dark background. This would've made so
much sense during the CRT era as every part of the screen representing a
black pixel is getting no drawing, but when CRTs were king, the predominant
visual style was dark on light, like a piece of paper, rather than light on
dark, like a video terminal. Now in the day and age of LCDs, where every
pixel is on regardless, now we're finally flipping the script and putting
light characters on dark backgrounds, long after any hardware benefit (that
I'm aware of) would be attained by minimizing the amount of "lit surface"
on the screen.
Anyone know if this has all been coincidental or if the decision for
graphical user interfaces and such to predominantly use white/light colors
for backgrounds was a relatively intentional measure around the industry?
Or is it really just that that's how Xerox's system looked and it was all
domino effect after that? At the end of the day I'm really just finding
myself puzzling why computing jumped into the minimalism seen on terminal
screens, keeping from driving CRTs super hard but then when GUIs first
started appearing, they didn't just organically align with what was the
most efficient for a CRT. I recognize this is based largely in subjective
views of how something should look too, so not really expecting a "Person
XYZ authoritatively decided on <date> that GUI elements shall
overwhelmingly only be dark on light", just some thoughts on how we got
going down this path with color schemes in computing. Thanks all!
- Matt G.