On Wed, Mar 08, 2023 at 09:23:56AM -0500, Dan Cross wrote:
> By the early 90s this was understood to mean a single-user machine in
> a desktop or deskside form factor with a graphics display, and a more
> advanced operating system than something you'd get on a consumer-grade
> machine. But the term probably predated that. Generally, workstations
> were machines marketed towards science/engineering/technology
> applications, and so intended for a person doing "work", as opposed to
> home computing or large scale business data-processing.

It's perhaps interesting to look at the history of A/UX. In 1988,
Apple released A/UX, a version of Unix based on SVR2. It was massively
criticized for being command-line only (no X Windows or any other kind
of GUI).

A year later, they came out with a version with X Windows, which made
it roughly comparable to a low-end Sun workstation, at a price of $9k.
In addition to the 3 M's (1 MIPS, 1 megabyte of memory, and a
1-megapixel graphics display), it also adhered to the "4th M", costing
roughly $10k, or a "Megapenny". :-)

It seems to me that in the late 80's / early 90's, it was pretty clear
what people expected if they wanted something more than a "toy"
(err..., sorry, a "home") computer. And Apple wanted to get into that
game and play, too. Of course, they later decided this wasn't a place
where they could really compete, especially since in 1990 Sun released
a low-end workstation for $5k (the SPARCstation SLC).

And by 1992, you could get a very credible Linux home machine with X
Windows for about $2k. It's kind of amazing how quickly a personal
workstation became quite affordable, even for a graduate student or a
new college grad (like me!), paying out of their own personal checking
account.

- Ted