On Sun, Jun 05, 2022 at 08:32:47PM -0400, Theodore Ts'o wrote:
> An interesting question is if CSRG had been actively pushing the state
> of the art forward, would that have provided sufficient centripetal
> force to keep the HP/UX, SunOS, DG/UX, etc., from splintering? After
> all, it's natural to want to get a competitive advantage over your
> competition by adding new features --- this is what I call the "Legacy
> Unix value-add disease". But if you can't keep up with the upstream
> developments, that provides a strong disincentive against making
> permanent forks. For that matter, why was it that successive new
> releases of AT&T System V weren't able to play a similar role? Was it
> because the rate of change was too slow? Was it because applications
> weren't compatible anyway due to ISA differences? I don't know....
I think I know but my opinion is very Sun centric. Sun was the innovator
of the time: NFS, RPC, a VM system that replaced the buffer cache so
everything was just pages -- read/write/map, all the same pages. Sun had
critical mass in talent and they innovated. Everyone else followed
Sun (and hated them for it).
AT&T, other than the original group that did Unix, didn't innovate much
in the Unix domain. SysV was sort of meh compared to SunOS. When Unix
was sort of a skunk works, tons of good stuff happened. When corporate
took it over, it became less interesting. There was some good stuff -- I
liked the Documenter's Workbench stuff, I liked learn(1) -- it wasn't all bad.
But it wasn't even close to the same pace of innovation that Sun had.
I'm super grateful to have been at Sun while that was happening; that's
a once-in-a-lifetime experience.
AT&T did fund Plan 9, if I remember history correctly, so that was
a step back towards the original Unix innovation team. I really wanted
plan 9 to take over but AT&T kept it away from open source too long and
Linux was too entrenched. Classic Betamax vs VHS, the less good answer
won. Sigh.
> So from a computer science point of view, one could argue that NetBSD
> was "better", and that Linux had a whole bunch of hacks, and some
> might even argue was written by a bunch of hacks. :-) However, from
> the user's perspective, who Just Wanted Their Laptop To Work, the fact
> that Linux had some kind of rough PCMCIA support first mattered a lot
> more than a "we will ship no code before its time" attitude. And
> some of those users would become developers, which would cause a
> positive feedback loop.
I think "it just works" is key. Linux was really competing with Windows
and Linus knew that. He couldn't have cared less about *BSD; they just
weren't on his radar screen. But Windows was. Very early versions
of Red Hat were good enough on the install that you could just tab your
way through it. I remember being impressed because I was installing a
machine that didn't have a mouse (yet) and I just tabbed my way through
it and it was fine.
Today's FreeBSD install process is like a trip back to 1980. It is
not pleasant. The Linux install process 15 years ago was a better
experience than the Windows install process. If you've done a Windows
install, you know the drill: get some Ethernet dongle that Windows
recognizes so you can connect to the internet, then do the driver
search and install, install, install all the drivers. Contrast that
with Linux, where 99% of the drivers are just in the kernel.
"It just works" for the win.