On Tue, Jun 7, 2022 at 10:30 AM Theodore Ts'o <tytso@mit.edu> wrote:
> On Tue, Jun 07, 2022 at 09:28:14AM +1000, David Arnold wrote:
> > Lest it be thought that all is sweetness and light in Linux-land,
> > there were years of fairly intense competition involved in getting
> > installers to the point that you can start with a downloaded image,
> > burn it to a USB, boot it, run it, and (optionally) make it persist
> > over a reboot, all with very minimal need to understand or care
> > about the many, many things going on under the hood.
> On Sun, Jun 05, 2022 at 09:40:44PM -0400, Dan Cross wrote:
> > But every distribution has its own installer, and they vary wildly.
>
> The key, I think, is *competition*. Distributions were competing to
> attract a user base, and one of the ways they could do that was by
> improving the install experience. There were people who reviewed
> distributions based on which one had the better installer, and that
> helped users who were Windows refugees choose between them.
My point is that this is something that varies from distro to distro;
it is therefore inaccurate to claim that "Linux solved it", since many
distros with widely varying installation processes fall under the very
large "Linux" umbrella.
> The other advantage of having many distributions is that it gave more
> people the opportunity to exercise leadership --- you can "drive the
> big red firetruck" by founding a distro like Debian or Slackware, and
> the people who are interested in improving a distribution can be
> different from those who drive kernel development. This is one of the
> things that I've learned from my rector at my church, who had a
> background in community organizing. One of the big differences
> between community organizing and the corporate world is that giving
> more people --- volunteers --- opportunities to contribute very often
> matters far more than efficiently organizing a corporate-style team
> to get some job done. Was it inefficient to have multiple teams
> competing on installer development and release engineering? Sure,
> but it also drew more people into the Linux ecosystem.
That's an interesting angle and one that I think bears more on the topic
at hand than many folks are willing to let on: the barrier to contribution is,
in a lot of important ways, lower in the Linux ecosystem than it is in the
BSD world. At least historically speaking, and perhaps still true. Anecdotally,
I was able to get a patch into the KVM unit tests (not precisely Linux but
related) in pretty short order recently, while the OpenBSD people
simply ignored my problem report and patch. YMMV.
> > The ABI compatibility thing breaks down, too. A colleague was
> > trying to get the host side of a Saleae logic analyzer working on
> > Arch, and it doesn't. They more or less require Ubuntu
> > 18.something, and that's not what he runs. As far as most end
> > users are concerned, your distribution of choice is "Linux", and
> > distributions vary in all kinds of ways.
> There are three different things that are worth separating. One is a
> consistent kernel<->user space interface; this is what Linus Torvalds
> considers high priority when he says, "Thou shalt not break
> userspace". This is what allows pretty much all distributions to
> replace the kernel that was shipped with the distribution with the
> latest upstream kernel. And this is something that in general doesn't
> work with *BSD systems.
Eh? I feel like I can pretty easily upgrade the kernel on the various
BSDs without binaries breaking. Then again, there _have_ been times
when there were flag days that required rebuilding the world; but
surely externalities are more common here (e.g., switching from one
ISA to another).
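
To put a concrete face on what that kernel<->userspace contract means,
here is a minimal sketch (mine, not Ted's; it assumes Linux and the
syscall(2) wrapper from libc): a program that asks the kernel for its
PID by raw syscall number. Because Linux treats syscall numbers and
semantics as a stable contract, the same compiled binary keeps working
after you swap the distro kernel for the latest upstream one.

    /* raw_getpid.c: invoke getpid(2) by syscall number instead of
     * through the usual libc wrapper.  The number and semantics of
     * the call are part of the stable Linux kernel<->userspace ABI,
     * which is what "thou shalt not break userspace" protects. */
    #define _GNU_SOURCE
    #include <stdio.h>
    #include <sys/syscall.h>
    #include <unistd.h>

    int main(void) {
        long pid = syscall(SYS_getpid);
        printf("pid via raw syscall: %ld\n", pid);
        return 0;
    }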
> The second is application source-level compatibility, and this is
> what allows you to download some open source application, and
> recompile it on different Linux distributions, and it should Just
> Work. In practice this works for most Linux and *BSD users.
This, I think, is where things break down. Simply put, the way people
build applications has changed, and "source-level" compatibility means
compatibility with a bunch of third-party libraries; in many ways the
kernel interfaces (many of which are defined by externally imposed
standards anyway) matter much, much less. If a distro ships a too-old
or too-new version of a dependency, then the open source thing will
often not build, and for most end users that is a distinction without
a difference.
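
A concrete sketch of that kind of drift (OpenSSL is my illustrative
pick here, not something from the thread): when OpenSSL 1.1.0 made
EVP_MD_CTX opaque and renamed its constructor, sources that wanted to
build against whatever version a distro shipped grew version guards
like this:

    /* digest_ctx.c: cope with an API that changed across dependency
     * versions.  OpenSSL >= 1.1.0 uses EVP_MD_CTX_new(); earlier
     * releases spell it EVP_MD_CTX_create().  The guard keys off the
     * version macro in whichever headers the distro installed. */
    #include <openssl/evp.h>
    #include <openssl/opensslv.h>

    EVP_MD_CTX *make_digest_ctx(void) {
    #if OPENSSL_VERSION_NUMBER >= 0x10100000L
        return EVP_MD_CTX_new();      /* 1.1.0 and later */
    #else
        return EVP_MD_CTX_create();   /* pre-1.1.0 name */
    #endif
    }

Multiply that by every library an application pulls in, and "just
recompile it" stops being distribution-independent.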
> And the third is application *binary* level compatibility. And this
> is what is important if you have some binary that you've downloaded,
> or some commercial application which you've purchased, and you want
> to run it on a Linux distribution different from the one for which it
> was originally designed. Static linking solves most of the problems,
> and if the user needs to use proprietary/commercial binaries, they
> will generally not have issues so long as they stick to RHEL, Fedora,
> or Ubuntu/Debian.
Yup. But then the fact that you're running Linux is mostly immaterial;
it could be Windows and the same would be true.
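
The static-linking escape hatch is easy enough to see for yourself;
here's a minimal sketch (the build commands in the comment assume a
gcc/glibc-style toolchain with ldd available):

    /* hello.c: contrast the two linking modes.
     *
     *   cc hello.c -o hello          # dynamic (the default); ldd on
     *                                # the result lists libc.so and
     *                                # the ELF interpreter, which must
     *                                # exist, at compatible versions,
     *                                # on every target distro.
     *   cc -static hello.c -o hello  # static; ldd reports "not a
     *                                # dynamic executable", and the
     *                                # binary carries its own libc,
     *                                # leaning only on the kernel
     *                                # syscall ABI.
     */
    #include <stdio.h>

    int main(void) {
        printf("hello from a (maybe) portable binary\n");
        return 0;
    }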
- Dan C.