On Sep 16, 2021, at 4:54 PM, David Arnold <davida@pobox.com> wrote:
And it’s not just those applications: to have your new OS be useful, you need to support
a dozen languages, a hundred protocols, thousands of libraries … a vast range of stuff
that would take years, perhaps decades, to port over or reinvent in your new paradigm.
The idea that you’d turn your back on the accumulated value of 50 years of countless
people’s work because your set of system calls is slightly better than the one you’ve got
now … that’s a very, very big call.
So I think the notion that “the kids” are less willing to understand, or to drill deep,
is doing them a disservice. They do understand, and they (mostly) make the choice to
leverage that body of work rather than embark on the futility of starting afresh.
I have mixed feelings about this. Unix didn't "throw away"
the mainframe world of computing. It simply created a new
ecosystem, better suited to the microprocessor age. For IBM it
was perhaps the classic Innovator's Dilemma. Similarly now we
have (mostly) the Linux ecosystem, while the actual hardware
has diverged a lot from the C memory model. There are
security issues. There is firmware running on these systems
about which the OS knows nothing. We have processors like
Esperanto Tech's chip with 1088 64-bit RISC-V cores, each with
its own vector/tensor unit, 160MB of on-chip SRAM, and 23.8B
transistors, but software can take only limited advantage of
it. We have super performant GPUs, but programming them is
vendor-dependent and a pain. If someone can see a clear path
through all this, and create a new software system, they will
simply generate a new ecosystem and not worry about 50 years'
worth of work.