I would add one more important impact: making data a first class component of the computing environment.

With the notion of pipes it became possible to operate on data quickly and flexibly.  There was nothing new from a fundamental capability point of view, but the ease with which one could construct pipelines enabled rapid experimentation and encouraged the development of pipe-able components to add to the tool set.

I may not have articulated this as well as it deserves.

Marc


On Wed, Dec 4, 2024 at 3:43 AM <sjenkin@canb.auug.org.au> wrote:
I was looking at the question of “impact of Unix" and thought that:

        initiating (Portable) Open Source Software including the BSD TCP/IP & Berkeley sockets libraries,
        the Unix -> Minix -> Linux -> Android sequence
        and BSD -> NeXTSTEP -> OS X -> iOS sequence,
        plus running the Top 500 supercomputers and most of the Top 500 websites,
        including the infrastructure for trillion-dollar companies: Facebook, Amazon (whose AWS runs Netflix), and Google
        plus so many embedded Linux / NetBSD based appliances
        even going into space - on small experiments or driving SpaceX’s Falcon 9 & presumably Starship,

would be a slam-dunk for “really high impact”
        - dominating everywhere important, except Windows desktops.

Unix wasn’t just a ’research project’, it was the result of years of work by a team of very capable, professional programmers,
who weren’t looking for kudos or recognition, nor trying to add every conceivable ‘feature’ possible, but the inverse:

         how _small_ could it be and still be enough.

When it left the Labs, Unix wasn’t just “Performant”, it came with a very useful set of tools for creating other tools (‘developing’)
and while the kernel wasn’t perfect (some ‘panic’s), it was of “Production Quality”.

For v6, I believe there were patches for 50-100 bugs circulating, perhaps the first demonstration of “no bug is intractable with ‘many eyeballs’”.

All that in under 5 years of ‘development’, with the “initial release” stable & performant enough for cash-strapped Universities
to gamble their reputations & budgets on running it.
Imagine the grief academics would’ve got if their basic teaching systems failed continuously!

This adoption path pushed Unix through another barrier:

         ’Security’ - with a lot of bright, bored & curious students banging on it as hard as they could for bragging rights.

How long, in releases or years, did it take for other O/S’s to hit the “very stable” benchmark?

I don’t know enough of Linux history to answer that definitively; the *BSD’s grew there through usage and contribution,
while Microsoft’s NT derivatives suffered the “Blue Screen of Death” widely for years.

Even now, MS-Windows has serious Security / compromise issues, like the highly visible, global “CrowdStrike” event.
Not a break-in or takeover, but an own-goal: a faulty update to a security product took systems down worldwide.

==========

I now think Unix has had a much larger, direct and important impact:

        - the C language and associated tools & libraries
                 that begat modern toolchains and endless portability across platforms.

In 1991, Bill Plauger had a year sabbatical at UNSW in Sydney,
and happened to say:
        “C is wallpaper - people expect it everywhere”.

C gained formal recognition with the ANSI/ISO and POSIX standards, satisfying conservative users / enterprises that it wasn’t the work of a bunch of Hippies or ill-disciplined Hackers.

Even Microsoft wrote 32-bit Windows NT in C - I presume it started by writing its own compiler and toolchain.

Borland, Watcom and many others - including Microsoft - offered (Visual) C compile & build environments for Windows,
directly responsible for creating the ’shrink-wrap’ third party software market that drove sales of Windows and x86 machines.

Nobody had seen a market for a billion systems before, nor sold 300M+ CPU’s in a single year.

People don’t buy Silicon Chips or nice Boxes, they buy Applications that solve their problems:

        Software drives Sales of Hardware
         - something IBM deeply understood, first with the 1401 line, then the 360 series.


The other ’small’ achievement of C and Unix was creating the market for RISC chips.
MIPS in the mid-1980’s was only able to design and build the first commercial RISC chip
because it knew it could port Unix to it and find an immediate market
        - not at zero-cost, but a tiny fraction of what every other Vendor had done before
                reinventing the wheel from scratch to provide incompatible O/S & tools for their hardware.

Unix on MIPS not only came with a host of proven software that a large pool of people knew how to use,
but it arrived as “Production Quality” - the porting team had to test their own parts - compiler, linker, libraries - hard, but could trust the existing high-quality codebase.

In “A New Golden Age for Computer Architecture” (2019), Hennessy & Patterson make an aside:

        In today's post-PC era, x86 shipments have fallen almost 10% per year since the peak in 2011,
        while chips with RISC processors have skyrocketed to 20 billion.

        Today, 99% of 32-bit and 64-bit processors are RISC.

I suggest this goes back to pcc followed by the MIPS R2000 - made possible by Dennis’ C language.

The 1977 invention of ‘pcc’ and the rewriting of Unix for cross-machine portability was the first time I’m aware of this being done.
 ( Miller @ UoW did a one-off port - not to devalue his work, but he ported a compiler, he didn’t invent a multi-target portable one )

One of the effects of “portable C” was creating whole new industries for third party software developers
or enabling niche products, like CISCO routers and the many embedded devices.

C and Unix came with the tools to create new languages and new tools.
AWK, sed (!) and shells are obvious examples, with Perl, Python & PHP very big on the Internet of 2000.

C was a new class of language - a tool to create tools.
It provides a perfect mechanism to bootstrap any new language, tool or product,
allowing it to be refined & used enough to become reliable before being made self-hosting.

Very widely used languages such as Python are written in C.
ORACLE achieved its market dominance by providing ‘portability’ - exactly the same on every platform.
Underpinned by portable C.

The original 1127 team went on to create other systems and languages,
not the least being a new Software Engineering tool, “Go” / golang,
addressing a whole slew of deficiencies in the C/C++ approach.

We’d have no Internet today without Apache, written in C and ported to every environment.

Also, there’s a connection between C and ‘modern’ Software Engineering
  - distributed Repositories, automated builds & regression tests, and the many toolchains and tools used.

They tended to be built in C to address problems (Open Source) developers were finding with existing toolchains.
‘make’ arose at Bell Labs to automate builds, along with PWB and the Writer’s Workbench.

There are two questions / observations about 50 years of C in broad use:

        - Just how much C is out there and used ‘in production’?

        - C is ‘obviously’ a product of the 1970’s, not reflecting needs of modern hardware, networks, storage and systems,
                but _what_ can replace it?

                There is simply too much critical code written in C to convert it to another ‘better, modern’ language.
                Any new language that is a simple 1:1 rewriting of C cannot address any of the deficiencies,
                while any incompatible language requires redesign and reimplementation of everything - an unachievable goal.

The Linux Kernel’s “Rust” project shows the extent of the problem
   - even with the best team of the best developers, it’s a mammoth undertaking, with uncertain payoffs, unquantifiable effort & deadlines.

My thesis is that portable, standard C:

        - not only co-evolved with other tools & needs to create the Modern Software Engineering environment, the basis for multiple trillion-dollar enterprises (FAANG)
but
        - drove the biggest, most profitable software market ever seen (Wintel)
        - which drove sales volume of x86 chips (& DRAM, motherboards, LAN, GPU, monitors, peripherals…) over 2-3 decades,
        - which drove Silicon Valley, paying for new generations of Fabs and lowering chip prices further & further
        - and eventually created the Fabless RISC CPU company,
                which in the Post-PC era absolutely dominates chip sales.

No Software, no Silicon…

Gordon Moore, in an early comment on his 1968 startup with Robert Noyce, said:

        “we are the real revolutionaries" (vs Hippies & 1967 Summer of Love).

I think Ken & Dennis [ and 1127/ Bell Labs folk ] can say the same.

==========

I’ve written some notes, with links to Programming Languages, especially Jean Sammet’s Histories,
and would like some critiques, suggestions & corrections if people have time and interest.

Unix and C are intimately related - neither was possible or useful without the other.

I think there’s an interesting article in there, but I’m not sure I have what it takes to write it, not in a finite time :)
Very happy to help anyone who does!

Did-C-lang-create-modern-software-industry.txt
        <https://drive.google.com/file/d/1k936sgqHc-vHBvfCdLoSxFhdT9NaijU2/view?usp=sharing>

steve jenkin
04 - dec - 2024

==========
--
Steve Jenkin, IT Systems and Design
0412 786 915 (+61 412 786 915)
PO Box 38, Kippax ACT 2615, AUSTRALIA

mailto:sjenkin@canb.auug.org.au http://members.tip.net.au/~sjenkin