[self-follow-up]
At 2024-09-28T11:58:16-0500, G. Branden Robinson wrote:
> malloc was in
> v7, before the C standard was written. The standard
> spinelessly buckled to allow malloc(0) to return 0, as some
> implementations gratuitously did.
What was the alternative? There was no such thing as an exception, and
if a pointer was an int and an int was as wide as a machine address,
there'd be no way to indicate failure in-band, either.
While I'm making enemies of C advocates, let me just damn myself further
by suggesting an answer to my own question.
The obvious and correct thing to do was to have any standard library
function that could possibly fail return a structure type instead of a
scalar. Such a type would have two components: the value of interest
when valid, and an error indicator.[1]
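To make the proposal concrete, here is a minimal sketch of what such a
struct-returning allocator might have looked like; the names
(alloc_result, try_malloc) are mine, not anything the library ever
offered:

```c
#include <stddef.h>
#include <stdlib.h>

/* Hypothetical: the value of interest plus an error indicator,
 * returned together instead of overloading the pointer value. */
struct alloc_result {
    void *ptr;   /* meaningful only when err == 0 */
    int   err;   /* nonzero on failure */
};

static struct alloc_result try_malloc(size_t n)
{
    void *p = malloc(n);
    if (p == NULL && n != 0)
        return (struct alloc_result){ .ptr = NULL, .err = 1 };
    return (struct alloc_result){ .ptr = p, .err = 0 };
}
```

The caller tests the error member rather than comparing the pointer
against NULL, so a zero-size allocation that legitimately returns a null
pointer is no longer conflated with failure.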
As John Belushi would have said at the time such design decisions were
being made, "but nooooooooo". Returning a struct was an obviously
HORRIBLE idea. My god, you might be stacking two ints instead of one.
That doubles the badness! Oh, how we yearn for the days of the PDP-7,
when resources were so scarce that a system call didn't return
_anything_. If it failed, the carry flag was set. "One bit of return
value ought to be enough for anyone," as I hope Ken Thompson never said.
Expounders of Doug's tenet would, or should, have acknowledged that by
going to the library _at all_, they were giving up any O(1) guarantee,
and likely starting something O(n) or worse in time and/or space. So
what's so awful about sticking on a piece of O(1) overhead? In the
analysis of algorithms class lecture that the wizards slept through, it
was pointed out that only the highest-order term is retained. Well, the
extra int was easy to see in memory and throw a hissy fit about, and I
suppose a hissy fit is exactly what happened.
Much better to use a global library symbol. Call it "errno". That's
sure to never cause anyone any problems with reentrancy or concurrency
whatsoever. After all:
"...C offers only straightforward, single-thread control flow: tests,
loops, grouping, and subprograms, but not multiprogramming, parallel
operations, synchronization, or coroutines." (K&R 2e, p. 2)
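The reentrancy gripe aside, even single-threaded code has to handle
errno gingerly: it must be captured immediately after the failing call,
because nearly any intervening library call is permitted to clobber it.
The usual defensive idiom is to wrap the call; a sketch, with the helper
name (parse_long) mine:

```c
#include <errno.h>
#include <stdlib.h>

/* Returns 0 on success, or the errno value strtol reported on failure.
 * errno is captured before any other library call can overwrite it. */
static int parse_long(const char *s, long *out)
{
    errno = 0;              /* strtol reports overflow only via errno */
    *out = strtol(s, NULL, 10);
    return errno;
}
```

On overflow, strtol clamps the result to LONG_MAX (or LONG_MIN) and
sets errno to ERANGE; the wrapper returns that value before, say, a
diagnostic printf gets a chance to reset it.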
It's grimly fascinating to me now to observe how many security
vulnerabilities and other disasters have arisen from the determination
of C's champions to apply it to all problems, and with especial fervor
to those that K&R explicitly acknowledged it was ill-suited for.
Real wizards, it seems, know only one spell, and it is tellingly
hammer-shaped.
Regards,
Branden
[1] Much later, we came to know this (in slightly cleaner form) as an
"option type". Rust advocates make a big, big deal of this. Only
a slightly bigger one than it deserves, but I perceive a replay of
C's cultural history in the passion of Rust's advocates. Or maybe
something less edifying than passion accounts for this:
https://fasterthanli.me/articles/the-rustconf-keynote-fiasco-explained