At 2025-03-02T17:54:01+0000, segaloco via TUHS wrote:
> On Sunday, March 2nd, 2025 at 9:29 AM, Bakul Shah via TUHS
> <tuhs(a)tuhs.org> wrote:
> > C's Achilles' heel is its type system. No extension can paper over
> > that. Once you fix that it is no longer C. Newer languages can in
> > fact be seen as fixing/adding things like proper strings, default
> > read/write rules, memory allocation rules, modules/packages,
> > objects/closures, etc. over C. This is why I said they are not
> > particularly outstanding. A lot of it seems rather ad hoc.
>
> For me though that speaks to the fact that C leaves a lot to the
> imagination. You've got the building blocks of a number of higher
> level concepts, but implementation thereof is left to the user. Of
> course you have these various extensions pop up, but the base
> language does not define the "one true way" to do it; rather, each
> case can be engineered with context in mind.
If I'm understanding Bakul's point correctly, then what you're saying is
a defensible claim about many aspects of C but emphatically _not_
about its type system. The language does _not_ give you the tools to
implement a rigorous type system as Haskell programmers or type
theorists would
conceive of it. I mean, you could write a Haskell compiler in C, but
you wouldn't be able to apply its type logic to C's own type system.
(Maybe you could do it for any types other than the primitive ones
already supported, with sufficient abuse of the preprocessor.)
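
To make that concrete, here is a minimal sketch of my own (not drawn
from anything above) of the kind of program C's checker waves through
without a peep; an ML- or Haskell-style checker would reject every
initialization in it:

    #include <stdio.h>

    int main(void)
    {
        double d = 'x';    /* char promotes silently to double */
        int i = 3.99;      /* double truncates silently to int */
        void *v = &i;
        double *pd = v;    /* void * converts to any object pointer,
                              no cast required, no diagnostic */

        (void)pd;          /* don't dereference it; those bits
                              aren't a double */
        printf("d = %f, i = %d\n", d, i);
        return 0;
    }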
This rigid inability was baked into C from day one. Everything that can
possibly be represented in a machine register with all bits clear shows
up as an integral zero literal.
'\0' == 0 == nullptr == struct { } == union { }
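
Concretely (a sketch of mine, spelled with the older 0/NULL rather
than C23's nullptr):

    #include <stdio.h>

    int main(void)
    {
        char c = '\0';   /* character zero */
        int i = 0;       /* integer zero */
        char *p = 0;     /* the very same 0, now a null pointer */

        if (c == 0 && i == '\0' && p == 0)
            printf("one zero to rule them all\n");
        return 0;
    }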
The CSRC staked a lot on the typeless B, and clung to that heritage.
I don't think it's an accident that C's designers expressed type
_aliasing_ with the keyword "typedef". Types simply were not taken
seriously as reasoning tools. C was a language for people who wanted to
get that crap out of the way so they could think about binary
representations. (Okay, octal.)
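
A tiny illustration (type names invented for the purpose): a typedef
name is a synonym the compiler will cheerfully mix with its underlying
type, not a distinct type it will police.

    typedef double celsius;
    typedef double fahrenheit;

    celsius add_temps(celsius a, fahrenheit b)
    {
        return a + b;    /* dimensional nonsense, but both names
                            denote plain double, so no diagnostic */
    }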
C was aggressively oversold, I think more by its fans than by its
designers. Even its reputation as "portable assembler" withstands
scrutiny only as long as all you want to port to are models in the
PDP-11 line. If you think that claim is crack-headed, review London &
Reiser's paper on porting Unix to the VAX-11/780. Even setting aside
the problems with nroff/troff and the Bourne shell, they pointed their
fingers at problems that a "portable assembly language" would already
have anticipated, like machines having differing alignment
requirements, or would have solved, like having the standard I/O
library's binary I/O functions write and interpret some kind of header
in the file expressing basic facts like the endianness of various
integer widths (big, little, or FP-11). Magic numbers were good enough
for object file formats, but not for applications programmers, I guess.
I don't doubt that people could be massively more productive in C than
in PDP-11 assembly language. But that claim doesn't establish anything
that wasn't already known about the utility of higher-level languages.
K&R were still trying to warn people in 1988 that C was not a hammer for
all nails.
"...C offers only straightforward, single-thread control flow: tests,
loops, grouping, and subprograms, but not multiprogramming, parallel
operations, synchronization, or coroutines." (_The C Programming
Language_, 2nd edition, p. 2)
I guess we can add a general, flexible algebraic type system to the list
of missing features.
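
What you get instead, if you want a sum type, is the hand-rolled
tag-plus-union idiom; a sketch with invented names, in which nothing
stops you from reading the wrong member or forgetting a case:

    #include <stdio.h>

    struct shape {
        enum { CIRCLE, RECT } tag;
        union {
            struct { double r; } circle;
            struct { double w, h; } rect;
        } u;
    };

    double area(const struct shape *s)
    {
        switch (s->tag) {
        case CIRCLE:
            return 3.141592653589793 * s->u.circle.r * s->u.circle.r;
        case RECT:
            return s->u.rect.w * s->u.rect.h;
        }
        return 0.0;  /* the compiler can't prove the switch is total */
    }

    int main(void)
    {
        struct shape c = { CIRCLE, { .circle = { 2.0 } } };
        printf("%f\n", area(&c));
        return 0;
    }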
Regards,
Branden