Hi Clem,
Thanks for your detailed response. I appreciate getting perspectives
from people who fought in the wars I'm too young to remember; I visit
the memorials and try to imagine what things were like.
At 2025-03-09T13:29:55-0400, Clem Cole wrote:
On Sun, Mar 9, 2025 at 9:18 AM G. Branden Robinson
<g.branden.robinson(a)gmail.com> wrote:
At 2025-03-09T08:29:43-0400, Clem Cole wrote:
On Sat, Mar 8, 2025 at 10:47 PM Dan Cross <crossd(a)gmail.com> wrote:
Was any consideration for it made?
Only Ken can answer that. My guess is that Algol-W would have
failed for the same reasons Pascal ultimately failed. It really
was directed at students and lacked many of the tools C had. Many
of the issues exposed in Brian's treatise *"Why Pascal is not my
favorite programming language"* also apply to Algol-W.
I think it's worth noting that, as I understand it from having read
the paper, old Pascal manuals, and ISO 7185 (and, a long time ago,
experience with Borland Turbo Pascal), most or all of the defects
Kernighan pointed out were addressed by vendored versions of Pascal
and ultimately by its ISO standard form. I think even UCSD Pascal
had extensions, though I don't recall what they were.
Branden — respectfully, if you download the standard
https://archive.org/details/iso-iec-7185-1990-Pascal
I keep a copy on my tablet! Alongside two Free Pascal manuals and the
Turbo Pascal 7.0 manuals. I didn't have to fight in the trenches using
Pascal for very long (and when I did, it was with Borland's generously
expanded dialects, a few years apart).
or look at versions that have been converted to HTML such as
https://wiki.freepascal.org/Standard_Pascal, that statement is a tad
hollow. Sadly, that is the issue with Pascal.
Okay. I'll take it as homework to double-check how feeble ISO Pascal
is compared to the version critiqued by Kernighan.
Hmm, I can already see that I might have confused ISO Standard Pascal
(ISO 7185) with ISO Extended Pascal (ISO 10206). Both standards, but
only one named "Standard". I apologize for my error.
The fundamental flaws of the core Pascal language remained as
described in Brian's paper:
1. Since the size of an array is part of its type, it is not possible
to write general-purpose routines, that is, to deal with arrays of
different sizes. In particular, string handling is very difficult.
Yeah. That's a brutal flaw that every defender of Wirth Pascal has to
concede at the risk of not being taken seriously.
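To make the flaw concrete, here is a minimal C sketch (mine, not from
the paper) of the kind of general-purpose routine Wirth Pascal could
not express, because the length travels as an ordinary parameter
instead of being baked into the array's type:

    #include <stddef.h>

    /* Sum an array of any length; one routine serves every size. */
    double sum(const double *a, size_t n)
    {
        double total = 0.0;
        for (size_t i = 0; i < n; i++)
            total += a[i];
        return total;
    }

In Wirth Pascal, array[1..10] of real and array[1..20] of real are
distinct, incompatible types, so you'd need a separate such routine
for every length you cared about.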
2. The lack of static variables, initialization, and a way to
communicate non-hierarchically combine to destroy the "locality" of
a program — variables require much more scope than they ought to.
That's also a flaw, but C doesn't distinguish itself strongly here;
`static` function linkage should have been the default. We should have
had a `public` keyword (or synonym) instead. (Rust, to my great
surprise, seems to have convinced people that `const` default is good
too--hence `let mut`--and it is, for many reasons including concurrency.
Even my beloved Ada was not so forward-looking.)
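A minimal sketch of what I mean, as C actually stands (names made up,
of course): locality is opt-in, not the default.

    /* Internal linkage must be requested explicitly with `static`... */
    static int helper(int x)
    {
        return 2 * x;
    }

    /* ...while omitting it silently exports the symbol to every other
       translation unit in the program. */
    int api_entry(int x)
    {
        return helper(x) + 1;
    }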
3. The language's one-pass nature forces procedures and functions to
be presented in an unnatural order; the enforced separation of various
declarations scatters program components that logically belong
together.
But that was true of C for a long time as well; as I understand it,
one-pass C compilers were important enough that intermixed statements
and declarations were allowed into the standard only in C99. That's a
long time after CSTR #100.
And for me, writing to minimize forward declarations _is_ the natural
order. Just as in documentation, we want to avoid employing concepts
before we've introduced them.
(If your `main()` is either really short or strictly a sequence of calls
to functions returning `void`, such that it reads like a procedural
checklist, only then would I tolerate it at the top of the source file.)
4. The lack of separate compilation impedes the development of large
programs and makes using external libraries impossible.
This one's about as brutal as point 1. It requires concession.
Modularity is critical.
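For anyone who never suffered the alternative, separate compilation in
miniature (file names hypothetical):

    /* mathutil.h -- the shared interface */
    #ifndef MATHUTIL_H
    #define MATHUTIL_H
    int square(int x);
    #endif

    /* mathutil.c -- compiled once, independently:  cc -c mathutil.c */
    #include "mathutil.h"
    int square(int x) { return x * x; }

    /* main.c -- compiled separately, then joined by the linker:
       cc -c main.c && cc main.o mathutil.o */
    #include <stdio.h>
    #include "mathutil.h"
    int main(void) { printf("%d\n", square(7)); return 0; }

Each .c file can be rebuilt, shipped, or replaced without the others'
source in hand, which is exactly what a library needs.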
5. The order of logical expression evaluation cannot be controlled,
which leads to convoluted code and extraneous variables.
Some of this blows back onto C. Yes, short-circuit logical evaluation
operators are desirable, but apart from that the order of expression
evaluation in C (a) is frequently not specified so that compilers can
generate fast code for the hardware; or (b) demands that the programmer
remember something like 15 levels of operator precedence, which is
savage cruelty in my opinion.
Here I am recently correcting a dopey mistake of mine along those lines.
https://git.savannah.gnu.org/cgit/groff.git/commit/?id=a0171c9dc12daa8e70b9…
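To spell out both halves of that claim, here's a made-up sketch of the
short-circuit idiom C got right, and the precedence trap it didn't:

    #include <stdio.h>
    #include <string.h>

    #define MASK 0x3u

    int main(void)
    {
        const char *p = "hello";
        unsigned flags = 0x1;

        /* Short-circuiting: strlen(p) is never evaluated if p is NULL. */
        if (p != NULL && strlen(p) > 0)
            printf("nonempty\n");

        /* The trap: == binds tighter than &, so this parses as
           flags & (MASK == MASK), i.e. flags & 1 -- not what it looks like. */
        if (flags & MASK == MASK)
            printf("oops: fires, but for the wrong reason\n");
        if ((flags & MASK) == MASK)
            printf("what was actually meant\n");
        return 0;
    }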
6. The case statement is emasculated because there is no default
clause.
I'll freely grant this one, too, and go still further and say that both
Pascal _and_ C should have required Haskell-style exhaustive coverage of
the controlling variable's domain.
C's support for sum types was (and is) bad, and Pascal's started down
the right road but then gave it up partway, to amplify Kernighan's
point.
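You can claw back a measure of exhaustiveness in C today by switching
over an enum with no default clause, so that a compiler like GCC
(-Wswitch, implied by -Wall) warns about any unhandled enumerator. A
sketch:

    #include <stdio.h>

    enum color { RED, GREEN, BLUE };

    static const char *color_name(enum color c)
    {
        /* No default: gcc -Wall warns if an enumerator goes unhandled,
           a poor man's exhaustiveness check. */
        switch (c) {
        case RED:   return "red";
        case GREEN: return "green";
        case BLUE:  return "blue";
        }
        return "?";     /* unreachable while the switch stays exhaustive */
    }

    int main(void)
    {
        printf("%s\n", color_name(GREEN));
        return 0;
    }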
7. The standard I/O is defective. There is no sensible provision for
dealing with files or program arguments as part of the standard language
and no extension mechanism.
To my horror, when learning Java I discovered that it had managed to
discard the basic competence of console/terminal I/O supported by C's
stdio library. I don't dispute that C shone here. I think Korn and Vo
have solid critiques of other aspects of stdio, and I find the
irregularity of function names, argument order, and some aspects of
behavior (such as whether an output function appends a newline or not)
annoying, but I would not dispute that it's easier than in most
languages I've dealt with to get a simple "conversational" terminal
application stood up with C. Yes, you often want a "serious" UI in
curses or a windowing system later (or none at all, relying on argv[]
and the standard I/O streams to do everything as in the filter model),
but that's no reason to fumble the ball on simple Teletype-style I/O.
Yet too many languages did.
Still, scanf() was a bad idea, or at least overapplied, and should not
be taught to students as a tool for interactive I/O handling (as it was
to me--did anyone else have this misfortune?). For that, a simple
lexical analyzer is necessary because humans produce undisciplined
input. It's never too early to teach a beginner how to write a finite
state machine.
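For illustration, the beginner-sized state machine I have in mind:
reading an optionally signed decimal integer with getchar() instead of
scanf(), tolerating leading blanks and rejecting garbage. (A sketch;
overflow handling omitted for brevity.)

    #include <stdio.h>

    enum state { SKIP, SIGN, DIGITS };  /* blanks, after sign, in number */

    /* Read one integer from stdin; 0 on success, -1 on malformed input. */
    static int read_int(long *out)
    {
        enum state s = SKIP;
        long val = 0;
        int neg = 0, c;

        while ((c = getchar()) != EOF) {
            switch (s) {
            case SKIP:
                if (c == ' ' || c == '\t' || c == '\n')
                    break;                  /* stay in SKIP */
                if (c == '+' || c == '-') {
                    neg = (c == '-');
                    s = SIGN;
                } else if (c >= '0' && c <= '9') {
                    val = c - '0';
                    s = DIGITS;
                } else
                    return -1;              /* garbage up front */
                break;
            case SIGN:
            case DIGITS:
                if (c >= '0' && c <= '9') {
                    val = val * 10 + (c - '0');
                    s = DIGITS;
                } else {
                    ungetc(c, stdin);       /* leave the delimiter */
                    goto done;
                }
                break;
            }
        }
    done:
        if (s != DIGITS)
            return -1;          /* empty input or a bare sign */
        *out = neg ? -val : val;
        return 0;
    }

    int main(void)
    {
        long n;

        if (read_int(&n) != 0) {
            fputs("bad input\n", stderr);
            return 1;
        }
        printf("%ld\n", n);
        return 0;
    }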
8. The language lacks most of the tools needed for assembling large
programs, most notably file inclusion.
Once you have separate compilation, this seems mostly like either (a) a
linker problem or (b) a problem better solved by a real macro language
like m4. The C preprocessor was merely adequate when it could have been
brilliant.
In case anyone feels a dark premonition of where that idea leads: yes,
I'm saying the CSRC should have given us Autoconf instead of the C
preprocessor. >:-) If they had, it would be far less hated.
9. There is no escape.
And that's why everybody added inline assembly and some added foreign
function interfaces.
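For instance, GCC's escape hatch looks like this (x86-64 assumed; a
trivial sketch):

    #include <stdio.h>

    /* Add two ints with one x86 ADD instruction, bypassing the
       language entirely (GCC extended inline assembly). */
    static int add_asm(int a, int b)
    {
        __asm__("addl %1, %0"   /* a += b */
                : "+r"(a)       /* operand 0: read/write, in a register */
                : "r"(b));      /* operand 1: input, in a register */
        return a;
    }

    int main(void)
    {
        printf("%d\n", add_asm(2, 3));
        return 0;
    }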
You >>are<< correct that many (not all) of the defects Brian and Dave
ran into when updating/rewriting the Software Tools book from FORTRAN
(Ratfor) to Pascal >>could<< have been fixed. The core issue with
Pascal is that if an implementation did add an extension, *every
implementation did it differently*.
Now, the truth is that having different implementations from different
vendors was not unusual. That's the whole reason ASA developed
FORTRAN-66, which begat the many versions that followed [I believe
FORTRAN-2018 is the last ratified version, but I'm told by Dr. FORTRAN
(Steve Lionel) that they are preparing an updated draft]. BASIC had it
in spades. In fact, at one of the early "Hatfield/McCoy" parties
between my friends at HP and Tektronix, we counted 14 different flavors
of "HP BASIC" and 6 different flavors of "Tek Pascal."
I guess standardization of Pascal started too late; but both C and C++
took a long time to get to their initial standards too. What do you
think accounts for C and C++'s greater success, in the specific sense
that vendor extensions didn't cripple the language's development? What
force kept these languages better governed?
Maybe this is what it means to have been "directed at students"--that
easy wins in terms of language improvement were not implemented because
covering them topically in a semester-long university course would not
have been practical. If so, that may make the tragedy of Pascal
greater, not smaller.
History. Wirth was teaching at Stanford when Algol-X was proposed. In
the late 1960s, the International Federation for Information Processing
(IFIP), which held international responsibility for the definition of
the programming language ALGOL 60, chartered IFIP Working Group 2.1 to
develop a replacement addressing shortcomings that had been identified,
notably the lack of a standardized string subsystem. This effort was
dubbed ALGOL X. You can use an internet search to find several of the
proposals, but one of them was submitted by Niklaus Wirth and Tony
Hoare. When their proposal was not accepted, the language ALGOL 68 was
developed under these auspices.
I've read Lindsey's "A history of ALGOL 68" in HOPL 2 (1996).
Intriguing stuff. I have no strong opinions about any of it, probably
because no one has either forced me to write in ALGOL 68 or forbidden
me from doing so. I do admire its nearly universal loop structure:
[ for index ] [ from first ] [ by increment ] [ to last ] [ while condition ] do
statements od
"Nearly" because it's not obvious to me how you'd simulate C's
do-while,
which is underrated and under-taught.
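Since I'm lamenting its neglect, here's the shape of the thing: a body
guaranteed at least one trip, which is exactly what a prompt-and-retry
loop wants. (A sketch; input buffering left crude to keep it short.)

    #include <stdio.h>

    int main(void)
    {
        int c;

        /* do-while: prompt at least once; test only after the body. */
        do {
            fputs("continue? [y/n] ", stdout);
            c = getchar();
        } while (c != 'y' && c != 'n' && c != EOF);

        return c != 'y';
    }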
And in case anyone's not aware of it, you can play with Algol 68 today.
https://www.theregister.com/2025/01/07/algol_68_comes_to_gcc/
The committee decided that the Wirth/Hoare proposal was an
insufficient advance over ALGOL 60, and the proposal was published as
a contribution to the development of ALGOL in CACM after making slight
modifications to the language. Wirth eventually decided to develop
his proposal and created an implementation targeting the IBM
System/360 at Stanford University. He called the new language Algol-W
[it was used at many universities; I used it at CMU, and I have friends
who used it at Cornell and Princeton].
At the time, Wirth's original target was a tool to teach his students
at Stanford. It was not a "production quality" compiler in the sense
of IBM's FORTRAN-H, which was considered the "best" production
compiler at the time. Also, remember that the whole question of
assembler versus compiled code was hardly a settled debate.
That was still a live issue when I was learning, but back then (on home
8-bit micros), you had to shell out money to get anything but the BASIC
interpreter that was stored in ROM. We penurious kids, if pressed,
would hand-assemble code and poke it into memory to escape the
shackles of the vendor's languid BASIC interpreter (usually under
license from Microsoft). Made a lot of mistakes and froze the machine
that way. No illegal instruction traps and no handler for such traps if
we'd had 'em. Just push reset and hope you'd saved your work.
"a.out?"
https://www.nokia.com/bell-labs/about/dennis-m-ritchie/odd.html
Wulf's findings showed BLISS was as good as the best PDP-10/PDP-11
programmers, which was still a few years in the future. As I've said
elsewhere, it took enough students trained in the ideas from the Green
Book a few years to get into the world to start to develop the
optimizers we now expect as the norm.
I'd like to talk to some of that cohort about Ada.
Though even that doesn't explain why Wirth didn't spec Pascal
functions to admit polymorphism (perhaps optionally) in array
length. That's an essential property for all sorts of work, even
that undertaken by students. (Say you want them to write a function
that multiplies matrices. You shouldn't have to specialize the
function by matrix dimensions. The obvious pinch point for a Unix
programmer is massive inconvenience if you want to handle
variable-length strings.)
Again, it was not considered an issue for what they were trying to
solve. Read the IFIP meeting notes.
That shocks me, but okay. I realize I'm probably abusing the term
"polymorphism" here, since that's usually applied to distinguishable
data types. It bluescreens my brain to think that "iterate along this
vector of identically typed elements until reaching a known terminating
value" was regarded as a technique with any hint of the exotic. What am
I missing?
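(For what it's worth, C99's variably modified parameter types
eventually delivered exactly the property I mean: one matrix-multiply
routine for all dimensions. A sketch:

    #include <stdio.h>

    /* c[n][p] = a[n][m] * b[m][p]; the dimensions are ordinary
       runtime parameters (C99 variable-length array parameters). */
    static void matmul(int n, int m, int p,
                       double a[n][m], double b[m][p], double c[n][p])
    {
        for (int i = 0; i < n; i++)
            for (int j = 0; j < p; j++) {
                c[i][j] = 0.0;
                for (int k = 0; k < m; k++)
                    c[i][j] += a[i][k] * b[k][j];
            }
    }

    int main(void)
    {
        double a[2][3] = { { 1, 2, 3 }, { 4, 5, 6 } };
        double b[3][2] = { { 7, 8 }, { 9, 10 }, { 11, 12 } };
        double c[2][2];

        matmul(2, 3, 2, a, b, c);   /* no 2x3-specific variant needed */
        printf("%g %g\n%g %g\n", c[0][0], c[0][1], c[1][0], c[1][1]);
        return 0;
    }

It only took a quarter century.)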
Perusing its manual some months ago, it occurred to me that if
someone had had the foresight to implement the language core of
Turbo Pascal 7.0, ignoring the libraries/"modules" entirely, and
omitting the crap necessary to work with the x86 segmented memory
model, I don't think Kernighan would have had much, if anything, to
complain about. And, in my opinion, we'd be living in a better
timeline.
Turbo Pascal was a flavor of Pascal that came much later.
Yes, TP 1.0 was late 1983; Wirth's spec paper was 1973, UCSD Pascal
1977, and I can't find a date for UCB Pascal, but...
At the time, UCB Pascal
...Thompson had a hand in writing that one, didn't he? During his
sabbatical that spawned the CSRG? If so, that was 1975.
and UCSD Pascal were the two popular implementations for students, and
commercial folks used Whitesmiths and, later, OMSI. The first three
were the compilers Brian and PJ used (I believe all were running on a
V7 Unix, but I'm not positive about the target OS).
FWIW: today's popular Pascal is Free Pascal (freepascal.org) and it
accepts all of the syntax of Turbo, Object Pascal, and Delphi as
input. But it was too late.
It's still a highly active project, and one I keep an eye on. On the
one hand, ports to RISC-V and WASM, and on the other, fixes to its m68k
backend and TP-style text-mode IDE have all seen commits in the past
month. They also seem to take their documentation seriously, a
significant sign of health in my view.
I have no expectation that it will displace C, but its developers have
my admiration for pushing forward. Maybe that's because, in maintaining
a typesetting system that _isn't_ TeX, I recognize potentially kindred
spirits.
But I guess Pascal just didn't get good enough, soon enough.
No, it was not agreed upon. Everyone developing a Pascal
implementation was pushing their flavor (sound familiar?). Turbo
"won" when they targeted DOS, which, at the time, was being ignored by
the traditional "professional" programmers (DOS was too much of a
"toy" to be serious --
Well, the professionals weren't wrong. MS-DOS was a crap OS on a crap
ISA. They were both market placeholders. But we don't work, or live,
in a meritocracy. As ever, the surest route to the head of the line is
to be deposited there at birth by your ancestor.
Minicomputers were the bread and butter). When Phar Lap created an OS
extension that allowed for a real 32-bit C for the 386, UNIX and C had
already taken the mantle, and Pascal was losing favor in the DOS
world.
And I wonder how big such a compiler would be, even if coded
competently.
Look at OMSI, which primarily targeted the PDP-11. It has many of
these extensions [it basically started as one of the Tek Pascals,
BTW; Jon and I worked with some folks who developed it].
Oof, for RT-11. I've booted RSX-11--once--on my PiDP-11. It was a
jarring experience. Booting 2.11BSD was headily nostalgic, and took me
_way_ back. I haven't seen a boot sequence like that in a long time.
Chronology bedevils us in another way. A _lot_ of C language
features its advocates are proud of were not present in 1973 or even
when Kernighan wrote CSTR #100 around 1980.
Please be careful here. The point is that in 1980, the C described by
Brian did not have the issues I listed above,
I can't completely agree with you here because as I understand the
critique, C still has _some_ of them _today_. Often not as badly as in
the past, and almost certainly not as bad as the Pascals of the day.
but the popular Pascal implementations being used in the wild did.
It is an error to substitute the ANSI C of 1989 for the C that
actually existed at the time when asking who'd win in a fight
between C and Pascal taking place in, say, 1977.
I don't, since I lived it.
Yes, but I'm not always aiming at you personally. :)
Need to work with Noel to update that page -- that name was because it
came with the original ditroff, which used libS.a to replace Lesk's
portable I/O library.
I was (am) a bit confused because like you I thought "Typesetter C" was
something else, something later, that was described in the 1-page
article in the V7 manual "Recent Changes to C", which introduced (per
its headings) "Structure assignment" and [an] "Enumeration type".
_But_, that's dated November 15, 1978, which seems a tad early for
Kernighan's device-independent troff work.
"...in the spring of 1979, I set about to modify TROFF so that it would
run henceforth without change on a variety of typesetters." (CSTR #97)
London and Reiser's 1979 paper describing their group's port of Unix to
the VAX-11 (I gather from other sources that the CSRC was upset with DEC
and refused to do it) identifies barriers that the C of the day erected
to portable programming. Most of them were addressed by ANSI C.
I had thought that device-independent troff directly drove some changes
to the C language (not just support libraries), but maybe I'm wrong. I
have no idea how the Ritchie compiler or PCC evolved after Seventh
Edition (1979). Apart from a couple of books (Hancock & Krieger's _The
C Primer_ and Tom Plum's _Notes on the Draft C Standard_), I have no
insight into 1980s C. The main thing I know about it is that if an
implementation targeted x86, it added `near` and `far` keywords to
accommodate segmented memory and/or DOS-style object file formats. I'd
have learned C (or tried to) a lot younger if I could have afforded a
compiler for it. Instead I had to wait for GCC. And a real computer.
It was less an issue of new compiler language features and more the
requirement for a new I/O library. Remember, C and BLISS, as examples,
are defined without an I/O library in their origin stories. They both
use the local OS's I/O interface. C does not formally get an I/O
library until Typesetter C and the White Book. Simply, it was not until
C moved to the Honeywell and the 360 and people started to "port" code
that Lesk started to codify an I/O library that could be used
regardless of the local OS. Look at the documentation in V6. You'll
see some interesting artifacts, like fd = -1, that Mike put in so that
earlier code would recompile and relink. With Typesetter C, Dennis
forced a small rewrite of your code to use libS.a.
Well, you _could_ bypass it and use read() and write() directly. Say,
in cat(1), the way Pike intended. ;-)
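Which is to say, the whole of a Pike-approved cat for the standard
streams, error handling pared to the bone (a sketch):

    #include <unistd.h>

    /* Shovel bytes from fd 0 to fd 1; no stdio in sight. */
    int main(void)
    {
        char buf[8192];
        ssize_t n;

        while ((n = read(0, buf, sizeof buf)) > 0)
            if (write(1, buf, (size_t)n) != n)
                return 1;
        return n < 0;   /* nonzero exit if the final read failed */
    }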
As I said, I lived this whole war.
I appreciate hearing it and I don't mind being corrected (or challenged
on my questionable taste). I'm a contrarian about C because
triumphalism and other forms of sore winning are not good looks for
engineers. Every time I blunder into Pascal-style pseudocode, I
experience a pleasant jolt of recognition.
Warmest regards,
Branden