On Sun, Mar 9, 2025 at 9:18 AM G. Branden Robinson <
g.branden.robinson(a)gmail.com> wrote:
At 2025-03-09T08:29:43-0400, Clem Cole wrote:
On Sat, Mar 8, 2025 at 10:47 PM Dan Cross
<crossd(a)gmail.com> wrote:
Was any consideration for it made?
Only Ken can answer that. My guess is that Algol-W would have failed
for the same reasons Pascal ultimately failed. It really was
directed at students and lacked many of the tools C had. Many of the
issues exposed in Brian's treatise *"Why Pascal is Not My Favorite
Programming Language"* also apply to Algol-W.
I think it's worth noting that, as I understand it from having read the
paper, old Pascal manuals, and ISO 7185 (and, a long time ago,
experience with Borland Turbo Pascal), most or all of the defects
Kernighan pointed out were addressed by vendored versions of Pascal and
ultimately by its ISO standard form. I think even UCSD Pascal had
extensions, though I don't recall what they were.
Branden — respectfully, if you download the standard
https://archive.org/details/iso-iec-7185-1990-Pascal or look at versions
that have been converted to HTML such as
https://wiki.freepascal.org/Standard_Pascal, that statement is a tad
hollow. Sadly, that is the issue with Pascal. The fundamental flaws of the
core Pascal language remained, as described in Brian's paper:
1. Since the size of an array is part of its type, it is not possible to
write general-purpose routines, that is, to deal with arrays of different
sizes. In particular, string handling is very difficult.
2. The lack of static variables, initialization, and a way to communicate
non-hierarchically combine to destroy the "locality" of a program --
variables require much more scope than they ought to.
3. The language's one-pass nature forces procedures and functions to be
presented in an unnatural order; the enforced separation of various
declarations scatters program components that logically belong together.
4. The lack of separate compilation impedes the development of large
programs and makes using external libraries impossible.
5. The order of logical expression evaluation cannot be controlled, which
leads to convoluted code and extraneous variables.
6. The case statement is emasculated because there is no default clause.
7. The standard I/O is defective. There is no sensible provision for
dealing with files or program arguments as part of the standard language
and no extension mechanism.
8. The language lacks most of the tools needed for assembling large
programs, most notably file inclusion.
9. There is no escape.
My grasp of the chronology is poor, though, and
I'd concede that Wirth's
Pascal of his 1973 memorandum wears most or all of the mud Kernighan
threw at it. For the most part, solving the problems Kernighan observed
did not require doing violence to the language (in other words, they
could be solved with extensions rather than by altering the semantics of
valid 1973 Pascal syntax).
You >>are<< correct that many (not all) of the defects Brian and Dave ran
into when updating/rewriting the Software Tools book from FORTRAN (Ratfor)
to Pascal >>could<< have been fixed. The core issue with Pascal is that if
an implementation did add an extension, *every implementation did it
differently*.
Now, the truth is that having different implementations from different
vendors was not unusual. That is the whole reason ASA developed FORTRAN-66,
which begat the many versions that followed [I believe FORTRAN-2018 is the
latest ratified version, but I'm told by Dr. FORTRAN (Steve Lionel) that
they are preparing an updated draft]. BASIC had it in spades. In fact, at
one of the early "Hatfield/McCoy" parties between my friends at HP and
Tektronix, we counted 14 different flavors of "HP BASIC" and 6 different
flavors of "Tek Pascal."
Maybe this is what it means to have been "directed at students"--that
easy wins in terms of language improvement were not implemented because
covering them topically in a semester-long university course would not
have been practical. If so, that may make the tragedy of Pascal
greater, not smaller.
History. Wirth was teaching at Stanford when ALGOL X was proposed. In the
late 1960s, the International Federation for Information Processing (IFIP),
which held international responsibility for the definition of the
programming language ALGOL 60, chartered IFIP Working Group 2.1 to develop
a replacement addressing the shortcomings that had been identified, notably
the lack of a standardized string subsystem. This effort was dubbed ALGOL
X. You can use an internet search to find several of the proposals; one was
submitted by Niklaus Wirth and Tony Hoare. The committee decided that the
Wirth/Hoare proposal was an insufficient advance over ALGOL 60, and when it
was not accepted, the language that became ALGOL 68 was developed under
those auspices instead. The Wirth/Hoare proposal was published, after
slight modifications to the language, as a contribution to the development
of ALGOL in CACM. Wirth eventually decided to develop his proposal on his
own and created an implementation targeting the IBM System/360 at Stanford
University. He called the new language Algol-W [it was used at many
universities; I used it at CMU, and I have friends who used it at Cornell
and Princeton].
At the time, Wirth's original target was a tool to teach his students at
Stanford. It was not a "production quality" compiler in the sense of IBM's
FORTRAN-H, which was considered the "best" production compiler at the time.
Also, remember that the assembler-versus-compiled-code question was hardly
a settled debate. Wulf's findings that BLISS was as good as the best
PDP-10/PDP-11 programmers were still a few years in the future. As I've
said elsewhere, it took a few years for enough students trained in the
ideas from the Green Book to get out into the world and start to develop
the optimizers we now expect as the norm.
Though even that doesn't explain why Wirth didn't spec Pascal functions
to admit polymorphism (perhaps optionally) in array length. That's an
essential property for all sorts of work, even that undertaken by
students. (Say you want them to write a function that multiplies
matrices. You shouldn't have to specialize the function by matrix
dimensions. The obvious pinch point for a Unix programmer is massive
inconvenience if you want to handle variable-length strings.)
Again, it was not considered an issue for what they were trying to solve.
Read the IFIP meeting notes.
Perusing its manual some months ago, it occurred to me that if someone
had had the foresight to implement the language core of Turbo Pascal
7.0, ignoring the libraries/"modules" entirely, and omitting the crap
necessary to work with the x86 segmented memory model, I don't think
Kernighan would have had much, if anything, to complain about. And, in
my opinion, we'd be living in a better timeline.
Turbo Pascal was a flavor of Pascal that came much later. At the time, UCB
Pascal and UCSD Pascal were the two popular implementations for students,
and commercial folks used Whitesmiths and, later, OMSI. The first three
were the compilers Brian and PJ used (I believe all were running a V7 Unix,
but I'm not positive about the target OS).
FWIW: today's popular Pascal is Free Pascal (freepascal.org), and it
accepts all of the syntax of Turbo, Object Pascal, and Delphi as input. But
it was too late.
But I guess Pascal just didn't get good enough, soon enough.
No, it was not agreed upon. Everyone developing a Pascal implementation
was pushing their own flavor (sound familiar?). Turbo "won" when they
targeted DOS, which, at the time, was being ignored by the traditional
"professional" programmers (DOS was too much of a "toy" to be serious;
minicomputers were the bread and butter). When PharLap created an OS
extension that allowed for a real 32-bit C for the 386, UNIX and C had
already taken the mantle, and Pascal was losing favor in the DOS world.
And I wonder how big such a compiler would be, even if coded
competently.
Look at OMSI, which primarily targeted the PDP-11. It had many of these
extensions [it basically started as one of the Tek Pascals, BTW; Jon and
I worked with some of the folks who developed it].
Chronology bedevils us in another way. A _lot_ of C language features
its advocates are proud of were not present in 1973 or even when
Kernighan wrote CSTR #100 around 1980.
Please be careful here. The point is that in 1980, as Brian described, C
did not have the issues I listed above, but the popular Pascal
implementations being used in the wild did.
It is an error to substitute the ANSI C of 1989 for
the C that actually
existed at the time when asking
who'd win in a fight between C and Pascal taking place in, say, 1977.
I don't, since I lived it.
Need to work with Noel to update that page -- that name was because it came
with the original ditroff, which used libS.a to replace Lesk's portable
I/O library. It was less an issue of new compiler language features and
more the requirement for a new I/O library. Remember, C and BLISS, as
examples, were defined without an I/O library in their origin stories. They
both used the local OS's I/O interface. C did not formally get an I/O
library until Typesetter C and the White Book. Simply put, it was not until
C moved to the Honeywell and the 360 and people started to "port" code that
Lesk started to codify an I/O library that could be used regardless of the
local OS. Look at the documentation in V6. You'll see some interesting
things like fd = -1 and other artifacts that Mike had in place so that
earlier code would recompile and relink. With Typesetter C, Dennis forced a
small rewrite of your code to use libS.a.
As I said, I lived this whole war.
Curmudgeonly yours,
Clem