One of the best conferences I ever attended was a Usenix conference on
Little Languages, held in Santa Fe, NM. I had the pleasure of going
out for dinner with a group that included Larry Wall and John
Ousterhout, inventors of Perl and TCL, respectively. It was a
fascinating discussion.
Larry looked at Perl as if it were a natural language. Just as
English has borrowed words, phrases and even syntax from other
languages, Larry's view was that if it was useful and the language
didn't already have it, toss it in. John came from a more
mathematical (and, may I say, Unix) point of view. Small is
beautiful, etc. The original motivation for TCL was to allow one
process to dynamically send data to another process and get data back,
so, for example, an application could send editor commands to an
editor that would process them and return the result. Kind of an RPC
at the process level, and, unlike pipes, the connections were
dynamic. I'm not sure that ever worked in a useful way. TCL syntax
was quite clean and easy to learn. But, for a lot of tasks, it
seemed to take a bit more TCL than Perl to get the job done. On the
other hand, when building substantial applications I personally found
TCL much easier to work with -- the semantic clashes between different
parts of Perl were somewhat disorienting to me.
Ironically, of course, it was the TK graphics package of TCL that had
real legs. System administrators found Perl to be a good vehicle
for writing systems administration tools -- kind of like a shell with
useful data structures. That meant that Perl was alive and well on
almost every Unix-oid system out there. At the same time, writing
interactive graphics applications in TK was head and shoulders easier
than in any other system, and it was very portable between systems,
but the application the graphics were the interface for was often
harder to write in TCL than in Perl. And, as I'm sure
Larry would have wanted, the useful TK part migrated slowly but surely
into Perl, and the rest of TCL fell into disuse.
Steve
----- Original Message -----
From: "Ralph Corderoy" <ralph@inputplus.co.uk>
To: "Theodore Ts'o" <tytso@mit.edu>
Cc: <tuhs@minnie.tuhs.org>
Sent: Sun, 24 Sep 2017 23:54:43 +0100
Subject: Re: [TUHS] Another "craft" discussion topic - mindless tool proliferation
Hi Ted,

> If you take a look at how perl handles its man pages, with 188 man
> pages in section 1:

Yes, I went there the other day looking for something and was
dismayed.
The main reason being I learnt Perl thoroughly back when it was
4.something, from its single perl(1) man page lovingly crafted by
Larry Wall. Given Perl is, no, was, an amalgam of Unix programming,
this was perfectly possible for anyone who already knew C, sh, sed,
awk, grep, etc., and the man page ran smoothly, assuming all that
background knowledge.
Perl 5 broke this a bit with its addition of classes and the shift to
POD, still overseen by Wall. Since then, with Larry's attention
elsewhere, it seems they've revelled in TMTOWTDI and the documentation
has bloated, so it appeared to me, looking for a reference, that there
were several possible man pages where it might be. Looking through
those, a lot of the content seemed duplicated, but slightly different,
and I quickly gave up.

> I find that info system, with its hypertext linking, to be far more
> convenient.

One nice thing about info(1) in the last few years is they've ditched
printing to stderr what bit of formatting they're doing, removing the
need for a `Gg' in `info libc | less' to make it do that work by
seeking to the end and then erasing the clutter by jumping back to the
beginning.
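Roughly, the old dance went like this (from memory, so the exact
chatter may have differed):

    $ info libc | less
    # info used to print its formatting progress to stderr, scribbling
    # over less's display; typing `G' (seek to the end, which makes
    # less read the whole pipe) and then `g' (back to the top) forced a
    # redraw that wiped the clutter.  Newer info(1) no longer prints
    # that chatter, so the pipe is clean from the start.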

> And the question is whether man is a sufficient way to provide
> documentation.

As others have pointed out, it wasn't the sole source of
documentation; papers for bigger programs used to provide the
introduction and overview, leaving the man page to mop up as a
reference.
--
Cheers, Ralph.
https://plus.google.com/+RalphCorderoy