Hi Doug!
On 1/3/23 16:08, Douglas McIlroy wrote:
segaloco via TUHS writes:
> I think that's a good point that scripting problems may be
> a symptom of the nature of the tools being used in them.
I think that you're hinting at something different.
To the best of my recollection, scripting languages were originally
intended and used for the automation of repetitive personal tasks,
making it easier for users who found themselves typing the same
stuff over and over again.
Indeed!
Well, as time goes by, I'm also writing fewer and fewer programs. But only
because I find that I can pipe existing programs together to do what I want
without having to write a new one. I guess that's a fair reason not to compile :)
Somewhere along the line people forgot
how to use a compiler and began writing large systems in a variety
of roughly equivalent but incompatible interpreted languages. Can
one even boot Linux today without having several different,
incompatible versions of Python installed? So I don't think that it's the
nature of the tools; I think that it's people choosing the wrong
tools for the problems that they're trying to solve.
Jon
The forgotten compilers were typically used to supply glue
to paste major tools together. The nature of that glue, often
simple data reformatting, inspired tools like sed and awk.
Each use of a tool became a process that saved many minutes
of work that would, in a scriptless world, be guided by hand,
boringly and unreliably.
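To make that concrete, a glue process of this sort might look
like the following (an illustrative sketch, with made-up details,
not a line from any real script):

    # list users ordered by numeric UID; sed's only job is
    # pasting cut's output into the shape sort expects
    cut -d: -f1,3 /etc/passwd |
        sed 's/\(.*\):\(.*\)/\2 \1/' |
        sort -n

Here the sed stage exists only to massage one tool's output
into the shape the next tool expects.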
Yet glue processes typically did only microseconds of
"real" work. In the name of efficiency, the operations began
to be incorporated directly into the shell. The first
inklings of this can be seen in "echo" and various forms
of variable-substitution making their way into the v7
shell. The phenomenon proliferated into putting what were
typically canned sed one-liners (but not sed itself) into
the shell.
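For instance (my illustration, not an example from the v7
sources), stripping a suffix went from a canned sed process to a
substitution performed by the shell itself:

    # once a glue process:
    base=$(echo "$file" | sed 's/\.c$//')
    # later a built-in expansion, no process at all:
    base=${file%.c}

The second form is faster, but it is one more piece of shell
notation to learn.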
Lots of specializations crowded out universality. A side
effect was an explosion of knowledge required to write
or understand code. Such is the tragedy of "forgetting
compilers".
And a funny thing:
Today, in the era of multi-core computers, not only are shell-only scripts
unreadable, but they are also slower than scripts built from pipes and
little-to-no shell features.
I tend to write scripts with only pipes and no shell features. I also avoid
programs with many options (find(1), for example) when I can split the job
across xargs(1) and other small tools.
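As a sketch of what I mean (the file names are made up), compare:

    # one program, many options:
    find . -type f -name '*.orig' -delete

    # the same job split across small programs
    # (assuming file names without blanks or newlines):
    find . -type f |
        grep '\.orig$' |
        xargs rm --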
I remember a script I wrote for maintenance of the Linux man-pages: I
received a suggestion that I could "simplify" the script considerably and
make it "faster" by reducing the number of pipes using some features of
find(1) or sh(1) (I don't remember exactly which). Well, I tested, and my
long list of piped programs outperformed the suggested alternative by a
fair amount.
I didn't care enough to find out the reason, but I suspect it's because, with
the pipeline, each small process can run on a different core, so they can all
work at the same time. With a single program invocation, you're bottlenecked
by that one program, which is (normally) limited to a single core.
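A sketch of the effect (the file and patterns are invented for
illustration):

    # four separate processes; the kernel can schedule them on
    # different cores, each stage chewing on one chunk of data
    # while the previous stage produces the next:
    grep -v '^#' input.txt |
        sed 's/foo/bar/g' |
        awk '{ print $1 }' |
        sort -u

A single do-everything invocation has to do all of that work
sequentially in one process.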
Cheers,
Alex