Hi Steve,
The need for speed was prevalent back around the time of Perl and Tcl's
creation. We could get the work done with sh and awk, but the overhead
of spawning new processes, and repeatedly moving data between them, was
a problem for larger workloads.
Larry's view was that if it was useful and the language didn't already
have it, toss it in.
Perl placed the best ideas of sed and awk inline in the language, and
Larry's implementation put a lot of effort into speed: the bytecode
interpreter, the regexp engine, the hash table, ... The cumulative
effect was very noticeable.
Larry's tossing was quite skilled. For example, the regexp syntax took
all those previous flavours and gave them some consistency:
meta-characters were punctuation and a backslash escaped them to their
literal selves, whereas backslashed letters added new things like the
\d and \s conveniences.
If he hadn't been a dab hand, well, there lies PHP and its
real_kitchen_sink2().
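Both halves of that show up in one pattern. Here's a rough equivalent
in Python's re module, which borrowed Perl's syntax; the pattern and
the test string are just made up for illustration:

    import re

    line = "pi is 3.14, roughly"
    # \s and \d are the backslashed-letter conveniences; the dot is a
    # meta-character, so a backslash turns it back into a literal '.'.
    m = re.search(r"is\s+(\d+\.\d+)", line)
    if m:
        print(m.group(1))   # prints 3.14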
I read Ousterhout's papers on Tcl and Tk, and tried using Tcl for a
while. The `everything is a string' became wearing, and the performance
wasn't there for large amounts of data. Perhaps that was because it
was considered a glue language and you'd link it with C for the hard
bits, but that was a bit too much effort for scripting that often
starts as something intended to be thrown away.
Yes, Tk's nice, and its Canvas widget with vector objects stood out
against the Athena, Motif, etc., widget sets available at the time.
The IDs and tags were very useful: change an attribute on everything
with tag T and Tk took care of the redraw. I used it for a
digital-circuit simulation once.
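Something like this gives the flavour of the tags; it's sketched in
Python's tkinter rather than Tcl, and the `gate' tag and shapes are
only illustrative, not the circuit simulator itself:

    import tkinter as tk

    root = tk.Tk()
    canvas = tk.Canvas(root, width=200, height=100)
    canvas.pack()

    # Two canvas items sharing the tag `gate'.
    canvas.create_rectangle(10, 10, 60, 60, fill="grey", tags=("gate",))
    canvas.create_oval(80, 10, 130, 60, fill="grey", tags=("gate",))

    # Change an attribute of everything carrying the tag and the
    # Canvas handles the redraw itself.
    canvas.itemconfigure("gate", fill="red")

    root.mainloop()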
--
Cheers, Ralph.
https://plus.google.com/+RalphCorderoy