On 30 April 2005 at 19:31, Wesley Parish <wes.parish@paradise.net.nz> wrote:
And, FWIW, in one of the few GNU's Bulletins I actually received,
courtesy of the FSF, RMS (I think) was advising that, with the
dropping price of memory, GNU hackers could do without worrying
about memory size when it came to replicating Unix utilities...
That's been on my mind as I thought back to my days on VAXen
with 100 users logged on and a load average of 30. Back then,
efficient programming was so very important. Now, when the GNU
"cp" has more than 20 options -- and some of those with several
possible arguments -- one side of me thinks how bloated the GNU
utilities seem. But, on the other hand, one of the things I'm
doing in this series of columns is to compare "how we used to
do it" vs. the usefulness of some of the new features. For
instance, back then we could copy a directory tree recursively
with "tar" or "find", carefully handling devices and etc. along
the way. Now we can do the same thing by typing a cp command
with a couple of options. With powerful machines on our desks,
which sort of "efficiency" do we want these days?
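(To make that comparison concrete -- srcdir and destdir below are
just placeholder names -- the old-style copy looked roughly like
this:

    # classic tar-pipe copy; destdir must already exist
    (cd srcdir && tar cf - .) | (cd destdir && tar xpf -)

while GNU cp collapses it to a single command:

    # -a is shorthand for -dR --preserve=all
    cp -a srcdir destdir

which recurses, doesn't follow symlinks, and preserves mode,
ownership, and timestamps.)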
I'm not trying to answer that question! I'm trying to show
things in a balanced way and leave it to the reader to decide.
This has been debated and discussed so much over the years
that I can't shed any new light on it. I just want readers to
keep it in mind and think about where we've been and where we are.
Jerry