So people have called me on the claim that lisp is not fast. Here's your
chance at a rebuttal.
Please write a clone of GNU grep in lisp to demonstrate that the claim
that lisp is slower than C is false.
Best of luck and I'll be super impressed if you can get even remotely
close without dropping into C or assembler. If you do get close, I
will withdraw my claim, stand corrected, point future "lisp is slow"
people at the lisp-grep, and buy you dinner and/or drinks.
--lm
> From: Larry McVoy <lm(a)mcvoy.com>
> the proof here is to show up with a pure lisp grep that is as fast as the C
> version. ... I've never seen a lisp program that outperformed a well
> written C program.
Your exact phrase (which my response was in reply to) was "lisp and
performance is not a thing". You didn't say 'LISP is not just as fast as C' -
a different thing entirely. I disagreed with your original statement, which
seems to mean 'LISP doesn't perform well'.
Quite a few people spent quite a lot of time making LISP compiler output fast,
to the point that it was possible to say "this compiler is also intended to
compete with the S-1 Pascal and FORTRAN compilers for quality of compiled
numeric code" [Brooks, Gabriel and Steele, 1982] and "with the development of
the S-1 Lisp compiler, it once again became feasible to implement Lisp in Lisp
and to expect similar performance to the best hand-tuned,
assembly-language-based Lisp systems" [Steele and Gabriel, 1993].
Noel
> Computer pioneer Niklaus Wirth was born on this day in 1934; he basically
> designed ALGOL, one of the most influential languages ever, with just
> about every programming language in use today tracing its roots to it.
Rather than "tracing its roots to it", I'd say "has some roots in it".
Algol per se hardly made a ripple in the US market, partly due to
politics and habit, but also because it didn't espouse separate
compilation. However, as asserted above, it had a profound impact on
language designers and counts many languages as descendants.
To bring the subject back to Unix, C followed Fortran's modularity and
Algol's block structure. (But it reached back across the definitive Algol
60 to pick up the "for" statement from Algol 58.) Like PL/I, it also
borrowed the indispensable notion of structs from business languages
(Flowmatic, Comtran, Cobol). It adopted pointers from Lisp, as polished
by BCPL (pointer arithmetic) and PL/I (the -> operator). For better or
worse, it went its own way by omitting multidimensional arrays.
So C has many roots. It just isn't fashionable in computer-language
circles to highlight Cobol in your family tree.
Doug
> From: Larry McVoy <lm(a)mcvoy.com>
> I don't know all the details but lisp and performance is not a thing.
This isn't really about Unix, but I hate to see inaccuracies go into
archives...
You might want to read:
http://multicians.org/lcp.html
Of course, when it comes to the speed/efficiency of the compiled code, much
depends on the program/programmer. If one uses CONS wildly, there will have to
be garbage collection, which is of course not fast. But properly coded to stay
away from expensive constructs, my understanding is that 'lcp' and NCOMPLR
produced pretty amazing object code.
Noel
Actually, Algol 60 did allow functions and procedures as arguments (with correct static scoping), but not as results, so they weren’t “first class” in the Scheme sense. The Algol 60 report (along with its predecessor and successor) is available, among other places, here:
http://www.softwarepreservation.org/projects/ALGOL/standards/
On Feb 16, 2018, Bakul Shah <bakul(a)bitblocks.com> wrote:
> They did lexical scoping "right", no doubt. But back when
> Landin first found that lambda calculus was useful for
> modeling programming languages these concepts were not clearly
> understood. I do not recall reading anything about whether
> Algol designers not allowing full lexical scoping was due to an
> oversight or realizing that efficient implementation of
> functional arguments was not possible. Maybe Algol's call by
> name was deemed sufficient? At any rate Algol's not having
> full lexical scoping does not mean one can simply reject the
> idea of being influenced by it. Often at the start there is
> lots of fumbling before people get it right. Maybe someone
> should ask Steele?
Clueless or careless?
A customer program worked for many years, until a few bytes were added
to one of the transaction messages.
Looking into it, I discovered that the program had only worked because
the receive buffer was followed by another buffer which was used in a
later sequence. Only when that second buffer also overflowed did some
critical integers get overwritten; these were used as indexes into
tables, which produced a lot of fun.
Well, as all here know, C is fun :-)
> From: Larry McVoy <lm(a)mcvoy.com>
I am a completely non-LISP person (I think my brain was wired in C before C
existed :-), but...
> Nobody has written a serious operating system
Well, the LISP Machine OS was written entirely in LISP. Dunno if you call that
a 'serious OS', but it was a significantly more capable OS than, say,
DOS. (OK, there was a lot of microcode that did a lot of the low-level stuff,
but...)
> or a serious $BIG_PROJECT in Lisp.
Have you ever seen a set of Symbolics manuals? Sylph-like, it wasn't!
> Not one that has been commercially successful, so far as I know.
It's true that Symbolics _eventually_ crashed, but I think the biggest factor
there was that commodity microprocessors (e.g. Pentium) sped up so much
faster than Symbolics' custom LISP hardware that the whole rationale for
Symbolics (custom hardware to run LISP fast) went away. They still exist as a
software company selling their coding environment, FWIW.
> C performs far better even though it is, in the eyes of lisp people, far
> more awkward to do things.
I think it depends on what you're doing. For some kinds of things, LISP is
probably better.
I mean, for most of the kind of things I do, I think C is the bee's knees
(well, except I had to add conditions and condition handlers when I went to
write a compiler in it), but for some of the AI projects I know a little
about, LISP seems (from a distance, admittedly) to be a better match.
Noel
On Feb 15, 2018, Ian Zimmerman <itz(a)very.loosely.org> wrote:
>>
>> So, how's this relevant to Unix? Well, I'd like to know more about the
>> historical interplay between the Unix and Lisp communities. What about
>> the Lisp done at Berkeley on the VAX (Franz Lisp).
>
> I know one of the Franz founders, I'll ask him when I have a chance.
There is some information about Franz Lisp and its origins here:
http://www.softwarepreservation.org/projects/LISP/maclisp_family/#Franz_Lis…
(And lots more information about many other varieties of Lisp at the same web site.)
On Sat, Feb 3, 2018 at 5:59 PM, Dave Horsfall <dave(a)horsfall.org> wrote:
> On Sat, 3 Feb 2018, Arthur Krewat wrote:
>
>> I would imagine that Windows wouldn't be what it is today without UNIX.
>> Matter of fact, Windows NT (which is what Windows has been based on since
>> Windows ME went away) is really DEC's VMS underneath the covers at least to
>> a small extent.
>>
>
> I thought that NT has a POSIX-y kernel, which is why it was so reliable?
> Or was VMS a POSIX-like system? I only used it for a couple of years in
> the early 80s (up to 4.0, I think), and never dug inside it; to me, it was
> just RSX-11/RSTS-11 on steroids.
The design of the original NT kernel was overseen by Dave Cutler, of VMS
and RSX-11M fame, and had a very strong and apparent VMS influence. Some
VAX wizards I know told me that they saw a lot of VMS in NT's design, but
that it probably wasn't as good (different design goals, etc: apparently
Gates wanted DOS++ and a quick time to market; Cutler wanted to do a *real*
OS and they compromised to wind up with VMS--).
It's true that there was (is? I don't know anymore...) a POSIX subsystem,
but that seemed more oriented at being a marketing check in the box for
sales to the US government and DoD (which had "standardized" on POSIX and
made it a requirement when investing in new systems).
Nowadays, I understand that one can run Linux binaries natively; the
Linux-compatibility subsystem will even `apt-get install` dependencies for
you. Satya Nadella's company isn't your father's Microsoft anymore. VSCode
(their new snazzy editor that apparently all the kids love) is Open Source.
Note that there is some irony in the NT/POSIX thing: the US Government
standardized on Windows about two decades ago and now can't seem to figure
out how to get off of it.
A short story I can't resist telling: a couple of years ago, some folks
tried to recruit me back into the Marine Corps in some kind of technical
capacity. I asked if I'd be doing, you know, technical stuff, and was told
that, since I was an officer, no, I wouldn't. Not really interested. I ended
up going to a bar with a recon operator (Marine special operations) to get
the straight scoop and talking to a light colonel (that's a Lieutenant
Colonel) on the phone for an hour for the hard sell. Over a beer, the recon
bubba basically said, "It was weird. I went back to the infantry." The
colonel kept asking me why I didn't run Windows: "but it's the most popular
operating system in the world!" Actually, I suspect Linux and BSD in the
guise of iOS/macOS is running on a lot more devices than Windows at this
point. I didn't bother pointing that out to him.
>> Would VMS become what it was without UNIX's influence? Would UNIX become
>> what it later was without VMS?
>>
>> Would UNIX exist, or even be close to what it became without DEC?
>>
>
> I've oft wondered that, but we have to use a new thread to avoid
> embarrassing Ken :-)
>
The speculation of, "what would have happened?" is interesting, though of
course unanswerable. I suspect that had it not been for Unix, we'd all be
running software that was closer to what you'd find on a mainframe or RT-11.
- Dan C.