Spurred on by Bryan, I thought I should properly introduce myself:
I am a fairly young Unix devotee, having gotten my start with System V on a Wang word processing system (believe it or not, they made one!), at my mother’s office, in the late 1980s. My first personal system, which ran SLS Linux, came about in 1992.
I am a member of the Vintage Computing Federation, and have given talks and made exhibits on Unix history at VCF’s museum in Wall, New Jersey. I have also had the pleasure of showing Brian Kernighan and Ken Thompson, two of my computing heroes, my exhibit on the origins of BSD Unix on the Intel 386. I learned C from Brian’s book, as many others here probably did.
I have spent my entire professional career supporting Unix, in some form or another. I started with SunOS at the National Institutes of Health, in Bethesda, Maryland, and moved on to Solaris, HP-UX, SCO, and finally Linux. I worked for AT&T, in Virginia, in the early 2000s, but there were few vestiges of Unix present, other than some 3B1 and 3B2 monitors and keyboards.
I currently work for Red Hat, in Tyson’s Corner, Virginia, as a principal sales engineer, where I spend most of my time teaching and presenting at conferences, both in person and virtual.
Thank you to everyone here who created the tools that have enabled my career and love of computing!
- Alexander Jacocks
Hello!
I have just joined this mailing list recently, and figured I would give
an introduction to myself.
My first encounter with Unix took place in 2006 when I started my
undergraduate studies in Computer Science. The main servers all ran
Solaris, and we accessed them via thin clients. Eventually I wanted a
similar-feeling operating system for my personal computer, so that I
could do my assignments without having to always log into the school
servers, and so I came across Linux. I hopped around for a while, but
eventually settled with Slackware for my personal computers. Nowadays I
run a mixture of Linux and BSD operating systems for various purposes.
Unfortunately my day job has me writing desktop software for Windows (no
fun there :(), so I'm thankful to have found a group of people with
computing interests similar to my own, and I look forward to chatting
with you all!
Regards,
Bryan St. Amour
OK, this is my last _civil_ request to stop email-bombing both lists with
traffic. In the future, I will say publicly _exactly_ what I think - and if
screens still had phosphor, it would probably peel it off.
I can see that there are cases when one might validly want to post to both
lists - e.g. when starting a new discussion. However, one of the two should
_always_ be BCC'd, so that simple use of reply won't generate a copy to
both. I would suggest that one might say something like 'this discussion is
probably best continued on the <foo> list' - which could be seeded by BCCing
the _other_.
Thank you.
Noel
http://www.cs.ox.ac.uk/jeremy.gibbons/publications/fission.pdf
Duncan Mak wrote
> Haskell's powerful higher-level functions
> make middling fragments of code very clear, but can compress large
> code to opacity. Jeremy Gibbons, a high priest of functional
> programming, even wrote a paper about deconstructing such wonders for
> improved readability.
>
I went looking for this paper by Jeremy Gibbons here:
https://dblp.org/pid/53/1090.html but didn't find anything resembling it.
What's the name of the paper?
All, I got this e-mail forwarded on from John Fox via Eric S. Raymond.
Cheers, Warren
Hi Eric, I think you might find this interesting.
I have a 2001 copy of your book. I dog-eared page 9 twenty years ago
because of this section:
It spread very rapidly within AT&T, in spite of the lack of any
formal support program for it. By 1980 it had spread to a large
number of university and research computing sites, and thousands
of hackers considered it home.
Regarding the "spread", I believe one of the contributing factors
was AT&T's decision to give the source code away to universities.
In doing so, AT&T unwittingly provided the fertile soil for open
source development.
I happen to know the man who made that decision. He was my
father-in-law. He died Tuesday. He had no idea what UNIX was, and
had no idea what his decision helped to create. It's funny when things we
do have such a major impact without us even knowing. That was
certainly true in this case.
Anyway, I thought you'd be interested to know. His name is John
(Jack) H. Bolt. He was 95.
PS, before making the decision, he called Ken Olsen at DEC to see if
he'd be interested in buying it, lock, stock, and barrel. Jack's
opening offer was $250k. Olsen wasn't interested. And on that,
Jack's decision was made.
John Fox
>> The former notation C(B(A)) became A->B->C. This was PL/I's gift to C.
> You seem to have a gift for notation. That's rare. Curious what you think of APL?
I take credit as a go-between, not as an inventor. Ken Knowlton
introduced the notation ABC in BEFLIX, a pixel-based animation
language. Ken didn't need an operator because identifiers were single
letters. I showed Ken's scheme to Bud Lawson, the originator of PL/I's
pointer facility. Bud liked it and came up with the vivid -> notation
to accommodate longer identifiers.
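
For concreteness, here is the same contrast written out in C, which inherited
the arrow; a minimal sketch with invented struct names, selecting field c of
field b of structure a both inside-out and left to right:

    #include <stdio.h>

    struct C { int value; };
    struct B { struct C *c; };
    struct A { struct B *b; };

    int main(void)
    {
        struct C c = { 42 };
        struct B b = { &c };
        struct A a = { &b };
        struct A *p = &a;

        /* inside-out, in the spirit of the older C(B(A)) reading */
        printf("%d\n", (*(*(*p).b).c).value);

        /* left to right, with the arrow Bud Lawson proposed */
        printf("%d\n", p->b->c->value);

        return 0;
    }

Both lines print 42; the arrow form simply reads in the order the selections
happen.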
If I had a real gift for notation I would have come up with the pipe
symbol. In my original notation ls|wc was written ls>wc>. Ken Thompson
invented | a couple of months later. That was so influential that
recently, in a paper that had nothing to do with Unix, I saw |
referred to as the "pipe character"!
APL is a fascinating invention, but can be so compact as to be
inscrutable. (I confess not to have practiced APL enough to become
fluent.) In the same vein, Haskell's powerful higher-level functions
make middling fragments of code very clear, but can compress large
code to opacity. Jeremy Gibbons, a high priest of functional
programming, even wrote a paper about deconstructing such wonders for
improved readability.
Human impatience balks at tarrying over a saying that puts so much in
a small space. Yet it helps once you learn it. Try reading transcripts
of medieval Arabic algebra carried out in words rather than symbols.
Iverson's hardware descriptions in APL are another case where
symbology pays off.
Doug
Hi All.
Mainly for fun (sic), I decided to revive the Ratfor (Rational
Fortran) preprocessor. Please see:
https://github.com/arnoldrobbins/ratfor
I started with the V6 code, then added the V7, V8 and V10 versions
on top of it. Each one has its own branch so that you can look
at the original code, if you wish. The man page and the paper from
the V7 manual are also included.
Starting with the Tenth Edition version, I set about modernizing
the code and getting it to compile and run on a modern-day system.
(ANSI-style declarations and function headers, modern include files,
use of getopt, and, most importantly, correct use of the Yacc yyval and
yylval variables.)
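
To give a flavor of the declaration part of that conversion, here is a
minimal sketch; the routine below is invented for illustration and is not
taken from the ratfor sources:

    #include <stdio.h>

    #ifdef KNR_STYLE
    /* old K&R style: parameter types declared after the header */
    int
    count_newlines(buf, len)
    char *buf;
    int len;
    {
        int i, n;

        n = 0;
        for (i = 0; i < len; i++)
            if (buf[i] == '\n')
                n++;
        return n;
    }
    #else
    /* ANSI style: prototype-form header, types in the parameter list */
    int
    count_newlines(const char *buf, int len)
    {
        int n = 0;

        for (int i = 0; i < len; i++)
            if (buf[i] == '\n')
                n++;
        return n;
    }
    #endif

    int main(void)
    {
        printf("%d\n", count_newlines("a\nb\n", 4));
        return 0;
    }

Defining KNR_STYLE selects the old form, so both variants can be compiled
from the one file.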
You will need Berkeley Yacc installed as byacc in order to build it.
I have only touch-tested it, but so far it seems OK. 'make' runs in
about two seconds. On my Ubuntu Linux systems, it compiles with
no warnings.
I hope to eventually add a test suite also, if I can steal some time.
Before anyone asks, no, I don't think anybody today has any real use
for it. This was simply "for fun", and because Ratfor has a soft
spot in my heart. "Software Tools" was, for me, the most influential
programming book that I ever read. I don't think there's a better
book to convey the "zen" of Unix.
Thanks,
Arnold
I believe that the PDP-11 ISA was defined at a time when DEC was still using
random logic rather than a control store (which came pretty soon
thereafter). Given a random logic design it's efficient to organize the ISA
encoding to maximize its regularity. Probably also of some benefit to
compilers in a memory-constrained environment?
I'm not sure at what point in time we can say "lots of processors" had moved
to a control-store-based implementation. Certainly the IBM System/360 was
there in the mid-'60s. HP was there by the late '60s.
-----Original Message-----
From: TUHS <tuhs-bounces(a)minnie.tuhs.org> On Behalf Of Larry McVoy
Sent: Monday, November 29, 2021 10:18 PM
To: Clem Cole <clemc(a)ccc.com>
Cc: TUHS main list <tuhs(a)minnie.tuhs.org>; Eugene Miya <eugene(a)soe.ucsc.edu>
Subject: Re: [TUHS] A New History of Modern Computing - my thoughts
On Sun, Nov 28, 2021 at 05:12:44PM -0800, Larry McVoy wrote:
> I remember Ken Witte (my TA for the PDP-11 class) trying to get me to
> see how easy it was to read the octal. If I remember correctly (and I
> probably don't, this was ~40 years ago), the instructions were divided
> into fields, so instruction, operand, operand and it was all regular,
> so you could see that this was some form of an add or whatever, it got
> the values from these registers and put it in that register.
I've looked it up and it is pretty much as Ken described. The weird thing
is that there was no need to do it the way the PDP-11 did it: you could use
essentially random encodings for each instruction, and lots of processors did
pretty much that. The PDP-11 didn't; it was so uniform that Ken's ability
to read octal made perfect sense. I was never that good, but with a little
googling and reading I can see how he got there.
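
For instance, assuming the standard double-operand layout (a 4-bit opcode,
then 6-bit source and destination fields, each a 3-bit addressing mode plus
a 3-bit register number), the octal word 060102 pulls apart by eye into
ADD R1, R2. A small sketch of the same decoding in C:

    #include <stdio.h>

    int main(void)
    {
        unsigned int word = 0060102;                 /* ADD R1, R2    */

        unsigned int opcode   = (word >> 12) & 017;  /* 06 = ADD      */
        unsigned int src_mode = (word >> 9)  & 07;   /* 0 = register  */
        unsigned int src_reg  = (word >> 6)  & 07;   /* 1 = R1        */
        unsigned int dst_mode = (word >> 3)  & 07;   /* 0 = register  */
        unsigned int dst_reg  =  word        & 07;   /* 2 = R2        */

        printf("op=%o src=%o%o dst=%o%o\n",
               opcode, src_mode, src_reg, dst_mode, dst_reg);
        return 0;
    }

Every double-operand instruction splits into the same fields, which is why
the octal can be read at sight.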
...
--lm
For DEC memos on designing the PDP-11, see bitsavers:
http://www.bitsavers.org/pdf/dec/pdp11/memos/
(thank you Bitsavers! I love that archive)
Ad van de Goor (author of a few of the memos) was my MSc thesis professor. I recall him saying in the early ’80s that in his view the PDP-11 should have been an 18-bit machine; he reasoned that even in the late ’60s it was obvious that 16 bits of address space was not enough for the lifespan of the design.
---
For those who want to experiment with FPGAs and ancient ISAs, here is my plain Verilog code for the TI 9995 chip, which has an instruction set that is highly reminiscent of the PDP-11:
https://gitlab.com/pnru/cortex/-/tree/master
The actual CPU code (TMS99095.v) is less than 1000 lines of code.
Paul