Any takers for a (free) two-volume 7th Ed manual (1983), or ring-bound 8th Ed (1985), or PDP11 processor handbook (1981)? These would need to be picked up in Lindfield, Sydney, Australia. Condition is fair: they've been in storage for 35 years so are slightly mouldy, but still perfectly usable. Images at http://jon.es/other/7th-ed.jpg and http://jon.es/other/8th-ed.jpg. If you’d like them, let me know in email ASAP please.
Regards,
Terry Jones
> From: Angelo Papenhoff
> to my knowledge no troff version before the C rewrite in v7
Apologies if I missed something, but between this list and COFF there's so
much low S/N traffic I skip a lot of it. Having said that, was there ever a
troff in assembler? I'd always had the impression that the first one was in C.
> The v6 distribution has deleted directory entries for troff source but
> not the files themselves. I hope it is not lost. Maybe someone here has
> an idea where it could be found?
The MIT 'V6+' (I think it's probably basically PWB1) system had troff -
I guess it 'fell off the back of a truck', like a lot of other stuff MIT had,
such as 'typesetter C', the Portable C Compiler, etc.
Theirs was modified to produce output for a Varian (I forget which model,
maybe the docs or driver say).
nroff on that system seems to have been generated from the troff sources; the
assembler nroff sources aren't present.
I looked at its n1.c, and compared it to the V7 one:
https://minnie.tuhs.org/cgi-bin/utree.pl?file=V7/usr/src/cmd/troff/n1.c
and this one appears to be slightly earlier; e.g. it starts:
#include "tdef.h"
#include "t.h"
#include "tw.h"
/*
troff1.c
consume options, initialization, main loop,
input routines, escape function calling
*/
extern int stdi;
and in the argument processing, it handles quite a lot fewer options.
So that one is a "troff version before the C rewrite in .. v7", but it is in
C. Is that of any interest?
Noel
Most of y'all are aware of Brian Kernighan's troff involvement. My
understanding is that he pretty much took over nroff/troff after Joe Ossanna
died, and came out with ditroff.
But Brian had much earlier involvement with non-UNIX *roff. When he was
pursuing his PhD at Princeton, he spent a summer at MIT using CTSS and
RUNOFF. When he came back to P'ton, he wrote a ROFF for the IBM 7094,
later translated to the IBM 360. Many generations of students, myself
included, used the IBM ROFF (batch, not interactive) as a much friendlier
alternative to dumb typewriters. I don't know if 360 ROFF spread beyond
Princeton, but I wouldn't be surprised.
BTW, during my summer at Bell, nroff/troff was one of the few programs I
could not port to the Interdata 8/32 - it was just a mess of essentially
typeless code. I don't think Joe Ossanna got around to it either before he
died.
--
- Tom
Hello All.
We recently discussed Brenda Baker's struct program, that read Fortran
and generated Ratfor. Many of us remarked as to what a really cool
program it was and how much we admired it, myself included.
For fun (for some definition of "fun") I decided to try to bring the code
into the present. I set up a GitHub repo with the V7, V8 and V10 code,
and then started work in a separate branch.
(https://github.com/arnoldrobbins/struct, branch "modernize".)
The program has three main parts:
- structure, which reads Fortran and outputs something that is
almost Ratfor on standard output.
- beautify, which reads the output of structure and finishes the job,
primarily making conditions readable (.not. --> !, removing double
negatives, etc.)
- struct.sh - a simple shell script that runs the above two components.
This is what the user invokes.
The code was written in 1974. As such, it is rife with "type punning"
between int, int *, int **, and char *. These produce a lot of warnings
from a modern day C compiler. The code also uses a, er, "unique" bracing
style, making it nearly illegible to my stuck-in-a-rut programming brain.
Here is what I've accomplished so far:
* Converted every function definition and declaration to use modern (ANSI)
C style, adding a header file with function declarations that is
included everywhere.
* Run all the code through the indent program, formatting it as traditional
K&R bracing style, with tabs.
* Fixed some macros to use modern style for getting parameter values as strings
into the macros.
* Fixed a few small bugs here and there.
* Fixed beautify to work with modern byacc/bison (%union) and to work with
flex instead of lex. This latter was a challenge.
In structure, only three files still generate warnings, but they all relate
to integer <--> pointer assignments and uses. However, when compiled in
32 bit mode (gcc -m32), where sizeof(int) is the same as sizeof(pointer),
despite the warnings, structure works!!
Beautify works, whether compiled in 32 or 64 bit mode.
What I've done so far has been mostly mechanical. I hereby request help from
anyone who's interested in making progress on "the hard part" --- those three
files that still generate warnings.
I think the right way to go is to replace ints with a union that holds an
int, a char* and an int*. But I have not had the quiet time to dive into
the code to see if this can be done.
Anyone who has some time to devote to this and is interested, please drop
me a note off-list.
Thanks,
Arnold Robbins
This is clearly getting off track of TUHS. I'll stop
after this reply.
> *From:* Blake McBride <blake1024(a)gmail.com>
> *Date:* January 11, 2022 at 2:48:23 PM PST
> *To:* Jon Forrest <nobozo(a)gmail.com>
> *Cc:* TUHS main list <tuhs(a)minnie.tuhs.org>
> *Subject:* *[TUHS] TeX and groff (was: roff(7))*
> Although I'm not connected with the TeX community, I don't agree with
> much of what you said.
>
> 1. TeX source to C is fine - stable and works. It would be
> impossible to rewrite TeX in any other language without introducing
> bugs and incompatibilities. Leaving TeX as-is means that it can be
> converted to other languages too if/when C goes out of style. TeX
> as-is is exactly what it is. Anything else wouldn't be TeX.
I agree that Web->C works but it's a major obstacle in doing any
development work on TeX. Try making a major change in the Web source
that requires debugging.
Anything that can pass the TeX Trip Test can be called TeX. I know of
a full C reimplementation that passes the test but the author doesn't
want to make it free software.
There are other rewrites out there that could be candidates but someone
with enough power will have to proclaim one as the official TeX
alternative.
> 2. Drop DVI? Are you kidding me? Although PDF may be popular now,
> that may not be the case 20 years from now. A device-independent
> format is what is needed, and that's what DVI is. TeX is guaranteed
> to produce the exact same output 100 years from now.
And .PDF isn't?
.DVI was great until .PDF matured. .DVI has almost no penetration
these days, whereas .PDF is everywhere. I'm not saying that .PDF
will always be the proper alternative but a properly rewritten TeX
should make it much easier to replace .PDF with whatever comes
next.
> 3. I am curious about memory limitations within TeX.
TeX has various fixed sized memory pools, and contains clever code
to work around limited memory. Some of the newer TeXs,
like LuaTeX, use dynamic allocation but this isn't official.
Given how primitive things were when TeX was developed it's a
miracle it works as well as it does.
> 4. Knuth is getting up in age. Someone will have to take over.
Exactly. I don't follow the TeX community so I don't know what
they're doing about this.
Jon Forrest
I've been meaning to ask about this for a while....
"... The reason why is because there was tremendous antagonism between New
York and L.A. L.A. was, you know, full of color, full of acid, full of
hippies, and we were not like that.
We dressed in black and white. We did not like free love. ..... We took
amphetamine; they took LSD. They were, you know, sort of loving and happy,
and we were - we weren't really evil, we were more intellectual, more about
art."
[Mary Woronov, in an interview with NPR's Terry Gross on "Fresh Air",
talking about New York City, Warhol's Factory and shows in Los Angeles
while touring with the Velvet Underground:
http://www.npr.org/templates/transcript/transcript.php?storyId=241437872]
Note: I am not suggesting that anyone involved with Unix ever took
amphetamines, nor, despite the usual crack about LSD and BSD, that anyone
on the west coast was taking acid, though Markov's "What the Dormouse Said"
would indicate that many of you WERE tripping.
It seems like Unix is largely a child of the coasts. Notable work in Utah,
Colorado and Chicago aside, it seems the bulk of early Unix work happened
in either the greater New York metro area in northern New Jersey or the
greater Bay area around San Francisco. Notable work was also done in
Massachusetts, but again, that's a coastal state and I think it's fair to
say that most of that was inside the route 128 corridor. Of course work was
done internationally, but I'm particularly curious about differences in US
culture here, and how they influenced things.
The question is, to what extent did differences in coastal cultures
influence things like design aesthetics? I think it is accurate to
characterize early BTL Unix by its minimalism, and others have echoed this
(cf. Richard Gabriel in the "Worse is Better" papers). But similarly, BSD
has always felt like a larger system -- didn't Lions go as far as to quip
about the succinctness of 6th Edition being "fixed" by 4BSD?
Anyway, I believe it is fair to say that early Unix has a rather distinct
feel from later BSD-derived systems and the two did evolve in different
geographic locations. Furthermore, the world was not as connected then as
it is now.
So to what extent, if any, was this a function of the larger cultural
forces at play near where that work was taking place?
- Dan C.
> If I can be so bold as to offer an interpretation: Doug's approximations
> treat ellipses as mathematical objects and algorithmically determine what
> pixels are closest to points on the infinitesimally-thin curves, while
> Knuth's (or one of his students') method acknowledges that the curve has a
> width defined by the nib
Just so.
> I find it impossible that Knuth or Hobby were unaware of McIlroy's
> work and vice-versa; of course he would have known about and examined troff
> just as the Bell Labs folks knew about TeX.
We were generally aware of each other's work. My papers on drawing
lines, circles, and ellipses on rasters, though, were barely connected
to troff. Troff did not contain any drawing algorithms. That work was
relegated to the rendering programs that interpreted ditroff output.
Thus publication-quality rendering with support for thick lines was
outsourced to Adobe and Mergenthaler.
Various PostScript or ditroff postprocessors for screen-based
terminals were written in house. These programs paid little or no
attention to fonts and line widths. But the blit renderers made a
tenuous connection between my ellipse algorithm and troff, since my
work on the topic was stimulated by Rob's need for an ellipse
generator.
Doug
> From: Bakul Shah
> My guess is *not* storing a path instead of a ptr to the inode was done
> to save on memory.
More probably speed; those old disks were not fast, and on a PDP-11, disk
caches were so small that converting the path to the current directory to its
in memory inode could take a bunch of disk reads.
> Every inode has a linkcount so detecting when the last conn. is severed
> is not a problem.
Depends; if a directory _has_ to be empty before it can be deleted, maybe; but
if not, no. (Consider if /a/b/c/d exists, and /a/b is removed; the tree
underneath it has to be walked and the components deleted. That could take a
while...) In the general case (e.g. without the restriction to a tree), it's
basically the same problem as garbage collection in LISP.
Noel
> From: Dan Cross
> a port of the _CTSS_ BCPL ROFF sources purportedly written by Doug. I
> wonder if that was actually a thing, or an error?
> ...
> Fortunately, the source [of the original CTSS runoff] is online:
> ...
> Indeed; one finds the following in at least one of the Multics RUNOFF
> source files:
It sounds like all the steps in the chain have pretty definitive evidence -
_except_ whether there was ever a CTSS RUNOFF in BCPL, from Doug.
Happily, we have someone here who should be able to answer that! :-)
Noel
Been reading the heirloom docs. Remember one thing that I disliked
about troff which maybe Doug can explain. It's the language in the
docs. I never understood "interpolating a register" to have any
relation to the definition of interpolate that I learned in math.
Made it a bit hard to learn initially.
Any memory of why that term was used?