> From: Ralph Corderoy
> Then the real definition, ending in an execution of the empty `q'.
> qq/4$^Ma2^[@qq
Gah. That reminds me of nothing so much as TECO (may it long Rest in Peace).
Noel
>Speaking of which, am I the only one annoyed by Penguin/OS' silly coloured
"ls" output?
Syntax coloring, of which IDEs seem to be enamored, always
looks to me like a ransom note. For folks who like colorized
text, Writers Workbench had a tool that could be harnessed to
do a bang-up job of syntax colorizing for English: "parts"
did a remarkable job of inferring parts of speech in running
text.
Doug
I started with V6 Unix at UC Santa Barbara in 1977. I remember
that when V7 came out, I learned about the 'make' program and
started using it with great success to help me build a large
Fortran package for signal processing.
For its size, there was a lot going on in Santa Barbara at that
time. It was one of the first 4 Arpanet nodes, and there were
a bunch of companies making networking products and doing speech
research as a result.
I was a student at UC Santa Barbara but I started toying with
the idea of finding a real job, mostly to make more money.
I found several possibilities and went to interview at one.
This place had a need for somebody to, in essence, be a
human 'make' program. The computer they used, some kind of
Data General, was so slow that they couldn't do a build more than
once or twice a day. So, in an attempt to speed up the build,
they wanted to hire somebody who would, by hand, keep track
of the last modification date of all the components in the
package they sold, and do a build that only performed
the necessary steps to generate the package - in other
words, a human 'make' program. Apparently they figured that
this would save enough time to justify the $24K salary they
were willing to pay. $24K in 1978 wasn't a bad salary at all.
I didn't take the job, but I've often thought that what I should
have done would have been to take the job under the condition
that I could mostly work remotely. Then, I could have used the
'make' program on our V7 Unix system to generate the optimal
script to build the package, and then taken the script back
to the company to run on the Data General computer. I figure
this would have taken maybe an hour a day. The rest of the time
I could have spent on the beach thinking about ways to spend that
$24K.
Jon Forrest
The infamous Morris Worm was released in 1988; making use of known
vulnerabilities in Sendmail/finger/RSH (and weak passwords), it took out a
metric shitload of SUN-3s and 4BSD Vaxen (the author claimed that it was
accidental, but the idiot hadn't tested it on an isolated network first).
A temporary "condom" was discovered by Rich Kulawiec with "mkdir /tmp/sh".
--
Dave Horsfall DTM (VK2KFU) "Those who don't understand security will suffer."
Hi All.
In 1983, while a grad student at Ga Tech, I did some contract programming
at Southern Bell. The system was a PDP 11/70 running USG Unix 4.0 (yes,
it existed! That's another story.)
Beside ed, the system had a screen editor named 'se' (not related to the
Ga Tech 'se' screen editor). It apparently was written within AT&T.
ISTR that it was written mainly for Vaxen but had been squeezed and made to
run on the PDP 11.
Did anyone else ever use this? Know anything about it? I never saw it
or heard about it again; it didn't ship with System V.
Thanks,
Arnold
I am somewhat embarrassed to admit that this just occurred to me. Is the
reason that SIGKILL has the numeric value 9 because cats are reported to
have nine lives? Clearly the connection between 'cat' and 'kill -9' would
make for an irreverent but harmless inside joke if so....
- Dan C.
> I especially liked the bit in which Tom's virus infected a multi-level secured UNIX system that Doug McIlroy and Jim Reeds were developing which they didn't spot until they turned on all their protections ... and programs started crashing all over the place.
That's not quite right. The system was running nicely with a
lattice-based protection system (read from below, write to above)
working fine. Processes typically begin at lattice bottom, but
move to higher levels depending on what data they see (including,
of course, any exec-ed file). All the standard utilities, being
usable by anyone, are at lattice bottom.
Cool, until you realize that highly trusted system programs
such as sudo are at lattice bottom and are protected only by
the old rwx bits, not by the read-write rules. So we followed
an idea of Biba's: integrity rules are the opposite of
secrecy rules. You want to forbid writing to high-integrity
places and reading from low-integrity places.
This was done by setting the default security level away from
the lattice bottom. High-integrity stuff was below this floor;
high-secrecy above.
The Duff story is about the day we moved the floor off bottom.
An integrity violation during the boot sequence stopped the
system cold. Clearly we'd misconfigured something. But no, after
a couple of days of fruitless search, Jim Reeds lit up, "We
caught a virus!" We were unaware of Duff's experiment. He had
been chagrined when it escaped from one machine, but was able
to decontaminate all the machines in the center. Except ours,
which was not on the automatic software distribution list, since
it was running a different system.
> From: Andy Kosela
> That is why MIT and Bell Labs represented two very different cultures.
Oi! Not _everyone_ at MIT follows the "so complicated that there are no
obvious deficiencies" approach (to quote Hoare's wonderful aphorism from his
'Emperor's Old Clothes' Turing Award Lecture).
My personal design mantra (it's been at the top of my home page for decades)
is something I found as a footnote in Corbato and Saltzer, 'Multics: The First
Seven Years': "In anything at all, perfection has been attained, not when
there is nothing left to add, but when there is nothing left to take away..."
No doubt some people would be bemused that this should be in a Multics paper,
given the impression people have of Multics as incredibly - overly -
complicated. I'll avoid that discussion for the moment...
I've often tried to understand why some people create these incredibly
complicated systems. (Looking at the voluminous LISP Machine manual set from
Symbolics particularly caused this train of thought...) I think it's because
they are too smart - they can remember all that stuff.
Maybe my brain isn't like that (or perhaps I use large parts of it for other
stuff, like Japanese woodblock prints :-), but I much prefer simpler things.
Or maybe I'm just basically lazy, and like simpler things because they are
easier...
Noel
Hi,
ed(1) pre-dates pipes. When pipes came along, stderr was needed, and
lots of new idioms were found to make use of them. Why didn't ed gain a
`filter' command to accompany `r !foo' and `w !bar'?
To sort this paragraph, I
;/^$/w !sort >t
;/^$/d
-r t
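Those three commands boil down to: write the paragraph through sort into a scratch file t, delete the original lines, then read t back in. A sketch of the same three steps in plain shell (scratch file names, not ed itself):

```shell
dir=$(mktemp -d) && cd "$dir"
printf 'banana\napple\ncherry\n' > para   # the "paragraph"
sort < para > t     # like `;/^$/w !sort >t`: filter the paragraph into t
: > para            # like `;/^$/d`: delete the original lines
cat t >> para       # like `-r t`: read the sorted text back in
cat para
```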
I'd have thought that filtering was common enough to suggest a `^'
command with an implied `!'? (Not `|' since that was uncommon then.)
ex(1) has `!' that filters if applied to a range of lines, and this
carries through to vi's `!' that's often heavily used, especially when
the "file" is just a scratch buffer of commands, input, and output.
--
Cheers, Ralph.
https://plus.google.com/+RalphCorderoy
There's a story I heard once in supercomputing circles from the 80s, that
Ken visited CRI in Minneapolis, sat down at the console of a machine
running the then-new port of Unix to one of the Crays, typed a command, and
said something like "ah, that bug is still there."
Anybody know what the bug was?
It's time to assert my editorial control and say: no more 80 cols please!
Anybody who mentions 80 cols will be forced to use either a Hazeltine or
an ADM3 (not 3a) for a month.
Thanks, Warren
Jim "wkt" Moriarty:
> Anybody who mentions 80 cols will be forced to use either a Hazeltine or
> an ADM3 (not 3a) for a month.
=====
So who has a modern emulator for either of those terminals?
Norman Wilson
Toronto ON
(Still not really in Toronto, but no longer in Texas)
Does anyone know if the image
http://www.tuhs.org/Archive/Distributions/Research/Dennis_v6/v6root.gz
is somehow bootable as-is?
I wasn't able to figure out how to get it to boot, so I went on a quest
to make it bootable. Here's what I did - let me know if this was
overkill or misguided.
Basically, I downloaded the known bootable v6 distribution tape from
Ken Wellsch's directory in TUHS. I then extracted 101 blocks from the image
(tmrk, a bootblock, and who knows what else, but seriously what else is
on those first 100 blocks?), converted it to a simh compatible tape
format, and booted a simh generic pdp11/40 with my new little boot tape
and Dennis's root disk attached. I used tmrk to copy the bootstrap from
my little tape to Dennis's root disk (am I clobbering anything
important?). Then voila - it was bootable :)! I could have done it
straight off Ken's tape (after converting it to a simh tape format), but
I wanted to keep the little tape image around for use in other contexts.
Details for the curious are here:
https://decuser.github.io/bootable-tape-v6.txt
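For reference, the simh side of that setup can be sketched as an init script along these lines (the image file names here are hypothetical):

```
; generic PDP-11/40 with the 101-block boot tape on TM0
; and Dennis's root disk image on RK0
set cpu 11/40
attach tm0 boottape.tap
attach rk0 v6root.dsk
boot tm0
```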
I thought the Ken Wellsch tape was basically the same as the Dennis
Ritchie disks, but now I'm not so sure - on Ken's tape, it boots to:
@rkunix
mem = 1035
RESTRICTED RIGHTS
Use, duplication or disclosure is subject to
restrictions stated in Contract with Western
Electric Company, Inc.
#
on Dennis' it boots to:
@rkunix
mem = 1036
#
Makes me curious to see what else is different. Maybe Dennis's was prior
to preparing an official distro where the rights were added to the kernel?
Will
--
GPG Fingerprint: 68F4 B3BD 1730 555A 4462 7D45 3EAA 5B6D A982 BAAF
Nemo:
And for that reason, I have never used Python. (I have a mental block
about that.)
====
I used to feel the same way. A few years ago I held my nose
and jumped in. I'm glad I did; Python is a nifty little
language that, now that I know it, hits a sweet spot twixt low-level
C and high-level shell and awk scripts.
Denoting blocks solely by indentation isn't at all bad once
you do it; no worse than adapting from do ... end to C's {}.
What still bugs me about Python:
-- It is unreasonably messy to give someone else a copy of
a program composed of many internal modules. Apparently
you are expected to give her a handful of files, to be
installed in some directory whose name must be added to
the search path in every Python source file that imports
them. I have come up with my own hacky workaround but it
would be nice if the language provided a graceful way to,
e.g., catenate multiple modules into a single source file
for distribution.
-- I miss C's style of for loop, though not often. (Not
quite everything can be turned into a list or an iterator.)
-- I miss one particular case of assignment having a value:
that of
while ((val = function()) != STOP)
do something with val
Again, there are clunky ways to work around this.
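The usual clunky workaround is the explicit loop-and-break form; a sketch (STOP and next_item are hypothetical names standing in for the C fragment above):

```python
STOP = None

def make_reader(items):
    """Return a next_item() that yields items, then STOP when exhausted."""
    it = iter(items)
    def next_item():
        return next(it, STOP)
    return next_item

next_item = make_reader([1, 2, 3])
results = []
while True:
    val = next_item()
    if val == STOP:
        break
    results.append(val)   # "do something with val"
print(results)            # [1, 2, 3]
```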
As for 80 columns, I'm firmly in the camp that says that
if you need a lot of indentation you're doing it wrong.
Usually it means you should be pulling the details out
into separate functions. Functions that run on for many,
many lines (once upon a time it was meaningful to say
for many pages, but no longer) are just as bad, for the
same reason: it's hard to read and understand the code,
because you have to keep so many details in your head at
once.
Likewise for excessive use of global variables, for that
matter, a flaw that is still quite common in C code.
Having to break an expression or a function call over
multiple lines is more problematic. It's clearer not
to have to do that. It helps not to make function or
variable names longer than necessary, but one can carry
that too far too.
Style and clarity are hard, but they are what distinguishes
a crap hack programmer from a good one.
Norman Wilson
Toronto ON
(Sitting on the lower level of a train in Texas,
not on a pedestal)
So, 80 column folks, would you find this
a(b,
c,
d)
more readable than
a(b,c,d)
(this is a real example, with slightly shortened names)
would you have code review software that automatically bounces out lines
that are 82 columns wide? How far does this go?
I do recall 80 column monitors, but I started on 132 column DECwriter IIs
and hence have never had sympathy for 80 columns. It's weird that so many
punched-card standards are required in our code bases now (see: Linux).
moving away from serious ... (look for Presotto's "I feel so liberated" ...)
http://comp.os.plan9.narkive.com/4W8iThHW/9fans-acme-fonts
Hi,
Everyone on the list is well aware that running V7 in a modern simulator
like SIMH is not a period realistic environment and some of the
"problems" facing the novice enthusiast are considerably different from
those of the era (my terminal is orders of magnitude faster and my
"tape" is a file on a disk). However, many of the challenges facing
someone in 1980, remain for the enthusiast, such as how to run various
commands successfully and how devices interoperate with unix. Of course,
we do have resources and some overlapping experience to draw on -
duckduckgo (googleish), tuhs member experience, and exposure to modern
derivatives like linux, macos, bsd, etc. We also have documentation of
the system in the form of the Programmer's Guide - as pdfs and to some
degree as man pages on the system (haven't found volume 2 documentation
on the instance).
My question for you citizens of that long-ago era :), is this - what was
it like to sit down and learn unix V7 on a PDP? Not from a hardware or
ergonomics perspective, but from a human information processing
perspective. What resources did you consult in your early days, and what
did the workflow look like in practical terms?
As an example - today, when I want to know how to accomplish a task in
modern unix, I:
1. Search my own experience and knowledge. If I know the answer, duh, I
know it.
2. Decide if I have enough information about the task to guess at the
requisite commands. If I do, then man command is my friend. If not,
I try man -k task or apropos task where task is a one word summary
of what I'm trying to accomplish.
3. If that fails, then I search for the task online and try what other
folks have done in similar circumstances.
4. If that fails, then I look for an OS specific help list
(linux-mint-help, freebsd forums, etc), do another search there, and
post a question.
5. If that fails, or takes a while, and I know someone with enough
knowledge to help, I ask them.
6. I find and scan relevant literature or books on the subject for
something related.
Repeat as needed.
Programming requires some additional steps:
1. look at source files including headers and code.
2. look at library dependencies
3. ask on dev lists
but otherwise, is similar.
In V7, it's trickier because apropos doesn't exist (nor does its functional
equivalent, man -k, for that matter), and books are hard to find (most deal
with System V or BSD). I do find the command 'find /usr/man -name "*" -a
-print | grep task' to be useful in finding man pages, but it's not as
general as apropos.
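That pipeline can be slimmed a little, since the -name "*" -a is redundant (every name matches). A sketch against a throwaway directory tree standing in for /usr/man:

```shell
dir=$(mktemp -d)
mkdir -p "$dir/man1" "$dir/man6"
touch "$dir/man1/sort.1" "$dir/man1/ed.1" "$dir/man6/chess.6"
# print every file's path, filtered by the task keyword
find "$dir" -print | grep 'sort'
```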
So, what was the process of learning unix like in the V7 days? What were
your goto resources? More than just man and the sources? Any particular
notes, articles, posts, or books that were really helpful (I found the
article, not the book, "The Unix Programming Environment" by Kernighan
and Mashey, to be enlightening
https://www.computer.org/csdl/mags/co/1981/04/01667315.pdf)?
Regards,
Will
Reading in AUUGN vol 1 number 4, p. 15, in a letter dated April 5,
1979, from Alistair Kilgour, Glasgow writing to Ian Johnstone, New South
Wales about a Unix meeting in the U.K. at the University of Kent at
Canterbury (150 attended the meeting) with Ken Thompson and Brian
Kernighan...
Two paragraphs that I found interesting and fun:
Most U.K. users were astonished to hear that one thing which has
_not_ changed in Version 7 is the default for "delete character" and
"delete line" in the teletype handler - we thought we'd seen the last of
# and @! What was very clear was that version 7 is a "snapshot" of a
still developing system, and indeed neither speaker seemed quite sure of
when the snapshot was taken or exactly what it contained. The general
feeling among users at the meeting was that the new tools provided with
version 7 were too good to resist (though many had doubts about the new
Shell). We were however relieved by the assurance that there would
_never_ be a version 8!
...
Finally a quotation, attributed to Steve Johnstone, with which
Brian Kernighan introduced his excellent sales campaign for Unix on the
first day of the conference: " Using TSO is like kicking a dead whale
along the beach". Unix rules.
...
I knew it, it's not just me - those pesky # and @ characters were and
still are really annoying! Oh, and never say never. Unix does rule :).
Will
--
GPG Fingerprint: 68F4 B3BD 1730 555A 4462 7D45 3EAA 5B6D A982 BAAF
I’m trying to understand the origins of void pointers in C. I think they first appeared formally in the C89 spec, but may have existed in earlier compilers.
Of course, as late as V7 there wasn’t much enforcement of types and hence no need for the void* concept and automatic casting. I suppose ‘lint’ would have flagged it though.
In the 4BSD era there was caddr_t, which I think was used for pretty much the same purposes. Did ‘lint’ in the 4BSD era complain about omitted casts to and fro’ caddr_t?
Background to my question is research into the evolution of the socket API in 4.1x BSD and the persistence of ‘struct sockaddr *’ in actual code, even though the design papers show an intent for ‘caddr_t’ (presumably with ‘void*’ semantics, but I’m not sure).
Paul
> From: Will Senn
> what was it like to sit down and learn unix V7 on a PDP? ... What
> resources did you consult in your early days
Well, I started by reading through the UPM (the 8-section thing, with commands
in I, system calls in II, etc). I also read a lot of Unix documentation which
came as larger documents (e.g. the Unix Intro, C Tutorial and spec, etc).
I should point out that back then, this was a feasible task. Most man pages
were really _a_ page, and often a short one. By the end of my time on the PWB1
system, there were about 300 commands in /bin (which includes sections II, VI
and VIII), but a good chunk (I'd say probably 50 or more) were ones we'd
written. So there were not that many to start with (section II was maybe 3/4"
of paper), and you could read the UPM in a couple of hours. (I read through it
more than once; you'd get more retained, mentally, on each pass.)
There were no Unix people at all in the group at MIT which I joined, so I
couldn't ask around; there were a bunch in another group on the floor below,
although I didn't use them much - mostly it was RTFM.
Mailing lists? Books? Fuhgeddaboutit!
My next step in learning the kernel was to start reading the sources. (I
didn't have access to Lyons.) I did a 'cref' of the entire system, and
transferred the results to a large piece of paper, so I could see who was
calling who in the kernel.
> What were your goto resources? More than just man and the sources?
That's all there was!
I should point out that reading the sources to command 'x' taught you more
than just how 'x' worked - you saw how people interacted with the kernel, what
it could do, etc, etc.
Noel
> I'd been moving in this direction for a while
Now that I think about it, I may have subconsciously absorbed this from Ken's
and Dennis' code in the V6 kernel. Go take a look at it: more than one level
of indent is quite rare in anything there (including tty.c, which has some
pretty complex stuff in it).
I don't know if this was subconscious but deliberate, or conscious, or just
something they did for other reasons (e.g. typing long lines took too long on
their TTY's :-), but it's very noticeable, and consistent.
It's interesting that both seem to have had the same style; tty.c is in dmr/, so
presumably Dennis', but the stuff in ken/ is the same way.
Oh, one other trick for simplifying code structure (which I noticed looking
through the V6 code a bit - this was something they _didn't_ do in one place,
which I would have done): if you have
if (<condition>) {
<some complex piece of code>
}
<implicit return>
}
invert it:
if (<!condition>)
return;
<some complex piece of code>
}
That gets that whole complex piece of code down one level of nesting.
Noel
I sometimes put the following in shell scripts at the beginning
> /tmp/foo
2>/tmp/foo_err
Drives some folks up the wall because they don’t get it.
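For the record, what those two lines do: a redirection with no command word still executes (as an empty command), so each line simply creates or truncates its file before anything appends to it. A sketch using a scratch directory in place of /tmp:

```shell
dir=$(mktemp -d)
> "$dir/foo"         # create/truncate the output file
2> "$dir/foo_err"    # create/truncate the error file
echo "first entry" >> "$dir/foo"
cat "$dir/foo"       # prints: first entry
```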
David
> On Nov 8, 2017, at 3:21 PM, tuhs-request(a)minnie.tuhs.org wrote:
>
> From: Dave Horsfall <dave(a)horsfall.org>
> To: The Eunuchs Hysterical Society <tuhs(a)tuhs.org>
> Subject: Re: [TUHS] pre-more pager?
>
> On Wed, 8 Nov 2017, Arthur Krewat wrote:
>
>> head -20 < filename
>
> Or if you really want to confuse the newbies:
>
> < filename head -20
I am trying to find a paper. It was written at Bell Labs,
I thought by Bill Cheswick (though I cannot find it in his name),
entitled something like:
"A hacker caught, and examined"
A description of how a hacker got into Bell Labs, and was quarantined on
a single workstation whilst the staff watched what they did.
Does this ring any bells? Anyone have a link?
I know about the Cuckoo's Egg, but this was a paper, in troff and -ms macros
as I remember, not a book.
Thanks,
-Steve