> From: Bakul Shah
> one dealt with it by formatting the disk so that the logical blocks N &
> N+1 (from the OS PoV) were physically more than 1 sector apart. No
> clever coding needed!
An old hack. ('Nothing new', and all that.) DEC RX01/02 floppies used the
same thing, circa 1976.
Noel
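(For readers who haven't met the trick: the mapping can be built entirely
in software at format time. Below is a minimal C sketch, my illustration
rather than anything from DEC or the system Bakul describes, that
constructs a 2:1 interleave table so that logically consecutive blocks are
never physically adjacent:)

    /* Build an interleave table: map[logical] = physical sector. */
    #include <stdio.h>

    #define NSEC 26                     /* sectors per track; 26 as on the RX01 */

    static void build_interleave(int map[], int n, int skew)
    {
        int used[NSEC] = {0};
        int pos = 0;

        for (int i = 0; i < n; i++) {
            while (used[pos])
                pos = (pos + 1) % n;    /* slot taken: slide to the next free one */
            map[i] = pos;
            used[pos] = 1;
            pos = (pos + skew) % n;     /* jump `skew` slots for the next block */
        }
    }

    int main(void)
    {
        int map[NSEC];

        build_interleave(map, NSEC, 2); /* 2:1 interleave */
        for (int i = 0; i < NSEC; i++)
            printf("logical %2d -> physical %2d\n", i, map[i]);
        return 0;
    }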
After my posting on Sat, 10 Dec 2022 17:42:14 -0700 about the recent
work on Kermit 10.0, some readers asked why a serial-line connection
and file-transfer tool was still of interest, and a few others
responded with use cases.
Modern Kermit has for several years supported ssh connections, as
well as Unicode. Here is the top-level command list:
% kermit
(~/) C-Kermit>? Command, one of the following:
add          define       hangup       msleep       resend       telnet
answer       delete       HELP         open         return       touch
apc          dial         if           orientation  rlogin       trace
array        directory    increment    output       rmdir        translate
ask          disable      input        pause        run          transmit
askq         do           INTRO        pdial        screen       type
assign       echo         kcd          pipe         script       undeclare
associate    edit         learn        print        send         undefine
back         enable       LICENSE      pty          server       version
browse       end          lineout      purge        set          void
bye          evaluate     log          push         shift        wait
cd           exit         login        pwd          show         where
change       file         logout       quit         space        while
check        finish       lookup       read         ssh          who
chmod        for          mail         receive      statistics   write
clear        ftp          manual       redial       status       xecho
close        get          message      redirect     stop         xmessage
connect      getc         minput       redo         SUPPORT
convert      getok        mget         reget        suspend
copy         goto         mkdir        remote       switch
date         grep         mmove        remove       tail
decrement    head         msend        rename       take
or one of the tokens: ! # ( . ; : < @ ^ {
Here are the help descriptions for connections and character-set translation:
(~/) C-Kermit>help ssh
Syntax: SSH [ options ] <hostname> [ command ]
Makes an SSH connection using the external ssh program via the SET SSH
COMMAND string, which is "ssh -e none" by default. Options for the
external ssh program may be included. If the hostname is followed by a
command, the command is executed on the host instead of an interactive
shell.
(~/) C-Kermit>help connect
Syntax: CONNECT (or C, or CQ) [ switches ]
Connect to a remote computer via the serial communications device given in
the most recent SET LINE command, or to the network host named in the most
recent SET HOST command. Type the escape character followed by C to get
back to the C-Kermit prompt, or followed by ? for a list of CONNECT-mode
escape commands.
Include the /QUIETLY switch to suppress the informational message that
tells you how to escape back, etc. CQ is a synonym for CONNECT /QUIETLY.
Other switches include:
/TRIGGER:string
One or more strings to look for that will cause automatic return to
command mode. To specify one string, just put it right after the
colon, e.g. "/TRIGGER:Goodbye". If the string contains any spaces, you
must enclose it in braces, e.g. "/TRIGGER:{READY TO SEND...}". To
specify more than one trigger, use the following format:
/TRIGGER:{{string1}{string2}...{stringn}}
Upon return from CONNECT mode, the variable \v(trigger) is set to the
trigger string, if any, that was actually encountered. This value, like
all other CONNECT switches, applies only to the CONNECT command with which
it is given, and overrides (temporarily) any global SET TERMINAL TRIGGER
string that might be in effect.
Your escape character is Ctrl-\ (ASCII 28, FS)
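A hypothetical session tying the above together (the device name, speed,
and trigger strings are all made up for illustration):

    (~/) C-Kermit>set line /dev/ttyS0
    (~/) C-Kermit>set speed 9600
    (~/) C-Kermit>connect /trigger:{{NO CARRIER}{ogin:}}
    ...
    (~/) C-Kermit>echo \v(trigger)

On return from CONNECT mode, \v(trigger) holds whichever trigger string
was actually seen, as described above.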
(~/) C-Kermit>help translate
Syntax: CONVERT file1 cs1 cs2 [ file2 ]
Synonym: TRANSLATE
Converts file1 from the character set cs1 into the character set cs2
and stores the result in file2. The character sets can be any of
C-Kermit's file character sets. If file2 is omitted, the translation
is displayed on the screen. An appropriate intermediate character-set
is chosen automatically, if necessary. Synonym: XLATE. Example:
CONVERT lasagna.txt latin1 utf8 lasagna-utf8.txt
Multiple files can be translated if file2 is a directory or device name,
rather than a filename, or if file2 is omitted.
-------------------------------------------------------------------------------
- Nelson H. F. Beebe                    Tel: +1 801 581 5254                  -
- University of Utah                                                          -
- Department of Mathematics, 110 LCB    Internet e-mail: beebe@math.utah.edu  -
- 155 S 1400 E RM 233                       beebe@acm.org  beebe@computer.org -
- Salt Lake City, UT 84112-0090, USA    URL: http://www.math.utah.edu/~beebe/ -
-------------------------------------------------------------------------------
Clem Cole mentions kermit in connection with the question raised about
the uses of the cu utility.
As an FYI, Kermit's author, Frank da Cruz, is preparing a final
release, version 10.0, and I've been working with him on testing
builds in numerous environments. There are frequent updates during
this work, and the latest snapshots can be found at
https://kermitproject.org/ftp/kermit/pretest/
The x-YYYYMMDD.* bundles do not contain a leading directory, so be
careful to unpack them in an empty directory. The build relies on a
lengthy makefile with platform-specific target names, like irix65,
linux, and solaris11: the leading comments in the makefile provide
further guidance.
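A typical sequence on a GNU/Linux box might look like the following; the
snapshot filename is a placeholder (check the pretest directory listing
for the real name):

    mkdir kermit-pretest && cd kermit-pretest  # empty dir: no leading directory in the bundle
    tar xf ../x-YYYYMMDD.tar.gz                # placeholder name: substitute the actual snapshot
    head -60 makefile                          # the leading comments describe the platform targets
    make linux                                 # or solaris11, irix65, ... as appropriate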
-------------------------------------------------------------------------------
- Nelson H. F. Beebe                    Tel: +1 801 581 5254                  -
- University of Utah                                                          -
- Department of Mathematics, 110 LCB    Internet e-mail: beebe@math.utah.edu  -
- 155 S 1400 E RM 233                       beebe@acm.org  beebe@computer.org -
- Salt Lake City, UT 84112-0090, USA    URL: http://www.math.utah.edu/~beebe/ -
-------------------------------------------------------------------------------
Good day all! This may be one for COFF instead, but I'm not joined over there yet, so it might need some Warren help/approval.
In any case, I received that 3B20S 4.1 manual in good shape. When I unpacked it, out fell a little tri-fold titled "The Office Automation System (OAS) Editor-Formatter (ef) Reference Card", emblazoned with the usual Bell Laboratories non-disclosure note about the Bell System and a nice little picture of a terminal I can't identify, with the full manual for this OAS leaning against it: "The Office Automation System (OAS)", with a nice big Bell logo at the bottom of the spine.
The latter is likely a manual I spotted in a video once and couldn't make out the name/title of at the time, when I thought I was seeing another long-lost UNIX manual. I've never heard of this system before, and Google isn't turning up much, as "Office Automation System" appears to be a general industry term.
I seem to recall hearing about ef itself once or twice, some sort of pre-vi screen editor from Bell, methinks? I'm not super familiar with it, though; I just seem to recall reading about it somewhere before.
Anywho, I'm dealing with a move in the near future, hopefully into a home I own, so I'm pretty distracted from that scanning I keep talking about; but when I'm settled in a few months from now, I can hopefully set up a proper scan bench in my new place and really go to town on things.
- Matt G.
Exciting development in the process of finding lost documentation; I just sealed this one on eBay: https://www.ebay.com/itm/385266550881?mkcid=16&mkevt=1&mkrid=711-127632-235…
At that link is a (now closed) auction for a Western Electric 3B20S UNIX User's Manual, Release 4.1, something I thought I'd never see and wasn't sure actually existed: print manuals for 4.x.
Once it's received I'll be curious to see what differences are obvious between this and the 3.0 manual, and it should be easy to scan given the comb binding. What a nice cover, too! I had always expected that if a 4.x manual of some kind popped up it would feature the falling-blocks motif of the two starter-package sets of technical reports, but the picture of a 3B20S is nice. How auspicious, given the recent discussion of the 3B series. I'm particularly curious to see what makes it specifically a 3B20S manual: whether that means it only includes commands relevant to that machine, or omits any commands/info specific to DEC machines.
Either way, exciting times. This is one of those things whose very existence I had set out to verify when I first started really studying the history of UNIX documentation, so it's vindicating to have found one floating around out there in the wild. Between this and the 4.0 docs, we should now have a much clearer picture of that gulf between III and V.
More to come once I receive it!
- Matt G.
I finally got myself a decent scanner, and have scanned my most prized
relic from my summer at Bell Labs - Draft 1 of Kernighan and Ritchie's "The
C Programming Language".
It's early enough that there are no tables of contents or index; of
particular note is that "chapter 8" is a "C Reference Manual" by Dennis
dated May 1, 1977.
This dates from approx July 1977; it has my name on the cover and various
scribbles pointing out typos throughout.
Enjoy!
https://drive.google.com/drive/folders/1OvgKikM8vpZGxNzCjt4BM1ggBX0dlr-y?us…
p.s. I used a Fujitsu FI-8170 scanner, VueScan on Ubuntu, and pdftk-java
to merge front and back pages.
(Recently I mentioned to Doug McIlroy that I had infiltrated IBM East
Fishkill, reputedly one of the largest semiconductor fabs in the world,
with UNIX back in the 1980s. He suggested that I write it up and share it
here, so here it is.)
In 1986 I was working at IBM Research in Yorktown Heights, New York. I had
rejoined in 1984 after completing my PhD in computer science at CMU.
One day I got a phone call from Rick Dill. Rick, a distinguished physicist
who had, among other things, invented a technique that was key to
economically fabricating semiconductor lasers, had been my first boss at
IBM Research. While I’d been in Pittsburgh he had taken an assignment at
IBM’s big semiconductor fab up in East Fishkill. He was working to make
production processes there more efficient. He was about to initiate a
major project, with a large capital cost, that involved deploying a bunch
of computers, and he wanted a certified computer scientist at the project
review. He invited me to drive up to Fishkill, about half an hour north of
the research lab, to attend a meeting. I agreed.
At the meeting I learned several things. First of all, the chipmaking
process involved many steps - perhaps fifty or sixty for each wafer full of
chips. The processing steps individually were expensive, and the amount
spent on each wafer was substantial. Because processing was imperfect, it
was imperative to check the results every few steps to make sure everything
was OK. Each wafer included a number of test articles, landing points for
test probes, scattered around the surface. Measurements of these test
articles were carried out on a special piece of equipment, I think bought
from Fairchild Semiconductor. It would take in a boat of wafers (identical
wafers were grouped together on special ceramic holders called boats, for
automatic handling, and all processed identically), feed each wafer to
the test station, and probe each test article in turn. The result was
about a megabyte of data covering all of the wafers in the boat.
At this point the data had to be analyzed. The analysis program comprised
an interpreter called TAHOE along with a test program, one for each
different wafer being fabricated. The results indicated whether the wafers
in the boat were good, needed some rework, or had to be discarded.
These were the days before local area networking at IBM, so getting the
data from the test machine to the mainframe for analysis involved numerous
manual steps and took about six hours. To improve quality control, each
boat of wafers was only worked during a single eight-hour shift, so getting
the test results generally meant a 24-hour pause in the processing of the
boat, even though the analysis only took a couple of seconds of time on the
mainframe.
IBM had recently released a physically small mainframe based on customized
CPU chips from Motorola. This machine, the size of a large suitcase and
priced at about a million dollars, was suitable for locating next to each test
machine, thus eliminating the six-hour wait to see results.
Because there were something like 50 of the big test machines at the
Fishkill site, the project represented a major capital expenditure. Getting
funding of this size approved would take six to twelve months, and this
meeting was the first step in seeking this approval.
At the end of the meeting I asked for a copy of the manual for the TAHOE
test language. Someone gave me a copy and I took it home over the weekend
and read through it.
The following Monday I called Rick up and told him that I thought I could
implement an interpreter for the TAHOE language in about a month of work.
That was a tiny enough investment that Rick simply wrote a letter to Ralph
Gomory, then head of IBM Research, to requisition me for a month. I told
the Fishkill folks that I needed a UNIX machine to do this work and they
procured an RT PC running AIX 1. AIX 1 was based on System V. The
critical thing to me was that it had lex, yacc, vi, and make.
They set me up in an empty lab room with the machine and a work table.
Relatively quickly I built a lexical analyzer for the language in lex and
got an approximation to the grammar for the TAHOE language working in
yacc. The rest was implementing the functions for each of the TAHOE
primitives.
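To make the shape of that work concrete, the division of labor looks
roughly like the toy yacc specification below. The real TAHOE grammar is
not public, so the syntax here is invented purely for illustration; only
the structure (a yacc grammar fed by a scanner, with one C function per
primitive) mirrors the approach described. For brevity the sketch
hand-rolls its yylex() instead of generating it with lex:

    /* toy.y -- build with: yacc toy.y && cc y.tab.c -o toy */
    %{
    #include <ctype.h>
    #include <stdio.h>
    static int yylex(void);
    static void yyerror(const char *s) { fprintf(stderr, "%s\n", s); }
    /* One C function per interpreter primitive; this one is invented. */
    static double do_measure(double site) { return site * 2.0; }
    %}
    %union { double num; }
    %token <num> NUMBER
    %token MEASURE
    %type <num> stmt
    %%
    session : /* empty */
            | session stmt '\n'      { printf("=> %g\n", $2); }
            ;
    stmt    : MEASURE NUMBER         { $$ = do_measure($2); }
            ;
    %%
    static int yylex(void)           /* stand-in for a lex-built scanner */
    {
        int c;
        while ((c = getchar()) == ' ' || c == '\t')
            ;
        if (c == EOF)
            return 0;
        if (isdigit(c)) {
            ungetc(c, stdin);
            return scanf("%lf", &yylval.num) == 1 ? NUMBER : 0;
        }
        if (isalpha(c)) {            /* every word is the one keyword here */
            while (isalpha(c = getchar()))
                ;
            ungetc(c, stdin);
            return MEASURE;
        }
        return c;                    /* '\n' and anything else */
    }
    int main(void) { return yyparse(); }

Echoing "measure 3" at the resulting program prints "=> 6".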
I adopted rigorous test automation early, a practice people now call
test-driven development. Each time I added a capability to the interpreter I
wrote a scrap of TAHOE code to test it along with a piece of reference
input. I created a test target in the testing Makefile that ran the
interpreter with the test program and the reference input. There were four
directories, one for test scripts, one for input data, one for expected
outputs, and one for actual outputs. There was a big Makefile that had a
target for each test. Running all of the tests was simply a matter of
typing ‘make test’ in the root of the testing tree. Only if all of the
tests succeeded would I consider a build acceptable.
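The layout just described might be sketched like this; this is a
reconstruction of mine, with every file and directory name invented (and
recipe lines tab-indented, as make requires):

    # Makefile -- one target per test; `make test' runs the whole suite.
    MONO  = ./mono                  # the interpreter under test
    TESTS = t001 t002 t003          # one entry per test scrap

    test: $(TESTS)
    	@echo all tests passed

    $(TESTS):
    	$(MONO) tests/$@.tahoe < input/$@.dat > actual/$@.out
    	cmp expected/$@.out actual/$@.out   # any mismatch stops make

    .PHONY: test $(TESTS)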
As I developed the interpreter I also learned to build tests for bugs as
I encountered them. I discovered that I would occasionally reintroduce
bugs, and these tests, with the same structure (test scrap, input data,
reference output, make target), were very useful at catching backsliding
before it got away from me.
After a while I had implemented the entire TAHOE language. I named the
interpreter MONO after looking at the maps of the area near Lake Tahoe and
seeing Mono Lake, a small lake nearby.
[Image: Lake Tahoe and Mono Lake, with walking routes between them. Source: Google Maps]
At this point I asked my handler at Fishkill for a set of real input data,
a real test program, and a real set of output data. He got me the files
and I set to work.
The only tricky bit at this stage was the difference in floating point
between the RT PC, which used the recently adopted IEEE 754
floating-point standard, and the idiosyncratic floating point implemented
in the System/370 mainframes of the time. The problem was that the LSB
rounding rules differed between the two machines, resulting in mismatches
in results. These mismatches were way below the resolution of the actual
data, but deciding how to handle them was tricky.
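One plausible way to handle it, and this is my guess at a resolution
rather than a record of what was actually done, is to compare the two
outputs within a small relative tolerance instead of byte for byte:

    /* Compare results from the two machines, absorbing LSB rounding
     * differences between IEEE 754 and System/370 hex floating point.
     * Compile with: cc cmp.c -o cmp -lm */
    #include <math.h>
    #include <stdio.h>

    static int close_enough(double a, double b, double tol)
    {
        double scale = fmax(fabs(a), fabs(b));
        if (scale == 0.0)
            return 1;                    /* both exactly zero */
        return fabs(a - b) <= tol * scale;
    }

    int main(void)
    {
        /* The same measurement after each machine's rounding (made up). */
        double ieee = 1.0000001, s370 = 1.0000002;
        printf("%s\n", close_enough(ieee, s370, 1e-6) ? "match" : "differ");
        return 0;
    }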
At this point I had an interpreter, MONO, for the TAHOE language that took
one specific TAHOE program, some real data, and produced output that
matched the TAHOE output. Almost done.
I asked my handler, a lovely guy whose name I am ashamed I do not remember,
to get me the regression test suite for TAHOE. He took me over and
introduced me to the woman who managed the team that was developing and
maintaining the TAHOE interpreter. The TAHOE interpreter had been under
development, I gathered, for about 25 years and was a large amount of
assembler code. I asked her for the regression test suite for the TAHOE
interpreter. She did not recognize the term, but I was not dismayed - IBM
had their own names for everything (disk was DASD and a boot program was
IPL) and I figured it would be Polka Dot or something equally evocative. I
described what my regression test suite did and her face lit up. “What a
great idea!” she exclaimed.
Anyway, at that point I handed the interpreter code over to the Fishkill
organization. C compilers were available for the PC by that time, so they
were able to deploy it on PC-AT machines that they located at each testing
machine. Since a PC-AT could be had for about $5,000 in those days, the
savings from the original proposal were about $50 million and about a year
of elapsed time. The analysis of a boat’s worth of data on the PC-AT took
perhaps a minute or two, so quite a bit slower than on the mainframe, but
the elimination of the six-hour delay meant that a boat could progress
forward in its processing on the same day rather than a day later.
One of my final conversations with my Fishkill handler was about getting
them some UNIX training. In those days the only way to get UNIX training
was from AT&T. Doing business with AT&T at IBM in those days involved very
high-level approvals - I think it required either the CEO or a direct
report to the CEO. He showed me the form he needed to get approved in
order to take this course, priced at about $1,500 at the time. It required
twelve signatures. When I expressed horror he noted that I shouldn’t worry
because the first six were based in the building we were standing in.
That’s when I began to grasp how big IBM was in those days.
Anyway, about five years later I left IBM. Just before I resigned the
Fishkill folks invited me up to attend a celebratory dinner. Awards were
given to many people involved in the project, including me. I learned that
there was now a department of more than 30 people dedicated to maintaining
the program that had taken me a month to build. Rick Dill noted that one
of the side benefits of the approach that I had taken was the production of
a formal grammar for the TAHOE language.
At one point near the end of the project I had a long reflective
conversation with my Fishkill minder. He spun a metaphor about what I had
done with this project. Roughly speaking, he said, “We were a bunch of
guys cutting down trees by beating on them with stones. We heard that
there was this thing called an axe, and someone sent a guy we thought would
show us how to cut down trees with an axe. Imagine our surprise when he
whipped out a chainsaw.”
=====
nygeek.net
mindthegapdialogs.com/home <https://www.mindthegapdialogs.com/home>
All, thank you all for all the congratulations! I was going to pen an e-mail
to the list last night but, after a few celebratory glasses of wine, I demurred.
It still feels weird that Usenix chose me for the Flame award, given
that such greats as Doug, Margo, Radia, and others have previously
received the same award. In reality, the award belongs to every TUHS
member who has contributed documents, source code, tape images,
anecdotes, knowledge and wisdom, and who has given their time and
energy to help others with problems. I've been a steward of a remarkable
community over three decades, and I feel honoured and humbled to receive
recognition for it.
Casey told me the names of the people who nominated me. Thank you for
putting my name forward. Getting the e-mail from Casey sure was a surprise :-)
https://www.tuhs.org/Images/flame.jpg
Many thanks for all your support over the years!
Warren