> > p) Remove the '-a' option (the ASCII approximation output).
>
> I didn't even know this existed. Looking at what it spits out, I find
> myself wondering what good it is. Is this for Unix troff compatibility?
> For people who didn't even have glass TTYs and needed to imagine what
> the typeset output would look like?
Here's a classic use:
Since groff is not WYSIWYG, the experimental cycle is long:
edit - save - groff - view. In many cases that cycle can be
short-circuited by typing straight into groff -a.
doug
Call me old-fashioned, but I still think the papers in Volume 2
of the Seventh Edition manual are a good straightforward start.
There's a tutorial on troff, and papers introducing eqn, tbl,
and refer.
Norman Wilson
Toronto ON
Hi,
Can anyone point to an introduction to {t,r,g}roff / pic / tbl / etcetera?
I've respected them for years and with all the latest discussions about
them I'd like to try and learn something.
Any pointers would be appreciated.
--
Grant. . . .
unix || die
Steve Johnson:
I think of Alan Demers's comment: "There are two kinds of programming
languages, those that make it easy to write good programs, and those
that make it hard to write bad ones."
====
I'm (still) with Larry Flon on this one:
There does not now, nor will there ever, exist a programming language
in which it is the least bit hard to write bad programs.
-- SIGPLAN Notices, October 1975, p. 16.
There are certainly languages that make it easier to avoid
trivial mistakes, like buffer overruns and pointer botches,
but the sort of nonsense Kernighan and Plauger demonstrated
and discussed about the same time in The Elements of Programming
Style shows up in any language.
I'm afraid I see that nearly any time I look in source code.
To be fair, these days I rarely have the time to look at
someone else's source code unless it's buggy, but it is
nevertheless appalling what one finds in critical system
software like boot environments and authentication code.
There is no royal road to good programs. Programming
well takes discipline and skill and experience. Languages
like Pascal that prevent certain classes of sloppiness like
overrunning arrays and string buffers may be better for
teaching beginners, but only because that makes it easier
to focus on the real issues, such as how to structure a
program well and how to write clearly. I have yet to see
evidence that that actually happens.
Norman Wilson
Toronto ON
> From: Chris Torek
> termcap has entries for the number of NUL characters to insert after
> carriage return.
Actually, the stock terminal driver in V6 Unix (here:
http://minnie.tuhs.org/cgi-bin/utree.pl?file=V6/usr/sys/dmr/tty.c
if anyone wants to see the actual code; it's in ttyoutput()) had some pretty
complex code to do just the right amount of delay (in clock ticks) for a large
number of different motion control characters (TAB, etc., in addition to LF and
CR), and then used the system timer to delay that amount of real time after
sending such a character (see the very bottom of ttstart()).
E.g. for a NL, it used the fact that it 'knew' which column the print head was
in to calculate the exact return time.
Clever, but alas, it did this by sticking 'characters' in the buffered output
stream with the high bit set, and the delay required in the low 0177 bits,
which the output start routine interpreted; as the code drolly notes, "thus
(unfortunately) restricting the transmission path to 7 bits". Which was a real
PITA recently when I tried to download a binary file to an LSI-11 from a V6
running in Ersatz-11! I had to tweak the TTY driver to allow 8-bit output...
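For anyone who doesn't want to dig through tty.c, the scheme looks roughly
like this. This is a toy user-space sketch, not the real driver: the queue,
the helper names, and the delay formula are made up, and only the 0200/0177
encoding mirrors what the V6 code does.

    #include <stdio.h>

    #define DELAY_FLAG 0200   /* high bit marks a delay pseudo-character   */
    #define DELAY_MASK 0177   /* low 7 bits carry the delay in clock ticks */

    static int outq[64];      /* toy output queue */
    static int head, tail;

    static void put(int c) { outq[tail++] = c; }

    /* After a motion character, queue a delay based on the head position.
       The formula here is invented; the real driver computed per-terminal
       delays for NL, CR, TAB, and so on. */
    static void queue_motion(int c, int column)
    {
        put(c);
        if (c == '\n') {
            int ticks = column / 16 + 3;
            put(DELAY_FLAG | (ticks & DELAY_MASK));
        }
    }

    int main(void)
    {
        queue_motion('\n', 72);               /* newline from column 72 */
        while (head < tail) {
            int c = outq[head++];
            if (c & DELAY_FLAG)               /* not data: wait this long */
                printf("pause %d ticks\n", c & DELAY_MASK);
            else                              /* path is only 7 bits wide */
                printf("send %#o\n", c & 0177);
        }
        return 0;
    }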
Noel
Seek and ye shall find.
Ask and ye shall receive.
Brian Kernighan was kind enough to find for me everyone's favorite
Computing Science Technical Report, CSTR 100, "Why Pascal is Not
My Favorite Programming Language".
Attached is the file and his macros. This will not immediately
format using groff etc.; I hope to create a version that will, sometime
in the next few weeks.
In the meantime, Warren, please add to the archives when you are able.
Enjoy!
Arnold
Hi everybody. I'm new to this list as a side-effect of my question about
the provenance of strcmp and the convention of returning <0, 0, >0.
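(For anyone who hasn't seen it spelled out, the convention is usually
illustrated with the classic one-subtraction loop; this is a minimal sketch,
not the actual library code:

    /* The result is just the difference of the first pair of characters
       that disagree (or of the terminating NULs), so "less", "equal", and
       "greater" all fall out of a single subtraction. */
    int my_strcmp(const char *s1, const char *s2)
    {
        while (*s1 && *s1 == *s2) {
            s1++;
            s2++;
        }
        return (unsigned char)*s1 - (unsigned char)*s2;
    }
)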
I had to learn Pascal as a freshman in college which was challenging coming
from BTL knowing C. Kept wondering how Pascal could be used for anything
useful. The answer that I later saw in industry was "by adding non-standard
extensions".
Language discussions often turn to the issue of whether programming languages
should prevent programmers from making mistakes or whether that's the job of
the programmer. This is, of course, independent of discussing the
expressiveness of languages.
I agree that a lot of "programming" today consists of trusting and bolting
together random library functions. That's not me; I often work on safety
critical devices and I'm not going to rely on libraries of unknown provenance
when building a medical device that I may be hooked up to someday.
Years ago I inherited a project written in a hodgepodge of programming languages
including ruby. My first reaction to ruby was "Wow, how do I get some of
what they're smoking because it's better than anything I have?" I eventually
asked Ward Cunningham about it because he was working for ruby house AboutUs
at the time. His answer went something like this:
Jon, you're an engineer and you understand engineering.
You know programming and programmers and understand programming.
Then, there are the people with whom we entrust our confidential credit card data.
That's what ruby is for.
This nicely summarized the current state of affairs in which the most critical
tasks are assigned to the least competent people. I see this as a management,
business, and political problem which can't be solved by different languages.
I have often made the statement that "I would never hire someone who had to use
glibc in order to implement a singly-linked list." I get push-back such as "Oh,
and people can create bugs rather than using the debugged library?" to which I
glibly respond "debugged library like OpenSSL?"
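The kind of thing I mean fits on the back of an envelope; a minimal sketch,
purely illustrative and nothing to do with any particular library:

    #include <stdlib.h>

    /* A singly-linked list is this much code. */
    struct node {
        int          value;
        struct node *next;
    };

    /* Push a value onto the front of the list; returns the new head. */
    struct node *push(struct node *head, int value)
    {
        struct node *n = malloc(sizeof *n);
        if (n == NULL)
            return head;           /* out of memory: list unchanged */
        n->value = value;
        n->next  = head;
        return n;
    }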
I am more than a little terrified by the "everybody must learn to code in high
school movement". What they're learning is something at a level akin to the
ruby example above. The goal is clearly to make "coding" a minimum wage job
and to many the distinction between "coding" and engineering is lost. I've
spoken with many kids in the "future engineer" category who are frustrated at
the lack of depth in the curriculum. I'd summarize it as teaching people to
program without teaching them anything about computers.
Anyway, I have been volunteering to teach technology to kids for years as
karmic payback to my BTL explorer scout advisors Carl Christensen, Heinz
Lycklama, and Hans Lie. Not to mention all of the amazing people that I met
there when my dedication to hitchhiking up to the Labs after school and
talking people into signing me in turned into a series of summer jobs.
I'm in the process of turning my class notes into a book. The goal of the
book is to teach kids enough about computers that they can understand what
their code is actually doing and why one can write better code with an
understanding of the hardware and environment.
The book is in the editing phase so it's beyond wholesale changes. But I'm
curious as to what you all think should be in such a book as I'll find a way
to wedge in anything important that I missed.
Thanks,
Jon
And let's not forget Alan Perlis's:
"A language that doesn't affect the way you think about programming, is not worth knowing."
 https://en.m.wikiquote.org/wiki/Alan_Perlis
-------- Original message --------
From: Toby Thain
Date: 02/09/2017 18:00 (GMT+02:00)
To: Dan Cross
Cc: The Eunuchs Hysterical Society, quad
Subject: Re: [TUHS] Why Pascal is Not My Favorite Programming Language - Unearthed!
...
Finally, a favourite quote:
"Programs must be written for people to read, and only incidentally
for machines to execute" -- Hal Abelson
https://twitter.com/old_sound/status/903919515884544000
--Toby
>
> - Dan C.
>
> From: "Jeremy C. Reed"
> I don't know the key for "v" but maybe means "very distant host"
Yes. See:
http://mercury.lcs.mit.edu/~jnc/tech/jpg/ARPANet/L77Dec.jpg
and look at the MIT-44 IMP (upper right center). It's listed as having a
PDP-11, with the /v, and that machine (LL-ASG, 1/44) was definitely on a VDH
(it was not in Tech Sq). (A VDH was basically an IMP modem interface
hardware-wise, but made to look like a host at a high level within the IMP
software.)
> He also told me the Unix v6 Arpanet code was from San Diego.
Err, he may have gotten it from San Diego, but they didn't write the code; it
was written at UIll. See:
http://minnie.tuhs.org/cgi-bin/utree.pl?file=SRI-NOSC
which contains a copy of the code, which came to me via NOSC in SD.
Noel
I did hear back from Lou Katz - user #1. Indeed the first version that escaped the labs was the 4th edition.
Sent from my PDP-7 Running UNIX V0 expect things to be almost but not quite.
> On Sep 1, 2017, at 7:16 PM, Clem cole <clemc(a)ccc.com> wrote:
>
> Interesting. If O'Malley had a connection, I wonder what it was connected to on both sides. It had to be to LBL, but the VDH was a piece of shit even in the ingvax days. The first version was even worse. On the UCB side I wonder. It would not have been Unix, because UofI did the original Arpanet code and that was for V6. There was never a PDP-10 at UCB, so I wonder if it was one of the CDC machines, which were the primary systems until Unix came to be.
>
> Sent from my PDP-7 Running UNIX V0 expect things to be almost but not quite.
>
>>> On Sep 1, 2017, at 5:59 PM, Jeremy C. Reed <reed(a)reedmedia.net> wrote:
>>>
>>> On Fri, 1 Sep 2017, Clem Cole wrote:
>>>
>>> So it means that UCB was hacking privately without talking to Katz @ NYU, or
>>> the Columbia and Harvard folks for a while. I need to ask Lou what he
>>> remembers. UCB was not connected to the Arpanet at this point (Stanford
>>> was), so it's possible Ken's sabbatical opened up some channels that had
>>> not existed. [UCB does not get connected until ing70 gets the
>>> VDH interface up the hill to LBL's IMP as part of the INGRES project and
>>> that was very late in the 70s - not long before I arrived].
>>
>> Allman told me that Mike O'Malley had an ARPA connection at UCB that was
>> axed a few years before the INGRES link. So yes, I think no Arpanet
>> connection during the early BSD development work. (Losing this
>> connection may have had some controversy, but I don't know the details.)
>>
>> Fabry told me that O'Malley used Unix for his (EECS) Artificial
>> Intelligence research projects before he discovered it (so before the
>> October 1973 Symposium).
>>
>> RFC 402 of Oct 1972 has an ARPA network participant Michael O'Malley of
>> University of Michigan Phonetics Laboratory. Also this draft report at
>> http://digitalcollections.library.cmu.edu/awweb/awarchive?type=file&item=35…
>> about the ARPA speech recognition project lists M. H. O'Malley at UCB
>> and says the principal investigator from Univ. of Michigan moved to UCB.
>> (I never got ahold of him to see if he had any other relevance to my BSD
>> story.)