Moving to COFF as this is less UNIX and more computer architecture and
design style...
On Tue, Nov 30, 2021 at 3:07 AM <pbirkel(a)gmail.com> wrote:
> Given a random logic design it's efficient to organize the ISA encoding
> to maximize its regularity.
> Probably also of some benefit to compilers in a memory-constrained
> environment?
>
To be honest, I think that the regularity of the instruction set is less
for the logic and more for the compiler. Around the time the 11 was being
created, Bill Wulf (with Gordon as co-author, I think) wrote a paper about
how instruction-set design - its regularity and lack of special cases - made
it easier to write a code optimizer. Remember that a couple of former Bell
and Wulf students were heavily involved in the 11 (Strecker being the main
one I can think of off the top of my head).
Also remember that Gordon and the CMU types of those days were beginning to
create what we now call Hardware Description Languages (HDLs). Gordon
describes in "Bell and Newell" (the definitive Computer Structures book of
the 1970s) his Processor-Memory-Switch (PMS) diagrams. The original 11
(which would become the 11/20) was first described as a set of PMS
diagrams. PMS, of course, begat the Instruction Set Processor Language
(ISPL) that Mario created a couple of years later. While ISPL came after
the 11 had been designed, ISPL could synthesize a system using PDP-16 RTM
modules. A later version, from our old friend from UNIX land Ted Kowalski
[his PhD thesis, actually], could spit out TTL from the later ISPS
simulator and compiler [the S being for simulation]. ISPS would beget VHDL,
which begat today's Verilog/SystemVerilog.
IIRC, there was a lecture Gordon gave us WRT microcode *vs.* direct
logic. He offered that microcode had the advantage that you could more
easily update things in the field, but he also felt that if you could catch
the errors before releasing the HW to the world, and could then directly
synthesize the logic, that would be even better - no errors, no need to
update. That said, by the 11/40 DEC had started to microcode the 11s,
although as you point out the 11/34 and later 11/44 were more direct
logic than the 11/40 - and of course Wulf would create the 11/40e, which
had a writable control store, so they could add some instructions and
eventually build C.mmp.
Over to COFF...
On 2021-11-23 02:57, Henry Bent wrote:
> On Mon, 22 Nov 2021 at 21:31, Mary Ann Horton <mah(a)mhorton.net> wrote:
>
> PL/I was my favorite mainframe programming language my last two years
> as an undergrad. I liked how it incorporated ideas from FORTRAN, ALGOL,
> and COBOL. My student job was to enhance a PL/I package for a History
> professor.
>
> What language were the PL/I compilers written in?
From AFIPS '69 (Fall): "The Multics compiler is the only PL/1 compiler
written in PL/1 [...]"
HOPL I has a talk on the early history of PL/1 (born as NPL) but nothing
on the question.
N.
>
> Wikipedia claims that IBM is still developing a PL/I compiler, which I
> suppose I have no reason to disbelieve, but I'm very curious as to who
> is using it and for what purpose.
>
> -Henry
Moving to COFF, where this probably belongs, because it's less UNIX and more
PL-oriented.
On Tue, Nov 23, 2021 at 3:00 AM Henry Bent <henry.r.bent(a)gmail.com> wrote:
> What language were the PL/I compilers written in?
>
I don't know about anyone else, but the VAX PL/1 front-end was bought by
DEC from Freiburghouse (??sp??) in Framingham, MA. It was written in PL/1
on a Multics system. The front-end was the same one that Pr1me used,
although Pr1me also bought their Fortran from them, which DEC did not.
[FWIW: the DEC/Intel Fortran front-end was written in Pascal -- and still
was the last time I talked to the compiler folks.]
I do not know what the Freiburghouse folks used for a compiler-compiler
(Steve or Doug might know), but >>I think<< they might not have used one.
Cutler famously led the new backend for it and had to shuttle tapes from
MIT to ZKO in Nashua during the development. The backend was written in a
combination of PL/1, BLISS-32, and assembler. Once the compiler could
self-host, everything moved to ZKO.
That compiler originally targeted VMS, but was moved to Unix/VAX at one
point as someone else pointed out.
When the new GEM compilers were created about 10-15 years later, I was under
the impression that the original Freiburghouse/Cutler hacked front-end was
reworked to use the GEM backend system, as GEM used BLISS, plus C for the
runtimes and a small amount of assembler as needed for each ISA [and I
believe it continues to be the same from the VSI folks today]. GEM-based
PL/1 was released on Alpha when I was still at DEC, and I believe that it
was released for Itanium a few years later [by Intel under contract to
Compaq/HP]. VSI has built a GEM-based Intel*64 backend and is releasing/has
released VMS for same using it; I would suspect they moved PL/1 over also
[their target customer is the traditional DEC VMS customer that still has
active applications and wants to run them on modern HW]. I'll have to ask
one of my former coworkers, who at one point was, and I think still is, the
main compiler guy at VSI / resident GEM expert.
> Wikipedia claims that IBM is still developing a PL/I compiler, which I
> suppose I have no reason to disbelieve, but I'm very curious as to who is
> using it and for what purpose.
>
As best I can tell, commercial sites still use it for traditional code,
just like Cobol. It's interesting: Intel does neither, but we spend a ton
of money on Fortran because so much development (both old and new) in the
scientific community requires it. I answered why elsewhere in more detail:
"Where is Fortran used these days"
<https://www.quora.com/Where-is-Fortran-used-these-days/answers/87679712>
and "Is Fortran still alive"
<https://www.quora.com/Is-Fortran-still-alive/answer/Clem-Cole>
My >>guess<< is that PL/1 is suffering the same fate as Cobol, fading
because the apps are being/have been slowly rewritten from custom code to
COTS solutions from folks like Oracle, SAS, BAAN and the like. Not so for
Fortran, and the reason is that the math has not changed. The core of these
codes is the same as it was in the 1960s/70s when they were written. A
friend of mine used to be the Chief Metallurgist for the US Gov at NIST,
and as Dr. Fek put it so well: *"I have over 60 years' worth of data that
we have classified and we understand what it is telling us. If you
magically gave me new code to do the same thing as what we do with our
processes that we have developed over the years, I would have to reclassify
all that data. It's just not economically interesting."* I personally
equate it to the QWERTY keyboard: it's just not going to change. *i.e.*
*"Simple economics always beats sophisticated architecture."*
[-TUHS, +COFF]
On Tue, Nov 23, 2021 at 3:00 AM Henry Bent <henry.r.bent(a)gmail.com> wrote:
> On Mon, 22 Nov 2021 at 21:31, Mary Ann Horton <mah(a)mhorton.net> wrote:
>
>> PL/I was my favorite mainframe programming language my last two years as
>> an undergrad. I liked how it incorporated ideas from FORTRAN, ALGOL, and
>> COBOL. My student job was to enhance a PL/I package for a History
>> professor.
>>
>
> What language were the PL/I compilers written in?
>
The only PL/I compiler I have access to is, somewhat ironically, the
Multics PL/1 compiler. It is largely self-hosting; more details can be
found here: https://multicians.org/pl1.html (Note Doug's name appears
prominently.)
> Wikipedia claims that IBM is still developing a PL/I compiler, which I
> suppose I have no reason to disbelieve, but I'm very curious as to who is
> using it and for what purpose.
>
I imagine most of it is legacy code in a mainframe environment, much as
with COBOL. I can't imagine that many folks are considering new development
in PL/1 other than in retro/hobbyist environments and some mainframe shops
where there's a heavy existing PL/I investment.
- Dan C.
I recently had a discussion with some colleagues on the topic of
shells. Two people whom I respect both told me that Microsoft's
PowerShell runs rings round the Bourne shell.
Somehow that sounds like anathema to me, but it's not beyond the
bounds of possibility. Before I waste time investigating, can anybody
here give me some insights?
Greg
--
Sent from my desktop computer.
Finger grog(a)lemis.com for PGP public key.
See complete headers for address and phone numbers.
This message is digitally signed. If your Microsoft mail program
reports problems, please read http://lemis.com/broken-MUA.php
On Wed, Nov 17, 2021 at 3:24 PM Rob Pike <robpike(a)gmail.com> wrote:
> Perl certainly had its detractors, but for a few years there it was the
> lingua franca of system administration.
>
It's still what I reach for first when I need to write a state machine that
processes a file made up of lines with some--or some set of--structures.
The integration of regexps is far, far, far superior to what Python can do,
and I adore the while(<>) construct. Maintaining other people's Perl
usually sucks, but it's a very easy way to solve your own little problems.
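Here's a rough sketch of the sort of thing I mean - a tiny line-oriented
state machine using while(<>) - with the BEGIN/END markers and the
key=value layout made up purely for illustration:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Toy state machine: collect "name = value" lines that appear
    # between BEGIN and END markers, and ignore everything else.
    my $in_block = 0;
    my %settings;

    while (<>) {                  # reads STDIN or every file on the command line
        chomp;
        if (/^BEGIN\b/) { $in_block = 1; next; }
        if (/^END\b/)   { $in_block = 0; next; }
        next unless $in_block;
        if (/^(\w+)\s*=\s*(.*)$/) {    # capture key and value in one regexp
            $settings{$1} = $2;
        }
    }

    print "$_ => $settings{$_}\n" for sort keys %settings;

The point is how little ceremony there is: the loop, the implicit $_, and
the regexp captures are all you need for most one-off file munging.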
Adam
On 2021-11-16 09:57, Douglas McIlroy wrote:
> The following remark stirred old memories. Apologies for straying off
> the path of TUHS.
>
>> I have gotten the impression that [PL/I] was a language that was beloved by no one.
> As I was a designer of PL/I, an implementer of EPL (the preliminary
> PL/I compiler used to build Multics), and author of the first PL/I
> program to appear in the ACM Collected Algorithms, it's a bit hard to
> admit that PL/I was "insignificant". I'm proud, though, of having
> conceived the SIGNAL statement, which pioneered exception handling,
> and the USES and SETS attributes, which unfortunately sank into
> oblivion. I also spurred Bud Lawson to invent -> for pointer-chasing.
> The former notation C(B(A)) became A->B->C. This was PL/I's gift to C.
>
> After the ACM program I never wrote another line of PL/I.
> Gratification finally came forty years on when I met a retired
> programmer who, unaware of my PL/I connection, volunteered that she
> had loved PL/I above all other programming languages.
My first language was actually PL/C (and the computer centre did not
charge for runs in PL/C). I needed to use PL/I for some thesis-related
work and ran into the JCL wall -- no issues with the former, many issues
with the latter. One of the support people, upon learning that I was
using PL/I, said: "PL/I's alright!"
N.
>
> Doug
Moving to COFF ...
On Tue, Nov 16, 2021 at 10:50 AM Adam Thornton <athornton(a)gmail.com> wrote:
> I'm not even sure how much of this you can lay at the feet of teachers: I
> would argue that we see a huge efflorescence of essentially self-taught
> programming cobbled together from (in the old days) the system manuals a
>
Ouch ... this is exactly my point. In my experience over ~55 years of
programming, with more than 45 of those being paid to do it, the best
programmers I know and have worked with were taught/mentored by a master --
not self-taught. As I said, I had to be re-educated once I got to CMU.
My Dad had done the best he knew, but much of what he taught me was
shortcuts and tricks, because that is what he knew 🠪 he taught me syntax,
not how to think. I know a lot of programmers (like myself) who were
self-taught or introduced to computing by novices to start with, and that
experience got them excited, but all of them had real teachers/mentors who
taught them the true art form and helped them unlearn a lot of crap that
they had picked up or misinterpreted.
Looking at my father as a teacher, he really had never been taught to think
like a programmer. In the late 1950s he was a 'computer' [see the movie
"Hidden Figures"]. He was taught FORTRAN and BASIC and told to implement
things he had been doing by hand (solving differential equations using
linear algebra). The ideas we now know and love about structured
programming and *how to do this well* were still being invented by folks
like Doug and his sisters and brothers in the research community. It's no
surprise that my Dad taught me to 'hack', because he and I had nothing to
compare it to. BTW: this is not to say that all HS computer teachers are
bad, but the problem is that people who are really good at programming are
quite rare, and they tend to end up in research or industry -- not
teaching HS. Today, the typical HS computer teacher (like one of my
nieces) takes a course or two at UMASS in the teachers' college. They are
never taught to program, nor do they take the same courses the kids in
science and engineering take 🠪 BTW, I also think this is why we see so much
of the popular press talking about 'coding', not programming. They really
think learning to program is learning the syntax of a specific programming
language.
When I look at the young people I hire (and mentor) today, it's not any
different. BTW: Jon and I had a little bit of a disagreement when he
wrote his book. He uses JavaScript for a lot of his examples - because of
exactly what you point out 🠪 JavaScript today, like BASIC before it, has a
very high "on-screen results" factor with little work by the user. Much
is being done behind the covers to make that magic happen. I tend to
believe that creates a false sense of knowledge/understanding.
To Jon's credit, he tries to bridge that in his book. As I said, I
thought I knew a lot about computers until I got to CMU. Boy, was I in
for an education. That said, I was lucky to be around some very smart
people who helped steer me.
Clem
From TUHS (to Doug McIlroy):
"Curious what you think of APL"
I'm sure what Doug thinks of APL is unprintable. Unless, of course, he has
the special type ball.
<rimshot>
On Tue, Nov 16, 2021 at 8:23 AM Richard Salz <rich.salz(a)gmail.com> wrote:
>
> The former notation C(B(A)) became A->B->C. This was PL/I's gift to C.
>>
>
> You seem to have a gift for notation. That's rare. Curious what you think
> of APL?
>
Hi,
Will someone please explain the history and usage of gpasswd / newgrp / sg?
I've run across them again recently while reading old Unix texts. I've
been aware of them for years, but I've never actually used them for
anything beyond kicking the tires. So I figured that I'd inquire of the
hive mind that is TUHS / COFF.
When was the concept of group passwords introduced?
What was the problem that group passwords were the solution for?
How common was the use of group passwords?
I ran into one comment indicating that they used newgrp to work around a
limitation on the number of (secondary) groups in one NFS implementation.
Specifically, the NFS implementation they were using didn't support more
than 16 groups, so they would switch their primary group to work around
this limit.
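For what it's worth, my tire-kicking looked roughly like the following
(the group names are made up, and details vary by system):

    $ id -gn                    # show the current primary group
    staff
    $ groups                    # all of my groups, primary first
    staff wheel projects
    $ newgrp projects           # start a new shell with 'projects' as primary group
    Password:                   # only prompted if the group has a password set
                                # and I'm not already listed as a member
    $ id -gn
    projects
    $ sg projects -c 'make install'    # or run just one command under that group
    $ exit                      # back to the original shell and primary group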
Does anyone have any interesting stories related to group passwords /
gpasswd / newgrp / sg?
--
Grant. . . .
unix || die