> But a note on Dijkstra's algorithm: Moore and Dijkstra both published
> in 1959.
I was off by one on the year, but the sign of the error is debatable.
Moore's paper was presented at a conference held in early April, 1957,
the proceedings of which were not issued until 1959. I learned about it
from Moore when I first met him, in 1958. He described the
algorithm then in vivid, instantly understandable terms: imagine a flood
spreading at uniform speed through the network and record the
distance to nodes in order of wetting.
> But it is documented that Dijkstra's algorithm was invented and used
> by him in 1956.
Taking into account the lead time for conference submissions, one
can confidently say that Moore devised the algorithm before 1957.
I do not know, though, when it first ran on a Bell Labs computer.
That said, Moore's paper, which presented the algorithm essentially
by example, was not nearly as clear as the capsule summary he gave
me. It seems amateurish by comparison with Dijkstra's elegant treatment.
Dijkstra's name has been attached to the method with good reason.
Doug
>> pipe, ch(e)root.... Any more unix connections to smoking?
I have a slide that's a quadrilingual pun (French, English, Art, shell)
in which Magritte's painting of a pipe with the words "Ceci n'est pas
une pipe" has been altered to read "Ceci n'est pas une |". The
altered phrase was borrowed from Jay Misra et al., who used it as
an example of a message in a paper on communicating processes.
Many years ago (when the dinosaurs were using V6), I had a crazy idea[*]
that a write(fd, NULL, 0) would somehow signal EOF to the reader, i.e. a
subsequent read would wait for further data instead of ENOTOBACCO.
Did any *nix ever implement that? I have no idea how it would be done.
[*]
I was full of crazy ideas then, such as extending stty() to arbitrary
devices, and was told by the anti-CSU mob that it was a stupid idea...
--
Dave Horsfall DTM (VK2KFU) "Those who don't understand security will suffer."
So many memories. The "ultimate machine" (which was brought out and
demonstrated from time to time while I was at the Labs) was built in
collaboration with Ed Moore (he of Moore-model automata, who published
"Dijkstra's algorithm" for shortest paths a year before Dijkstra) and
(I believe) Dave Hagelbarger. Moore endowed the machine with a longevity
property seldom remarked on: majority logic so that any electrical
component can be removed without harming its observable behavior.
Shannon moved to MIT from Bell Labs some weeks before I moved the
other way, so I only met him much later when he visited the Unix room
(an excuse, albeit weak, for this distant detour from TUHS). By that
time Shannon was descending into Alzheimer's fog, but his wife who
accompanied him was a memorably curious and perceptive visitor. I have
wondered what role she may have played as a sounding board or more in
Shannon's research.
As a child, I used to ski on the 50-foot hill that was the lawn of the
mansion that Shannon would buy when he moved to Massachusetts. We kids
would ski down and climb back up. Not Shannon. He installed a chairlift.
One house separated mine from the ski hill. It belonged to John Trump,
another MIT prof who engineered the Van de Graaff generator into a
commercial product for generating million-volt x-rays and, yes, was uncle
of the Donald. John, as kind as he was bright, fortunately did not live
to see the apotheosis of his wayward nephew.
Doug
> We lost Claude Shannon on this day in 2001. He was a mathematician,
> electrical engineer, and cryptographer; he is regarded as the "father" of
> information theory, and he pioneered digital circuit design. Amongst
> other things he built a barbed-wire telegraph, the "Ultimate Machine" (it
> reached up and switched itself off), a Roman numeral computer ("THROBAC"),
> the Minivac 601 (a digital trainer), a Rubik's Cube solver, a mechanical
> mouse that learned how to solve mazes, and outlined a chess program
> (pre-Belle). He formulated the security mantra "The enemy knows the
> system", and did top-secret work in WW-2 on crypto and fire-control
> systems.
Never heard of Claude Shannon. So a good opportunity to do some
searching and reading to 'catch up'.
Interesting person, and this quote tends to make him my type of guy:
"I just wondered how things were put together.
– C.E. Shannon"
http://themathpath.com/documents/ee376a_win0809_aboutClaudeShannon_willywu.…
Now wondering if I should register for this THROBAC project or just
read some more and do it. Not being in the mood for learning Python,
I'd probably do some fumbling in C.
https://www.engage-csedu.org/find-resources/shannons-throbac
Keeps me busy and amused,
uncle rubl
Just curious; am I the only one who, back in the early days of V6, used
pipes as temporary files? I mean that after calling pipe(), instead of
then forking and playing "file-descriptor footsie", you just read and
wrote within the same process.
I seem to recall that it worked, as long as you avoided the 8-block limit
(or whatever it was then); I have no idea why I tried it, other than to
see if it worked, i.e. avoiding the creat() (without the "e") etc.
--
Dave Horsfall DTM (VK2KFU) "Those who don't understand security will suffer."
On 2/20/18, Donald ODona <mutiny.mutiny(a)india.com> wrote:
> since '86 he was working on an operating system, named Mica, which failed.
>
> At 19 Feb 2018 18:13:59 +0000 (+00:00) from Paul Winalski
> <paul.winalski(a)gmail.com>:
>> Dave Cutler was in the VMS group only for VMS version 1. He rarely
>> stayed around for version 2 of anything. Hustvedt's and Lipman's
>> contributions to VMS were more extensive and longer-lasting than
>> Cutler's.
Cutler had already left the VMS OS group by the time I joined DEC in
February of 1980. After VMS he led the team developing PL/I and C
compilers for VMS. These shared a common back end called the VAX Code
Generator (VCG). The other VMS compilers at the time (Fortran,
Pascal, Cobol) had their own separate optimizers and code generators.
The VAX Ada compiler would also use VCG.
When version 1 of VAX PL/I and VAX C shipped, Cutler worked on
subsetting the VAX architecture so that a single chip implementation
could be done, and led the team that produced the MicroVAX I. The
MicroVAX architecture emulated expensive instructions such as packed
decimal. All of the later, single-chip VAXes used this architecture.
When the MicroVAX I shipped, Cutler devised a microkernel-based
real-time operating system for the VAX called VAXeln.
After VAXeln, Cutler led the team developing a RISC architecture
follow-on to the VAX called PRISM, and an operating system for it
called Mica. Mica had a VAXeln-like microkernel, and the plan was to
layer personality modules on top of that to implement VMS and
Unix-style ABIs and system calls.
Alpha was chosen instead of PRISM as the VAX successor, and that is
when Cutler left DEC for Microsoft. Windows NT has a lot of design
concepts and details previously seen in PRISM and VMS.
-Paul W.
At Rutgers Newark, we had a VMS system that had Whitesmiths' C on it. At one point, Whitesmiths decided to "fight piracy" by sending you a sticker you were supposed to stick on the front of your computer to show that you had a licensed copy. I suppose I might have been in trouble if the Whitesmiths police had come to my machine room. I was a bit miffed when one of the other employees actually stuck the thing to the machine.
Years later I was loosely affiliated with Unipress. I did some consulting for them when I was between jobs. I went out to dinner with their principal, a man named Mark Krieger. After a bit of conversation it occurred to me: "Didn't you get booed off the stage at the University of Delaware UNIX users group meeting?" He admitted he had; he was half of Whitesmiths with Paul Plauger. It then came back to me about Idris and the software stamps. I mentioned the stamps and he said he was gone by then, but that was his sign that Plauger had gone over the edge. I carefully peeled our sticker off the VAX and gave it to him the next time I saw him.
Let's see how much thread-drift I can generate this time...
Dick Hustvedt was born on this day in 1946; an architect of RSX-11 and
VMS, he also had a weird sense of humour which he demonstrated by
enshrining the "microfortnight" into VMS.
Sadly, we lost him in a car accident in 2008.
--
Dave Horsfall DTM (VK2KFU) "Those who don't understand security will suffer."
I've sent a couple of you private messages with some more details of why
I ask this, but I'll bring the larger question to debate here:
Have POSIX and LSB lost their usefulness/relevance? If so, we know ISVs
like Ansys are not going to go 'FOSS' and make their sources available
(ignore religious beliefs; it just is not their business model); how do
we get that level of precision to allow the part of the market that will
be 'binary only' to continue to create applications?
Seriously, please try to stay away from religion on this
question. Clearly, there are a large number of ISVs that have
traditionally used interface specifications. To me it started with
things like the old Cobol and Fortran standards for the languages. That
was not good enough once the systems diverged, and /usr/group and then
IEEE/ANSI/ISO did POSIX. Clearly, POSIX enabled Unix implementations
such as Linux to shine, although Linux does not doggedly follow
it. Apple was once POSIX-conformant, but I'd not think they worry too
much about it now. The Linux world created the LSB, but I see fewer and
fewer references to it.
I worry that without a real binary definition, it's darned hard (at least
in the higher end of the business where I live day-to-day) to get ISVs to
care.
What do you folks think?
Clem