On Wed, Dec 4, 2024 at 10:18 PM John Levine <johnl(a)taugh.com> wrote:
> It appears that Marc Donner <marc.donner(a)gmail.com> said:
>> With the notion of pipes it became possible to operate on data quickly
>> and flexibly. There was nothing new from a fundamental capability point
>> of view, but the ease with which one could construct pipelines enabled
>> rapid experimentation and encouraged the development of pipe-able
>> components to add to the tool set.
> Pipes were invented at least three times that I'm aware of, but what
> made them work so well in Unix is that they looked to the program the
> same as a file, so any program could use them for input or output
> without special arrangements, and the shell made it easy to start two
> programs and pipe them together. Once you have coroutines and queues
> for passing data between them, a lot of things start to look like
> pipes.
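Indeed, and the mechanism is small enough to show whole. Here's a rough
sketch of my own (not anything from the original shell sources) of what
happens behind the scenes for something like `ls | wc -l`. Note that
neither child program does anything special: each just inherits an
ordinary file descriptor.

    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>
    #include <sys/wait.h>

    int main(void)
    {
        int fd[2];

        if (pipe(fd) == -1) {
            perror("pipe");
            return 1;
        }
        if (fork() == 0) {              /* writer side: ls */
            dup2(fd[1], STDOUT_FILENO); /* stdout now feeds the pipe */
            close(fd[0]);
            close(fd[1]);
            execlp("ls", "ls", (char *)NULL);
            perror("execlp ls");
            _exit(1);
        }
        if (fork() == 0) {              /* reader side: wc -l */
            dup2(fd[0], STDIN_FILENO);  /* stdin now drains the pipe */
            close(fd[0]);
            close(fd[1]);
            execlp("wc", "wc", "-l", (char *)NULL);
            perror("execlp wc");
            _exit(1);
        }
        close(fd[0]);                   /* parent keeps no ends open */
        close(fd[1]);
        while (wait(NULL) > 0)
            ;
        return 0;
    }

The pipeline comes out linear precisely because each process is handed
exactly one reader and one writer this way.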
> The Dartmouth Time-Sharing System in the late 1960s had communication
> files, which were essentially two-way pipes, but they were asymmetrical.
> One end, the slave end, looked like a file, but the other end, the
> master end, was different, and the program had to know it was a com
> file. They were mostly used to pass terminal I/O between user programs
> at the slave end and SIMON at the master end, the terminal monitor that
> talked to the front-end computer that ran the TTYs.
Doug has written at some length on this list about communication files
and their homomorphism to the way Plan 9 presented and handled
resources.
> They were invented again at IBM in the 1970s and described in this
> paper. I wrote them a letter, which they published, saying that Unix
> pipes did the same thing.
>
> https://dl.acm.org/doi/10.1147/sj.174.0383
Don't forget CMS pipelines, too!
Sadly, the Morrison paper cited above is not easily accessible, though
I acquired a copy from IEEE; perhaps a sucker's game, as it was not
cheap. Your subsequent letter, and part of Morrison's response to you,
however, are available gratis.
Reading through Morrison, one gets the impression that there are some
substantial differences from Unix pipelines, or rather, from the way
that Unix pipelines are usually used. In particular, he describes his
linked streams in terms of a "network", by which he appears to mean an
arbitrary directed graph. Crucially, he describes combining nodes for
merging data from multiple streams.
Unix pipelines, on the other hand, tend to be used in a manner that is
strictly linear, without the fan-out and fan-in capabilities described
by Morrison. Of course, nothing prevents one from building a
Morrison-style "network" from Unix processes and pipes, though it's
hard to see how that would work without something like `select`, which
didn't yet exist in 1978 (it arrived with 4.2BSD in 1983). Regardless,
Unix still doesn't expose a particularly convenient syntax for
expressing these sorts of constructions at the shell.
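To make the difference concrete, here is a rough sketch (mine, with
producer processes and tags invented purely for illustration) of the
sort of combining node Morrison describes, built from ordinary pipes
plus select(2): two upstream writers fanning in to one merged output.

    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>
    #include <sys/types.h>
    #include <sys/select.h>
    #include <sys/wait.h>

    /* One upstream node: writes a few tagged records, then exits. */
    static void producer(int wfd, const char *tag)
    {
        char buf[64];

        for (int i = 1; i <= 3; i++) {
            int n = snprintf(buf, sizeof buf, "%s: record %d\n", tag, i);
            write(wfd, buf, n);
        }
        close(wfd);
        _exit(0);
    }

    int main(void)
    {
        int a[2], b[2];

        if (pipe(a) == -1 || pipe(b) == -1) {
            perror("pipe");
            return 1;
        }
        if (fork() == 0) {
            close(a[0]); close(b[0]); close(b[1]);
            producer(a[1], "left");
        }
        if (fork() == 0) {
            close(b[0]); close(a[0]); close(a[1]);
            producer(b[1], "right");
        }
        close(a[1]);
        close(b[1]);

        /* The merge node proper: copy whichever input is ready to
         * the downstream side; stop when both inputs hit EOF. */
        int in[2] = { a[0], b[0] };
        int live = 2;

        while (live > 0) {
            fd_set rfds;
            int maxfd = -1;

            FD_ZERO(&rfds);
            for (int i = 0; i < 2; i++) {
                if (in[i] >= 0) {
                    FD_SET(in[i], &rfds);
                    if (in[i] > maxfd)
                        maxfd = in[i];
                }
            }
            if (select(maxfd + 1, &rfds, NULL, NULL, NULL) == -1) {
                perror("select");
                break;
            }
            for (int i = 0; i < 2; i++) {
                if (in[i] >= 0 && FD_ISSET(in[i], &rfds)) {
                    char buf[256];
                    ssize_t n = read(in[i], buf, sizeof buf);

                    if (n <= 0) {   /* EOF: this input is done */
                        close(in[i]);
                        in[i] = -1;
                        live--;
                    } else {
                        write(STDOUT_FILENO, buf, n);
                    }
                }
            }
        }
        while (wait(NULL) > 0)
            ;
        return 0;
    }

Even this toy needs a fair bit of bookkeeping for just two inputs,
which may be part of why pipelines at the shell stayed linear.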
As an aside, Morrison has a web page dedicated to "flow-based
programming", which he claims to have invented in the late 1960s. That
seems like a bit of a tall claim, and I'd wager Doug gives him a run
for his money on that.
https://jpaulm.github.io/fbp/
- Dan C.