All-in-one vs pipelined sorts brought to mind NSA's undeservedly obscure dataflow language, POGOL, https://doi.org/10.1145/512927.512948 (POPL 1973). In POGOL one wrote programs as collections of routines that communicated via named files, which the compiler did its best to optimize away. Often this amounted to loop jamming or to the distributive law for map over function composition. POGOL could, however, handle general dataflow programming including feedback loops.
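To make the fusion concrete--this is my own sketch, not POGOL--the distributive law says map f . map g = map (f . g): two mapping passes connected by an intermediate file collapse into one pass with no intermediate at all.

```python
# Sketch of the rewrite a fusing compiler performs (illustrative only).

def square(x):
    return x * x

def double(x):
    return 2 * x

xs = range(5)

# Pipelined form: an intermediate "file" (here, a list) between stages.
intermediate = [square(x) for x in xs]
two_pass = [double(y) for y in intermediate]

# Fused form: map double . map square == map (double . square),
# a single pass with no intermediate storage.
one_pass = [double(square(x)) for x in xs]

assert two_pass == one_pass  # both are [0, 2, 8, 18, 32]
```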

One can imagine a program for pulling the POGOL trick on a shell pipeline. That could accomplish--at negligible cost--the conversion of a cheap demo into a genuine candidate for intensive production use.
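One toy way to picture such a fuser--again my sketch, with invented stage names, not an actual tool--is to represent each pipeline stage as a transformer of a lazy stream; "fusing" the pipeline is then just composing the transformers, so the whole thing runs as a single pass in a single process instead of several processes joined by pipes.

```python
# Hypothetical sketch: shell-like stages as generator transformers,
# fused by composition into one single-process pass.

def grep(pattern):
    # Analogue of: grep pattern
    return lambda lines: (l for l in lines if pattern in l)

def tr_upper():
    # Analogue of: tr a-z A-Z
    return lambda lines: (l.upper() for l in lines)

def fuse(*stages):
    # Compose the stages; nothing runs until the result is consumed.
    def pipeline(lines):
        for stage in stages:
            lines = stage(lines)
        return lines
    return pipeline

data = ["alpha", "beta", "gamma", "abacus"]
# Analogue of: ... | grep b | tr a-z A-Z
out = list(fuse(grep("b"), tr_upper())(data))
assert out == ["BETA", "ABACUS"]
```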

This consideration spurs another thought. Despite Unix's claim to build tools to make tools, only a relatively narrow range of higher-order tools that take programs as data ever arose. After the bootstrapping B, there were a number of compilers, most notably C, plus f77, bc, ratfor, and struct. A slight variant on the idea of compiling was the suite of troff preprocessors.

The shell also manipulates programs by composing them into larger programs.

Aside from such examples, only one other category of higher-order Unix program comes to mind: Peter Weinberger's lcomp for instrumenting C programs with instruction counts.

An offshoot of Unix was Gerard Holzmann's tools for extracting model-checker models from C programs. These saw use at Indian Hill and most notably at JPL, but never appeared among mainstream Unix offerings. Similar tools exist in-house at Microsoft and elsewhere. But generally speaking we have very few kinds of programs that manipulate programs.

What are the prospects for computer science advancing to a stage where higher-order programs become commonplace? What might be in one's standard vocabulary of functions that operate on programs?

Doug