I’ve never heard of a Computer Science or Software Engineering program
that included a ‘case study’ component, especially for Software Development &
Projects.
MBA programs feature an emphasis on real-world ‘case studies’, learning from successes
& failures,
to give students a chance of avoiding the same traps.
Creating Unix V6, because it profoundly changed computing & development,
would seem an obvious Case Study for many aspects of Software, Coding and Projects.
There have been many descriptive treatments of Initial Unix,
but I’ve never seen a Case Study,
with explicit lessons drawn, possibly leading to metrics to evaluate Project progress
& the coding process.
The developers of Initial Unix were arguably 10x-100x more productive than those of IBM
OS/360, a ‘best practice’ development at the time,
so what CSRC did differently is worth close examination.
I’ve not seen an examination of the role of the ‘capability’ of individual contributors,
the collaborative, collegial work environment
and the ‘context’: a well-funded organisation not dictating deadlines or product
specifications to its researchers.
USG, then USL, worked under ‘normal commercial’ management pressure for deadlines,
features and specifications.
The CSRC/1127 group did have an explicit approach & principles for what they did and
how they worked,
publishing a number of books & papers on them - nothing they thought or did is secret
or unavailable for study.
Unix & Unix tools were deliberately built with explicit principles, such as “Less is
More”.
Plan 9 was also built on explicit Design principles.
The two most relevant lessons I draw from Initial Unix are:
- the same as Royce's original “Software Waterfall” paper,
“build one to throw away” [ albeit, many, many iterations of the kernel & other code
]
- Writing Software is part Research, part Creative ‘Art’:
It’s Done when it's Done, invention & creation can’t be timetabled.
For the most reliable, widely used Open Source projects,
the “Done when it’s Done” principle is universally demonstrated.
I’ve never seen a large Open Source project succeed when attempting to use modern “Project
Management” techniques.
These Initial Unix lessons, if correct and substantiated, should cause a revolution in the
teaching & practice
of Professional Programming, i.e. Programming In the Large, for both CS & SW.
There are inherent contradictions within the currently taught Software Project Management
Methodologies:
- Individual capability & ability is irrelevant.
The assumption is that ‘programmers’ are fungible, identical units - all equally able to
solve any problem.
Clearly incorrect: course evaluations / tests demonstrate at least a 100x variance in
ability in every software dimension.
- Team environment, rewards & penalties and corporate context are irrelevant.
Perverse incentives are widely evident - the cause of many, or all, “Death Marches”.
- The “Discovery & Research Phases” of a project are timetabled, an impossibility.
Any suggestions for Case Studies gratefully accepted.
===========
Professions & Professionals must learn over time:
there’s a negative aspect (don’t do this) and positive aspect (do this) for this
deliberate learning & improvement.
Negatives are “Don’t Repeat, or Allow, Known Errors, Faults & Failures”
plus in the Time Dimension, “Avoid Delays, Omissions and Inaction”.
Positives are what’s taught in Case Studies in MBA courses:
use techniques & approaches known to work.
Early Unix, from inception to the CACM papers, 1969 to 1974, took probably 30 man-years,
and produced a robust, performant and usable system for its design target, “Software
Development”.
Compare this directly with Fred Brooks's IBM OS/360 effort around 5 years earlier:
it consumed 3,000-4,000 man-years,
was known for bugs and poor & inconsistent code quality, needed large resources to run
and was, politely, non-performant.
This was a commercial O/S, built by a capable, experienced engineering organisation,
betting their company on it,
who assigned their very best to the hardware & software projects. It was “Best of
Breed” then, possibly also now.
MULTICS had multiple business partners, without the same, single focus or commercial
imperative.
I don’t believe it’s comparable to either system.
Initial Unix wasn’t just edit, compile & run, but filesystems, libraries, debugging
& profiling tools, language & compiler construction tools, ‘man’ pages, document
prep (nroff/troff) and 'a thousand' general tools leveraging shell / pipe.
This led directly to modern toolchains, config, make & build systems, Version Control,
packaging systems, and more.
Nothing of note is built without using descendants or derivatives of these early
toolchains.
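The ‘small tools leveraging shell / pipe’ style is easy to demonstrate. A minimal
sketch, using only standard POSIX utilities (illustrative only, not CSRC code): a
word-frequency counter built entirely from single-purpose tools joined by pipes.

```shell
# Word-frequency counter in the small-tools-plus-pipes style:
# each stage does exactly one job; the pipeline composes them.
printf 'to be or not to be\n' |
  tr ' ' '\n' |   # split the input into one word per line
  sort |          # bring identical words together
  uniq -c |       # count each run of identical words
  sort -rn        # list the most frequent words first
```

Each stage can be replaced or extended independently - the composition, not any single
tool, carries the design.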
All this is wrapped in many Standards, necessary for portable systems, even between
systems based on the same platform, kernel and base system.
The “Tower of Babel” problem is still significant & at times insurmountable, even in
C-to-C & Linux-to-Linux migration,
but without the POSIX/IEEE standards the “Software Tarpit” and “Desert of Despair”
would’ve been inescapable.
The early Unix system proved adaptable and extensible to many other environments, well
beyond “Software Development”.
===========
[ waterfall model ]
Managing the development of large software systems: concepts and techniques
W. W. Royce, 1970 [ free access ]
<https://dl.acm.org/doi/10.5555/41765.41801>
STEP3: DO IT TWICE, pg 334
After documentation, the second most important criterion for success revolves around
whether the product is totally original.
If the computer program in question is being developed for the first time,
arrange matters so that the version finally delivered to the customer for operational
deployment
is actually the second version insofar as critical design/operations areas are
concerned.
===========
Plan 9, Design
<https://9p.io/sys/doc/9.html>
The view of the system is built upon three principles.
First, resources are named and accessed like files in a hierarchical file system.
Second, there is a standard protocol, called 9P, for accessing these resources.
Third, the disjoint hierarchies provided by different services are joined together into a
single private hierarchical file name space.
The unusual properties of Plan 9 stem from the consistent, aggressive application of these
principles.
===========
Escaping the software tar pit: model clashes and how to avoid them
Barry Boehm, 1999 [ free access ]
<https://dl.acm.org/doi/abs/10.1145/308769.308775#>
===========
The Mythical Man-Month: Essays on Software Engineering,
Anniversary Edition (2nd Edition)
Fred Brooks
Chapter 1. The Tar Pit
Large-system programming has over the past decade been such a tar pit, and many great and
powerful beasts have thrashed violently in it.
===========
--
Steve Jenkin, IT Systems and Design
0412 786 915 (+61 412 786 915)
PO Box 38, Kippax ACT 2615, AUSTRALIA
mailto:sjenkin@canb.auug.org.au
http://members.tip.net.au/~sjenkin