On Mon, Feb 17, 2020 at 04:17:18PM -0800, Jon Steinhart wrote:
Richard Salz writes:
'The problem is that the ecosystem has been fragmented by people doing
their "documentation" in their preferred formats instead of in a common
(man) format.'
Damn those unauthorized developers. How dare they write code that doesn't
meet standards.
Get off my lawn.
The relevant TUHS part of it, which maybe some folks here can speak to,
is: how did UNIX remain so cohesive for so long? How were decisions
made? Of course, this started to fall apart with System III and such,
as things got more clunky.
I think part of it was that machines were small, in both memory and
disk. I did a huge programming project because I wanted to compress the
pathalias output; I had 20 users on a 40MB disk. So big == evil.
The other thing, if we're talking about kernels, is that uniprocessor
kernels were pretty simple to understand compared to SMP, NUMA, the PCI
device tree, and the million other things that modern computers have.
v6 was documented in the Lions book; you could read it all and
understand it in maybe a week or two. That's not a thing any more.
I've probably said this before, but today I see way too much "string
theory programming". What I mean by that is the "I have an idea, so
I'll just start my own universe that doesn't play well with others
rather than extending the existing ecosystem" model. That's my beef
with texinfo; there was already an existing, functional system, and
rather than making some improvements to it, a new incompatible universe
was created.
Yeah, you need a dictator who says that's not OK.
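
For anyone who hasn't written one, here's roughly what that common
(man) format looks like. This is just a sketch using the standard
man(7) macros, for a made-up hello(1) command:

  .TH HELLO 1 "February 2020" "hello 1.0" "User Commands"
  .SH NAME
  hello \- print a friendly greeting
  .SH SYNOPSIS
  .B hello
  .RI [ name ]
  .SH DESCRIPTION
  .B hello
  writes a greeting to standard output, addressed to
  .I name
  if one is given.

Run that through nroff -man (or man ./hello.1 on a modern system) and
it renders like every other man page on the box. One set of macros,
shared by everybody; that's the ecosystem that got fragmented.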