On Tue, May 08, 2018 at 08:31:43AM -0700, Steve Johnson wrote:
> Well, as I look to the future I see the whole approach we have to
> software running into a dead end.  In fact, I think software is
> holding us back.  What's holding us back is our human minds' ability
> to manage complexity.
>
> Starting about 2000, this changed.  Hardware was no longer offering
> increased speed.  But what it was offering was massive parallelism.
> The response was to cling to the one instruction at a time model,
> introducing multicore and its attendant hardware complexity to try to
> cling to the previous model of programming.  The hardware to make
> this possible is expensive and does not scale.
>
> My company, Wave Computing, has built a chip with 16,000 8-bit
> processors on it.  And we have plans to build systems with up to a
> quarter million processors.  We are breaking ground in new ways to
> use hundreds of processors to solve problems very quickly.  It's a
> new way of thinking, and it makes your brain hurt.
And that's the problem. If we can't program the new machines, whether
it's Sun's attempt at a large number of weak cores with their Niagara
chips, or your company's vast number of 8-bit processors --- as the
old saying goes, without software, it's just a doorstop.
Moving the discussion back to Unix, the problem is that we don't have
a lot of solutions to the complexity wall. The most common one seems
to be to deny the existence, or the validity, of the requirements
(e.g., "Teach them all English; if 7-bit ASCII was good enough for
K&R, it's good enough for me," or "Text files are all you need; why do
you need new-fangled things like group calendaring or spreadsheets?").
Or you can just build on top of pre-existing systems, but then you get
complaints about kludge towers and overly complex systems.
Or you can give up on generality, and build a bespoke system that only
addresses the problem you are paid to solve. Which may or may not be
cheaper than using the more general solution and brute-forcing the
problem by throwing more hardware at it, on the theory that hardware
is cheaper than software engineering skills.
> ... In the old days, software ideas (like floating point, index
> registers, caches, etc.) proved their worth and were put into
> hardware to make them faster.  I think around 1980 this stopped
> happening.
This is not strictly true. A counter-example would be Google's Tensor
Processing Units[1]. That's an example of custom hardware replacing
what had previously been done in pure software, or on more general
hardware units (e.g., GPUs). Another example would be AES and
SHA-256 acceleration being done in hardware in multiple CPU
architectures, and in the data path between the CPU and the eMMC flash
on some cell phones.
[1] https://techcrunch.com/2017/05/17/google-announces-second-generation-of-ten…
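To make the hardware-crypto point a bit more concrete, here is a small
illustrative sketch of my own (not part of the original exchange): a C
program that checks whether the x86-64 Linux machine it runs on
advertises the AES and SHA instruction-set extensions, by scanning the
"flags" line of /proc/cpuinfo. The "aes" and "sha_ni" names are the
flags current x86-64 Linux kernels report; other architectures use
different names (ARMv8 lists "aes" and "sha2" in its Features line).

    #include <stdio.h>
    #include <string.h>

    /* Return 1 if the given flag appears as a whole word in the
     * CPU "flags" line of /proc/cpuinfo, else 0. */
    static int cpu_has_flag(const char *flag)
    {
        char line[4096];
        FILE *fp = fopen("/proc/cpuinfo", "r");
        int found = 0;

        if (!fp)
            return 0;
        while (!found && fgets(line, sizeof(line), fp)) {
            if (strncmp(line, "flags", 5) != 0)
                continue;
            char *p = line;
            size_t len = strlen(flag);
            /* Look for the flag as a space-delimited word, not a
             * substring of some longer flag name. */
            while ((p = strstr(p, flag)) != NULL) {
                int starts = (p == line || p[-1] == ' ' ||
                              p[-1] == '\t' || p[-1] == ':');
                int ends = (p[len] == ' ' || p[len] == '\n' ||
                            p[len] == '\0');
                if (starts && ends) {
                    found = 1;
                    break;
                }
                p += len;
            }
        }
        fclose(fp);
        return found;
    }

    int main(void)
    {
        printf("AES-NI:         %s\n",
               cpu_has_flag("aes") ? "yes" : "no");
        printf("SHA extensions: %s\n",
               cpu_has_flag("sha_ni") ? "yes" : "no");
        return 0;
    }

Whether the kernel's crypto layer or a userspace library actually uses
those instructions is a separate question, of course; this only shows
that the CPU claims to have them.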
I do think it is true that we have hit the point of diminishing
returns: as software gets more complex, finding functionality that is
commonly used enough, and that benefits enough, to be worth the
engineering effort and cost of moving it into hardware is much harder
now than it was a few decades ago.
- Ted