On Fri, Mar 23, 2018 at 3:28 PM, Bakul Shah <bakul(a)bitblocks.com> wrote:
By now most major systems have been computerized: banks,
govt, finance, communication, shipping, various industries,
research, publishing, medicine, etc. Will the critical
systems within each area have as many resources as & when
needed as the weather forecasting system Tim is talking about?
[Of course, the same question can be asked in relation to
the conversion I am wondering about!]
I suspect we agree more than we disagree.
I offer the following observation. Except for high-end HPC, particularly
DoD, DoE, and big-science types of applications, there has been a
'Christensen'-style disruption where a 'worse' technology was created and
loved by a new group of users, and that new technology eventually got better
and replaced (disrupted) the earlier one (banks/finance were COBOL - now
it's Oracle and the like, SAP et al.; communications was SS7 over custom HW,
now it's IP running on all sorts of stuff). The key is the disruptor was on
an economic curve that made it successful.
But high-end HPC is the same people, doing the same things they did before - the
difference is that the data sets are larger, there is a need for better precision,
and different data representations (e.g., graphics). Again, the math has not
changed. And I don't see a new customer for that style of application,
which is what is needed to basically bankroll the (originally
less 'good') replacement technology. The economics aren't there to replace it - at
least so far.
The idea of an 'underserved' or 'missing middle' market for HPC has been
discussed for a while. I used to believe it. I'm not so sure now.
Which brings this back to UNIX. Linux is the UNIX disruptor - which is
great. Linux keeps 'UNIX' alive and getting better. I don't see an
economic reason to replace it, but who knows. Maybe that's what the
good folks at Google, Amazon, Intel, or some university are doing. But so
far, the economics are not there.
Clem