[try-II]
On Fri, Mar 23, 2018 at 6:43 AM, Tim Bradshaw <tfb(a)tfeb.org> wrote:
On 22 Mar 2018, at 21:05, Bakul Shah
<bakul(a)bitblocks.com> wrote:
I was thinking about a similar issue after reading Bradshaw's
message about FORTRAN programs being critical to his country's
security. What happens in 50-100 years when such programs have
been in use for a long time but none of the original authors
may be alive? The world may have moved on to newer languages
and there may be very few people who study "ancient" computer
languages and even they won't have in-depth experience to
understand the nuances & traps of these languages well enough.
No guarantee that FORTRAN will be in much use then! Will it be
like in science fiction where ancient spaceships continue
working but no one knows what to do when they break?
My experience of large systems like this is that this isn't how they work
at all. The program I deal with (which is around 5 million lines naively
(counting a lot of stuff which probably is not source but is in the source
tree)) is looked after by probably several hundred people. It's been
through several major changes in its essential guts and in the next ten
years or so it will be entirely replaced by a new version of itself to deal
with scaling problems inherent in the current implementation. We get a new
machine every few years onto which it needs to be ported, and those
machines have not all been just faster versions of the previous one, and
will probably never be so.
What it doesn't do is to just sit there as some sacred artifact which
no-one understands, and it's unlikely ever to do so. The requirements for
it to become like that would be at least that the technology of large-scale
computers was entirely stable, compilers, libraries and operating systems
had become entirely stable and people had stopped caring about making it do
what it does better. None of those things seems very likely to me.
(Just to be clear: this thing isn't simulating bombs: it's forecasting the
weather.)
+1 - exactly my point.
We have drifted a bit from pure UNIX, but I actually do think this is
relevant to UNIX history. Once UNIX started to run on systems targeting
HPC loads, where Fortran was the dominant programming language, UNIX quickly
displaced custom OSs and became the dominant target, even if at the
beginning of that transition the 'flavor' of UNIX did vary (we probably can
and should discuss how that happened and why independently -- although
I will point out that the UNIX/Linux implementation running at, say, LLNL !=
the version running at, say, NASA Moffett). And the truth is that today, for
small experiments you probably run Fortran on Windows on your desktop. But
for 'production' the primary OS for Fortran is a UNIX flavor of some type,
and it has been that way since the mid-1980s - really starting with the UNIX
wars of that time.
As I also have said here and elsewhere, HPC and very much its
lubricant, Fortran, are not something 'academic CS types' like to study
these days - even though Fortran (HPC) pays my salary and many of ours.
Yet it runs on the system those same academic types all prefer -
*i.e.* Ken and Dennis's ideas. The primary difference is the type of
program the users are running. But Ken and Dennis's ideas work well for
almost all users and span specific application markets.
Here is a picture I did a few years ago for a number of Intel execs. At the
time I was trying to explain to them that HPC is not a single style of
application, and also to help them understand that there are two types of
value - the code itself and the data. Some markets (*e.g.* financial) use
public data, but the methods they use to crunch it (*i.e.* the codes) are
private, while other market segments might have private data (*e.g.* oil
and gas) but different customers use the same or similar codes to crunch it.
For this discussion, think about how much of the code I show below is
complex arithmetic - while much of it is searching, Google style, a lot is
just plain nasty math. The 'nasty math' has not changed over the years, and
thus those codes are dominated by Fortran.
[Note Steve has pointed out that with AI maybe the math could change in
the future - but certainly so far, the history of these markets is basically
differential equation solvers].
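[Editorial aside: as a minimal illustration of the 'differential equation solver' style of code described above - not taken from any of the actual codes discussed, and sketched in Python rather than the Fortran these production codes actually use - here is an explicit finite-difference step for the 1D heat equation u_t = alpha * u_xx, the archetypal PDE kernel:

```python
def heat_step(u, alpha, dx, dt):
    """Advance one explicit finite-difference time step.

    Interior points are updated from their neighbors; the two
    boundary values are held fixed (Dirichlet boundaries).
    """
    r = alpha * dt / dx**2  # stability requires r <= 0.5
    return [u[0]] + [
        u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1])
        for i in range(1, len(u) - 1)
    ] + [u[-1]]

# Start from a spike of heat in the middle; it diffuses outward
# a little more on every step.
u = [0.0, 0.0, 1.0, 0.0, 0.0]
for _ in range(10):
    u = heat_step(u, alpha=1.0, dx=1.0, dt=0.4)
```

Real HPC codes do the same kind of stencil arithmetic over billions of grid points in three dimensions, which is exactly the workload Fortran compilers have been tuned for over decades.]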
As Tim says, I really cannot see that changing, and the reason (I believe)
is that I do not see any compelling economic reason to do so.
Clem