On Mar 22, 2018, at 2:35 PM, Clem Cole <clemc(a)ccc.com> wrote:
The question (and a good one) is: if you are not 'a Fortran person,' are you
going to be able to understand it well enough to not do damage to it if you modify it?
Which is of course the crux of the question Bakul is asking.
This is indeed the case, but I am asking not just about
Fortran. Will we still be programming in C/C++/Java/Fortran
in 50-100 years? That is, are we going through the
Cambrian Explosion of programming languages now, one that will
settle down in a few decades, or have we only just started?
I suspect it is not as bad as the science fiction movies
profess, because the codes have matured over time, which is not what happened with Y2K.
Are there 'dusty decks'? Sure - but they are not quite so dusty as one might
think, and as importantly, the code we care about is and has been in continuous use for
years. So there has been a new generation of programmers that took over the maintenance of
it.
Perhaps a more important question is what percentage of programs are
important enough to still be around in 50-100 years.
On Mar 23, 2018, at 3:43 AM, Tim Bradshaw <tfb(a)tfeb.org> wrote:
My experience of large systems like this is that this isn't how they work at all.
The program I deal with (which is around 5 million lines, counted naively, including a lot of stuff
which probably is not source but is in the source tree) is looked after by probably
several hundred people. It's been through several major changes in its essential
guts and in the next ten years or so it will be entirely replaced by a new version of
itself to deal with scaling problems inherent in the current implementation. We get a new
machine every few years onto which it needs to be ported, and those machines have not all
been just faster versions of the previous one, and will probably never be so.
By now most major systems have been computerized: banks,
government, finance, communication, shipping, various industries,
research, publishing, medicine, etc. Will the critical
systems within each area have as many resources, as and when
needed, as the weather forecasting system Tim is talking about?
[Of course, the same question can be asked in relation to
the conversion I am wondering about!]
On Mar 23, 2018, at 8:51 AM, Ron Natalie <ron(a)ronnatalie.com> wrote:
A core package in a lot of the geospatial applications is an old piece of mathematical
code originally written in Fortran (probably in the sixties). Someone, probably in the
80’s, recoded the thing pretty much line for line (maintaining the horrendous F66 variable
names etc…) into C. It’s probably ripe for a jump to something else now.
We’ve been through four major generations of the software. The original was all VAX
based with specialized hardware (don’t know what it was written in). We followed that
with a portable UNIX version (mostly Suns, but ours worked on SGI, Ardent, Stellar, various
IBM AIX platforms, Apollo DN1000’s, HP, DEC Alphas). This was primarily a C application.
Then right about the year 2000, we jumped to C++ on Windows. Subsequently it got
backported to Linux. Yes, there are some modules that have been unchanged for decades,
but the system on the whole has been maintained.
I wonder if we will continue doing this sort of ad hoc
but expensive rewrite for a long time....
[This may be somewhat relevant to TUHS, from a future
historian's perspective :-)]