On Sep 1, 2017, at 11:15 AM, Steve Johnson
<scj@yaccman.com> wrote:
> I may just be a grumpy old fart, but I think
> programming languages today are holding us back. Nearly all of them...
I think the problem is much wider than just programming languages.
Most of our PLs, programs & tools are built for a single machine; with a few
exceptions, our compilers, linkers & debuggers handle only that case well.
This is a poor fit for a major class of applications today: service software. A
service is typically composed of a number of loosely coupled programs running on a
set of networked machines, and managing this is done in a very ad hoc & complicated
manner (e.g. Kubernetes). A service is in essence a distributed program, but our
tools don't make it easy to compose or debug such programs. Things like Docker,
containers & jails are complicated ways of dealing with the problems we run into
around security, fault isolation, resiliency, scalability, etc.
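To make that concrete, here is a minimal Go sketch of the glue one ends up
hand-rolling just to keep two cooperating services honest. The service names,
addresses, and the /healthz endpoint are all hypothetical; the point is that
none of this is provided by the language or its toolchain:

    // glue.go: hand-rolled supervision for two loosely coupled services.
    // All names, addresses, and endpoints here are made up.
    package main

    import (
            "log"
            "net/http"
            "time"
    )

    var services = map[string]string{
            "frontend": "http://10.0.0.1:8080/healthz",
            "billing":  "http://10.0.0.2:8081/healthz",
    }

    func main() {
            client := &http.Client{Timeout: 2 * time.Second}
            // Poll each service every 5 seconds and log anything amiss.
            for range time.Tick(5 * time.Second) {
                    for name, url := range services {
                            resp, err := client.Get(url)
                            if err != nil {
                                    // Now what? Restart it? Page someone?
                                    log.Printf("%s unreachable: %v", name, err)
                                    continue
                            }
                            resp.Body.Close()
                            if resp.StatusCode != http.StatusOK {
                                    log.Printf("%s unhealthy: %s", name, resp.Status)
                            }
                    }
            }
    }

Every team reinvents some variant of this loop; that gap is roughly what
Kubernetes and friends fill, in their own ad hoc way.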
At the other extreme, dealing with GPUs or specialized processors with zillions of
cores, each with a small amount of memory, is also painful and ad hoc.
In this context, even if you prove a single component bug-free, how can you have
confidence in the whole system? It is well-nigh impossible to test the gazillion ways
a system can fail, at small scale or large (e.g. a Java-based component runs out of
file descriptors, a packet gets misrouted, or the firewall rules are not quite right).
Moving to something cloud-like would be an improvement over Docker & VMs, but that
still leaves the composition aspect open.
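The file-descriptor case, at least, is easy to provoke in isolation. A small Go
sketch of that kind of fault injection (Unix-only; the limit of 16 is arbitrary):

    // fdlimit.go: artificially lower the fd limit and watch a
    // healthy-looking component start failing. Unix-only.
    package main

    import (
            "fmt"
            "os"
            "syscall"
    )

    func main() {
            // Lower the soft and hard limits on open file descriptors.
            lim := syscall.Rlimit{Cur: 16, Max: 16}
            if err := syscall.Setrlimit(syscall.RLIMIT_NOFILE, &lim); err != nil {
                    fmt.Println("setrlimit:", err)
                    return
            }
            var open []*os.File
            for i := 0; ; i++ {
                    f, err := os.Open("/dev/null")
                    if err != nil {
                            // The error a "proven correct" component
                            // still has to survive in production.
                            fmt.Printf("open #%d failed: %v\n", i, err)
                            break
                    }
                    open = append(open, f)
            }
            for _, f := range open {
                    f.Close()
            }
    }

A component can pass every unit test and still fall over here; compose dozens of
such components and the ways this can happen multiply.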
Now, there likely isn't a single silver bullet for all of this, but our present
systems are far more complicated than they need to be.
On Sep 1, 2017, at 11:15 AM, Steve Johnson
<scj@yaccman.com> wrote:
> "Toby Thain: We will never reach a point where programming language
evolution stops, imho."
>
> I may just be a grumpy old fart, but I think
> programming languages today are holding us back. Nearly all of them...
>
> I'm currently working for a hardware company (Wave Computing). We are building
> a chip with 16K 8-bit processors on it, and have plans to build systems with up to 1/4
> million processors from these chips.
>
> Nevertheless, most programs today are still written pretty much like they were 25
> years ago. And they are, for the most part, based on threads where the programming task
> is to set out a number of steps: do this, do that, do something else, test this and if
> true do this, ... A single serial thread. Things like multicore CPUs are a desperate
> attempt to preserve this model while the hardware world has blown past us.
>
> Recall that parallelism is the natural state of hardware. It takes effort to make
> things work sequentially. In the old days, when hardware and software used pretty much
> the same model, many if not most of the hardware innovations came from first being done in
> software, and then moved into hardware -- index registers, floating point, caches, etc.
> etc. That process has effectively stopped. The single-thread model simply no longer
> fits the sweet spot of today's hardware technology.
>
> Just to underscore how far hardware has advanced: if cars had become cheaper
> and faster at the same rate as computers from 1970 to today, we could buy 1,000
> Tesla Model S's for a penny and they would go 0-60,000 mph! A petabyte of data, if
> punched onto punch cards, would make a card deck whose height would be 6 times the
> distance to the moon. If the recent estimate of the number of bytes of data produced
> by the human race every day (2.5 quintillion bytes) is correct, when punched up that
> card deck would be 9 times the distance to the sun.
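(A quick check of the petabyte figure, assuming a standard 80-column card holds
80 bytes and is about 0.007 in, i.e. 0.178 mm, thick: 10^15 bytes / 80 bytes per
card = 1.25 x 10^13 cards; at 0.178 mm each, the deck stands about 2.2 x 10^9 m,
roughly 2.2 million km, and the moon is about 384,000 km away, so ~6x indeed.)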
>
> I'm not saying that there isn't a place for languages like Go and Python.
> Most people will continue to think serially and design things that way. But when it comes
> to implementing these designs, the current "systems" languages are left at the
> starting gate. In the same way that we invented abstraction methods like functions and
> processes for the old computers, we need to invent newer abstraction methods that go far
> beyond coroutines, threads, and message passing. If we get bogged down in telling tens
> of thousands of processors "do this, do that" we will perish long before our
> program works. Of particular relevance is the role that abstractions play in debugging:
> they partition the job into pieces with known interfaces and behavior that can be tested
> in isolation before being assembled into complete systems.
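For a sense of the ceiling being described here, a minimal Go sketch of about as
far as today's mainstream abstractions go: fan-out/fan-in message passing over
channels. The worker count and the toy squaring "work" are placeholders; note it
is still "do this, do that", just replicated N times:

    // fanout.go: N workers, a channel of tasks, a channel of results.
    package main

    import (
            "fmt"
            "sync"
    )

    func main() {
            const workers = 8
            tasks := make(chan int)
            results := make(chan int)

            // Fan out: each worker squares tasks until the channel closes.
            var wg sync.WaitGroup
            for w := 0; w < workers; w++ {
                    wg.Add(1)
                    go func() {
                            defer wg.Done()
                            for t := range tasks {
                                    results <- t * t
                            }
                    }()
            }
            // Feed 100 tasks, then close.
            go func() {
                    for i := 0; i < 100; i++ {
                            tasks <- i
                    }
                    close(tasks)
            }()
            // Close results once all workers are done.
            go func() {
                    wg.Wait()
                    close(results)
            }()

            // Fan in: drain the results.
            sum := 0
            for r := range results {
                    sum += r
            }
            fmt.Println("sum of squares 0..99 =", sum)
    }

This works fine at 8 workers; telling 16K processors what to do this way is where
it stops scaling as a mental model.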
>
> Yes, I have some ideas (and not much time to work on them...) but, even if I had a
> perfect solution available today, I suspect it would take decades before it caught on.
> In order to accept a solution, you first have to believe there is a problem...
>
> Steve
>
>
>
> ----- Original Message -----
> From: "Toby Thain" <toby@telegraphics.com.au>
>
>
> We will never reach a point where programming language evolution stops,
> imho.
>
> --T
>