"From: "Norman Wilson" <norman(a)oclsc.org>
I'm (still) with Larry Flon on this one:
There does not now, nor will there ever, exist a programming language
in which it is the least bit hard to write bad programs."
Well, there's bad and there's bad. Programs that are the result of
flawed thinking or a misunderstanding of the language or the
environment can't be helped by a programming language. Programs with
a well thought-out and appropriate set of concepts and structures can
be hard to write if the language doesn't support these concepts and
structures. That was the point of Brian's note. If the language
doesn't support what you want to do, danger may lie ahead.
A particularly vivid example for me is string handling -- I mean real
strings, with concatenation and the ability to add characters onto the
beginning, middle or end of the string. At one point in the past,
Bell Labs and Xerox Parc were two groups that had produced some
impressive programs. I spent some time at Parc, and ended up envious
of the ability of LISP programmers to dynamically add on to almost
anything with a high degree of safety. This really changed the way
one thought about some programs. Doing strings in C usually involves
pointers pointing at small things, which, given how easy it is to
index past the end, is like playing with razor blades. C++
allowed you to surround such things with safety nets, but sometimes
the performance impact was unfortunate. A similar example is the
ability to grow an array. This plays very badly with C's pointers --
if you move the array, then anybody pointing to it (and, often, there
are many such) has a stale pointer-bomb that could go off at any time
without warning. If you don't move the array, but add on additional
pieces, then it is no longer contiguous, so some operations (like
scanning all the elements) get very complicated as well.
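As an aside, here is a minimal C sketch (mine, not part of the original
note, and deliberately simplified, with error checks omitted) of the
stale-pointer bomb described above: growing an array with realloc may
move it, and any pointer saved before the move quietly goes bad.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void) {
        char *buf = malloc(8);          /* a small growable buffer     */
        strcpy(buf, "hello");
        char *p = buf;                  /* someone keeps a pointer in  */

        buf = realloc(buf, 1 << 20);    /* grow it; the block may move */

        /* If realloc moved the block, p now points at freed memory.
         * Using it is undefined behavior: it may appear to work,
         * crash, or quietly corrupt data -- the bomb going off
         * without warning. */
        printf("%s\n", p);
        free(buf);
        return 0;
    }

The sort of C++ safety nets mentioned above (std::string or std::vector,
say) manage the reallocation for you, but pointers, references, and
iterators into them go stale in exactly the same way when the container
grows.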
I don't think there would be much argument that writing a garbage
collector from scratch in plain C is hard and dangerous. Using a
language that has garbage collection built in makes sense, assuming
you can afford the performance penalty. Using a better algorithm is
a better bet, but may be impossible.
The C/C++ model of a single large common address space has caused
problems for parallel programs for decades, first with message passing
and later with multicore. In multicore, the "solution" is
increasingly sophisticated caches and cache coherency circuits that
can end up thrashing and making things slower if not well
understood. And the best solution may well be dependent on the
particular chip.
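To make that concrete, here is a small sketch (mine, not from the
original note; it assumes POSIX threads) of the kind of cache-coherency
thrashing involved, usually called false sharing: two threads update
logically independent counters that happen to sit in the same cache
line, so the hardware bounces that line back and forth between cores.

    #include <pthread.h>
    #include <stdio.h>

    /* Two counters, each private to one thread, but adjacent in
     * memory and therefore very likely in the same cache line. */
    struct { long a; long b; } shared;

    static void *bump_a(void *arg) {
        (void)arg;
        for (long i = 0; i < 100000000; i++)
            shared.a++;                 /* touched only by thread 1 */
        return NULL;
    }

    static void *bump_b(void *arg) {
        (void)arg;
        for (long i = 0; i < 100000000; i++)
            shared.b++;                 /* touched only by thread 2 */
        return NULL;
    }

    int main(void) {
        pthread_t t1, t2;
        pthread_create(&t1, NULL, bump_a, NULL);
        pthread_create(&t2, NULL, bump_b, NULL);
        pthread_join(t1, NULL);
        pthread_join(t2, NULL);
        printf("%ld %ld\n", shared.a, shared.b);
        return 0;
    }

Padding each counter out to its own cache line (or giving each thread a
local counter) typically makes this run much faster without changing
what it computes -- and how much padding is enough is exactly the sort
of thing that depends on the particular chip.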
These are, for the most part, examples of how powerful and useful
models don't fit into C's world view very comfortably. A lot of C's
power, and many of Pascal's deficiencies, stem from the fact that C
doesn't prevent you from doing things that are outside its model, and
Pascal does.
Maybe it would be more precise to say that some languages strictly
enforce their world view, making programs that accord with that world
view easy and others difficult. And other languages loosely enforce
their world view, making it possible to do more than the former
languages, often with performance advantages, but with more danger
that bugs will not be detected.
I guess I like Demer's wording better...
Steve