[restricting to list; strong opinions here]
At 2024-05-24T17:17:53-0700, Bakul Shah via TUHS wrote:
> What would be nice is if programming languages provided some support
> for such exhaustive testing[1].
[rearranging]
> At one point I had suggested turning Go's interface type into
> something like Guttag-style abstract data types, in that relevant
> axioms are specified right in the interface definition.
It's an excellent idea.
> The underlying idea is that each type is in essence a constraint on
> what values an instance of that type can take.
In the simple form of a data type plus a range constraint, that's the
Ada definition of a subtype since day one--Ada '80 or Ada 83 if you
insist on the standardized form of the language.
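(For contrast, the nearest Go idiom I know of is a defined type whose
constructor enforces the constraint; the names below are my own
invention, and the point is precisely that the constraint has to live
in hand-written code rather than in the type declaration.)

package temperature

import "fmt"

// Celsius plays the part of an Ada range-constrained subtype. Go has
// no way to state the range declaratively, so the constraint lives in
// a checked constructor instead.
type Celsius float64

// NewCelsius rejects physically impossible values.
func NewCelsius(v float64) (Celsius, error) {
	if v < -273.15 {
		return 0, fmt.Errorf("%v is below absolute zero", v)
	}
	return Celsius(v), nil
}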
40 years later we have Linus Torvalds tearing up his achievement
certificate in "kinder, gentler email interactions" just to trash the
notion of range checks on data types.[1][2][3]
Naturally, the brogrammers are quick to take Torvalds's side.[4]
Pascal had range checks too, and Kernighan famously punked on Wirth for
that. I'm not certain, but I get the feeling that critique got somewhat
over-interpreted. (To be fair to Kernighan, Pascal _as specced in the
Revised Report of 1973_[5] was in my opinion too weak a language to
leave the lab, for many of the reasons he noted. The inflexible array
typing was fatal, in my view.)
> The idea was that any concrete type that implements that interface
> must satisfy its axioms.
Yes. There is of course much more to the universe of potential
constraints than range checks. Ada 2022 has these in great generality
with "subtype predicates".
http://www.ada-auth.org/standards/22aarm/html/aa-3-2-4.html
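Since the thread started from Go, here is roughly what such a predicate
degenerates to there today (a sketch with invented names, nothing the
language blesses): the predicate is arbitrary code, which is the point;
it need not be a range.

package ident

import (
	"errors"
	"unicode"
)

// Identifier stands in for an Ada 2022 subtype predicate: membership
// is decided by an arbitrary boolean predicate, not merely a range.
type Identifier string

var errBadIdentifier = errors.New("want a letter followed by letters or digits")

// NewIdentifier admits only strings of the form letter (letter|digit)*.
func NewIdentifier(s string) (Identifier, error) {
	if s == "" {
		return "", errBadIdentifier
	}
	for i, r := range s {
		if unicode.IsLetter(r) || (i > 0 && unicode.IsDigit(r)) {
			continue
		}
		return "", errBadIdentifier
	}
	return Identifier(s), nil
}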
> Even if the compiler ignored these axioms,
I don't understand why this idea wasn't seized upon with more force at
the CSRC. The notion of a compiler flag that turned on "extra" (in the
Ritchie compiler circa 1980, perhaps better expressed as "any")
correctness checks could not have been a novelty. NDEBUG and assert()
are similarly extremely old even in Unix.
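(One can even fake NDEBUG in Go today; a throwaway sketch of mine, with
a constant standing in for what would more properly be a build tag.)

package axiom

// enabled plays the role of !NDEBUG. In earnest it would be set by a
// //go:build tag choosing between two tiny files; a constant keeps the
// sketch self-contained. When it is false, the compiler deletes the
// check entirely.
const enabled = true

// Check panics when checks are compiled in and an axiom is violated.
func Check(ok bool, msg string) {
	if enabled && !ok {
		panic("axiom violated: " + msg)
	}
}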
> one can write a support program that can generate a set of
> comprehensive tests based on these axioms.
Yes. As I understand it, this is how Spark/Ada got started. Specially
annotated comments expressing predicates communicated with such a
support program, running much like the sort of automated theorem
prover you characterize below as not "retail".
In the last two revision cycles of the Ada standard (2012, 2022),
Spark/Ada's enhancements have made it into the language--though I am not
certain, and would not claim, that they compose with _every_ language
feature. Spark/Ada started life as a subset of the language for a
reason.
But C has its own subset, MISRA C, so this is hardly a reason to scoff.
> [Right now a type "implementing" an interface only needs to have a
> set of methods that exactly match the interface methods but nothing
> more] The underlying idea is that each type is in essence a
> constraint on what values an instance of that type can take. So
> adding such axioms simply tightens (& documents) these constraints.
> Just the process of coming up with such axioms can improve the design
> (sort of like test-driven design but better!).
Absolutely. Generally, software engineers like to operationalize things
consistently enough that they can then be scripted/automated.
Evidently software testing is so mind-numbingly tedious that the will
to undertake it, even with automation, evaporates.
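To make "operationalize" concrete, here is roughly what I take Bakul to
mean, hand-rolled in today's Go (every name is my own, and real
language support would presumably be declarative rather than this
little harness):

package main

import (
	"fmt"
	"math/rand"
)

// Stack is the abstract data type. A conforming implementation must
// satisfy the axioms below, not merely match these signatures.
type Stack interface {
	Push(x int)
	Pop() (int, bool) // reports false when the stack is empty
	Len() int
}

// The axioms, written as ordinary predicates over a fresh Stack.
var axioms = map[string]func(s Stack, x int) bool{
	"pop after push yields the pushed value": func(s Stack, x int) bool {
		s.Push(x)
		got, ok := s.Pop()
		return ok && got == x
	},
	"push grows Len by exactly one": func(s Stack, x int) bool {
		n := s.Len()
		s.Push(x)
		return s.Len() == n+1
	},
	"pop on an empty stack reports failure": func(s Stack, _ int) bool {
		_, ok := s.Pop()
		return !ok
	},
}

// check is the "support program" in miniature: it drives every axiom
// with generated inputs against fresh instances produced by mk.
func check(mk func() Stack, trials int) error {
	rng := rand.New(rand.NewSource(1))
	for name, ax := range axioms {
		for i := 0; i < trials; i++ {
			if !ax(mk(), rng.Int()) {
				return fmt.Errorf("axiom %q violated", name)
			}
		}
	}
	return nil
}

// sliceStack is one concrete implementation to exercise the harness.
type sliceStack []int

func (s *sliceStack) Push(x int) { *s = append(*s, x) }
func (s *sliceStack) Len() int   { return len(*s) }

func (s *sliceStack) Pop() (int, bool) {
	if len(*s) == 0 {
		return 0, false
	}
	x := (*s)[len(*s)-1]
	*s = (*s)[:len(*s)-1]
	return x, true
}

func main() {
	if err := check(func() Stack { return new(sliceStack) }, 1000); err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println("all axioms hold for sliceStack")
}

Nothing above needs new machinery; the attraction of the proposal, as I
read it, is that the axioms would ride along in the interface
declaration itself and the generated-test harness would come for free.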
> Now it may be that applying this to anything more complex than stacks
> won't work well & it won't be perfect but I thought this was worth
> experimenting with. This would be like functional testing of all the
> nuts and bolts and components that go in an airplane. The airplane
> may still fall apart but that would be a "composition" error!
Yes. And even if you can prove 100% of the theorems in your system, you
may learn to your dismay that your specification was defective.
Automated provers are as yet no aid to system architects.
> [1] There are "proof assistant" or formal spec languages such as
> TLA+, Coq, Isabelle etc. but they don't get used much by the average
> programmer. I want something more retail!
I've had a little exposure to these. They are indeed esoteric, but also
extremely resource-hungry. My _impression_, based on no hard data, is
that increasing the abilities of static analyzers, and the
expressiveness of the predicates that direct them, is much cheaper.
But a lot of programmers will not budge at any cost, and will moreover
be celebrated by their peers for their obstinacy. See footnotes.
There is much work still to be done.
Regards,
Branden
[1]
https://lore.kernel.org/all/202404291502.612E0A10@keescook/
https://lore.kernel.org/all/CAHk-=wi5YPwWA8f5RAf_Hi8iL0NhGJeL6MN6UFWwRMY8L6…
[2]
https://lore.kernel.org/lkml/CAHk-=whkGHOmpM_1kNgzX1UDAs10+UuALcpeEWN29EE0m…
[3]
https://www.businessinsider.com/linus-torvalds-linux-time-away-empathy-2018…
[4]
https://lwn.net/Articles/973108/
[5]
https://archive.org/details/1973-the-programming-language-pascal-revised-re…