Google was founded on this day in 1998, by a couple of uni students who
went on to become millionaires; the verb "to google" has in fact entered
the lexicon, but these days I use DuckDuckGo (remember: you are not
Google's customer, but their product).
There is some confusion over the name "Google"; some say that someone
couldn't spell the mathematical number "googol" (10**100), and others
claim that it was a joke on their part, which I suspect is revisionist
history.
I would appreciate a correction, and need I point out that a web reference
may not cut the mustard?
Thanks.
-- Dave
We gained computer pioneer John Mauchly on this day in 1907; he was best
known as the co-inventor of ENIAC, one of the world's first computers.
-- Dave
On this day in 2007, the Storm worm hit the Internet; it was estimated
that 57 million spams were sent in one day (most likely all from
compromised Windoze boxes, as if there's any other sort).
-- Dave
On Aug 18, 2018, at 12:57 PM, Perry E. Metzger <perry(a)piermont.com> wrote:
>
> Sorry for the thread necromancy, but this is a _very_ important
> topic. Perhaps it doesn't belong on tuhs but rather on coff.
Surely 12 days is not all that long a period?!
> This is a pretty long posting. If you don't care to read it, the TL;DR
> is that formal specification and verification is now a real
> discipline, which it wasn't in the old days, and there are systems to
> do it in, and it's well understood.
>
> On 2018-08-06 at 08:52 -0700, Bakul Shah wrote:
>>
>> What counts as a "formal spec"? Is it like Justice Potter
>> Stewart's "I know it when I see it" definition or something
>> better?
>
> At this point, we have a good definition. A formal specification is a
> description of the behavior of a program or piece of hardware in a
> precise machine-readable form that may be used as the basis for fully
> formal verification of the behavior of an implementation. Often these
> days, the specification is given in a formal logic, such as the
> predicative calculus of inductive constructions, which is the logic
> underlying the Coq system.
What about Denotational Semantics? Or efforts such as Dines
Bjørner's META-IV[1] and later VDM? I'd consider them formal
specification systems even if not machine-readable or fully
automatically verifiable. Perhaps the definition has tightened
up now that we have things like Coq and Lamport's TLA+.
>
> Isabelle/HOL is another popular system for this sort of work. ACL2 is
> (from what I can tell) of more historical interest but it has
> apparently been used for things like the formal verification of
> floating point and cache control units for real hardware. (It is my
> understanding that it has been many years since Intel would dare
> release a system where the cache unit wasn't verified, and the only
> time in decades it tried to release a non-verified FPU, it got the
> FDIV bug and has never tried that again.) There are some others out
> there, like F*, Lean, etc.
>
> Formal specifications good enough for full formal verification have
> been made for a variety of artifacts along with proofs that the
> artifacts follow the specification. There's a fully formally verified
> C compiler called CompCert for example, based on an operational
> semantics written in Coq. There's another formal semantics for C
> written in K, which is a rather different formal system. There's a
> verified microkernel, seL4, whose specification is written in
> Isabelle/HOL. There's a fully formal specification of the RISC V, and
> an associated verified RTL implementation.
seL4's verification is described here:
https://www.sigops.org/sosp/sosp09/papers/klein-sosp09.pdf
They used Haskell to implement an "intermediate layer"
between OS implementers and formal methods practitioners.
The *generated* Isabelle scripts weigh in at 200K lines so
I don't yet know what to think of this. There are a lot
of assumptions made and crucially, for an OS kernel, they
sidestep concurrency & non-determinism verification by
using an event based model to avoid dealing with async.
interrupts. [On the other hand, this is a 2009 paper and
more work may have been done since to improve things].
> Generally speaking, a formal specification:
>
> 1) Must be machine readable
> 2) The semantics of the underlying specification language must
> themselves be extremely precisely described. You can't prove the
> consistency and completeness of the underlying system (see Gödel)
> but you _can_ still strongly specify the language.
> 3) Must be usable for formal (that is, machine checkable) proofs that
> implementations comply with the spec, so it has to be sufficiently
> powerful. Typical systems are now equivalent to higher order logic.
...
> In recent years, I've noted that "old timers" (such as many of us,
> myself included) seem to be unaware of the fact that systems like Coq
> exist, or that it is now relatively (I emphasize _relatively_) routine
> for substantial systems to be fully formally specified and then fully
> formally verified.
This is good to know.
But I still think these are merely different levels of
abstraction. The issue then is how to prove a specification
at a given level correct and how to prove that the mapping
to a more concrete level implements the same semantics.
I also wonder about the "_relatively_ routine" application
of formal verification. Maybe for the Intels of the world
but certainly not for 99.99..9% of software. Though hopefully
it *will* become more common.
Thanks to Hellwig Geisse, Noel Chiappa, Steve Johnson, &
especially Perry Metzger for their replies.
Bakul
[1]
Aside: I sort of stumbled upon this area decades ago when
I picked up a stack of tech. reports Prof. Per Brinch-Hansen
had thrown out. It had among other things Bjørner's tutorial
on META-IV and Sussman & Steele's orig. Scheme report. These
led me to denotational semantics via Milne & Strachey's
Theory of Programming Language Semantics book.
I've forwarded this e-mail from the TUHS list. Feel free to respond
to the thread here.
Thanks, Warren
----- Forwarded message from "Perry E. Metzger" <perry(a)piermont.com> -----
Date: Sun, 19 Aug 2018 20:57:58 -0400
From: "Perry E. Metzger" <perry(a)piermont.com>
To: George Michaelson <ggm(a)algebras.org>
Cc: TUHS main list <tuhs(a)minnie.tuhs.org>
Subject: Re: [TUHS] Formal Specification and Verification (was Re: TUHS Digest,
Vol 33, Issue 5)
On Mon, 20 Aug 2018 09:47:58 +1000 George Michaelson
<ggm(a)algebras.org> wrote:
> Witness to this, the interest in *trying* to apply Coq to Ethereum
> smart contracts...
The Tezos cryptocurrency was created with a smart contract language
specifically designed for formal verification.
> I think Perry wrote something of long term value. I encourage you to
> write this up somewhere you'd be willing to have published like a
> tech blog.
Not sure where one would publish it.
I also note that the general response here was the one I almost always
get when I mention this stuff to people, which is near silence. (I
used to get exactly this response 30 or 35 years ago when explaining
the Internet to most people who had not yet come into contact with it,
so I suppose it's not overly surprising.)
That said, let me make a strong prediction: in not very many years,
pretty much everyone doing serious work in computer science (say,
designing security protocols, building mission critical systems, etc.)
will be building them using some sort of formal verification assistant
systems. I suspect this change will be as transformative as the
triumph of high level languages over the previous supremacy of machine
language coding. (For those that forget, one of the things that made
Unix successful was, of course, that unlike many other OSes of the
era, it was (ultimately) written in a portable language and not in
machine code.)
People who don't use formal verification when writing serious code
will seem as antiquated and irresponsible as people who build mission
critical systems now without test coverage. This is going to be a big
revolution.
Perry
> -G
>
> On Sun, Aug 19, 2018 at 5:57 AM, Perry E. Metzger
> <perry(a)piermont.com> wrote:
> > Sorry for the thread necromancy, but this is a _very_ important
> > topic. Perhaps it doesn't belong on tuhs but rather on coff.
> >
> > This is a pretty long posting. If you don't care to read it, the
> > TL;DR is that formal specification and verification is now a real
> > discipline, which it wasn't in the old days, and there are
> > systems to do it in, and it's well understood.
> >
> > On 2018-08-06 at 08:52 -0700, Bakul Shah wrote:
> >>
> >> What counts as a "formal spec"? Is it like Justice Potter
> >> Stewart's "I know it when I see it" definition or something
> >> better?
> >
> > At this point, we have a good definition. A formal specification
> > is a description of the behavior of a program or piece of
> > hardware in a precise machine-readable form that may be used as
> > the basis for fully formal verification of the behavior of an
> > implementation. Often these days, the specification is given in a
> > formal logic, such as the predicative calculus of inductive
> > constructions, which is the logic underlying the Coq system.
> >
> > Isabelle/HOL is another popular system for this sort of work.
> > ACL2 is (from what I can tell) of more historical interest but it
> > has apparently been used for things like the formal verification
> > of floating point and cache control units for real hardware. (It
> > is my understanding that it has been many years since Intel would
> > dare release a system where the cache unit wasn't verified, and
> > the only time in decades it tried to release a non-verified FPU,
> > it got the FDIV bug and has never tried that again.) There are
> > some others out there, like F*, Lean, etc.
> >
> > Formal specifications good enough for full formal verification
> > have been made for a variety of artifacts along with proofs that
> > the artifacts follow the specification. There's a fully formally
> > verified C compiler called CompCert for example, based on an
> > operational semantics written in Coq. There's another formal
> > semantics for C written in K, which is a rather different formal
> > system. There's a verified microkernel, seL4, whose specification
> > is written in Isabelle/HOL. There's a fully formal specification
> > of the RISC V, and an associated verified RTL implementation.
> >
> > Generally speaking, a formal specification:
> >
> > 1) Must be machine readable
> > 2) The semantics of the underlying specification language must
> > themselves be extremely precisely described. You can't prove
> > the consistency and completeness of the underlying system (see
> > Gödel) but you _can_ still strongly specify the language.
> > 3) Must be usable for formal (that is, machine checkable) proofs
> > that implementations comply with the spec, so it has to be
> > sufficiently powerful. Typical systems are now equivalent to
> > higher order logic.
> >
> > From: "Hellwig Geisse" <hellwig.geisse(a)mni.thm.de>
> > Sent: Mon, 06 Aug 2018 18:30:30 +0200
> >>
> >> For me, a "formal spec" should serve two goals:
> >> 1) You can reason about the thing that is specified.
> >
> > Yes.
> >
> >> 2) The spec can be "executed" (i.e., there is an
> >> interpreting mechanism, which lets the spec behave
> >> like the real thing).
> >
> > Not always reasonable.
> >
> > First, it is often the case that a spec does not describe
> > execution at all. See, for example, the specification of a
> > sorting function I give at the end of this message: it simply
> > says "a sorting function is a function such that, for all inputs,
> > the return is a non-decreasing permutation of the input". This is
> > not executable. It is a purely descriptive property, and you
> > cannot extract an executable algorithm from the spec.
> >
> > Second, even when a spec amounts to a description of execution, a
> > proof assistant often cannot actually execute it. For example,
> > although you can reason about non-terminating execution in pCiC
> > (and thus Coq), programs written in a strongly normalizing lambda
> > calculus that can be used as a logic must terminate, or functions
> > that did not terminate would be inhabitants of all types and the
> > logic would be inconsistent. Thus, you cannot execute a program
> > with infinite loops inside Coq, although you can reason about
> > them (and indeed, you can reason about coinductively defined
> > objects like infinite execution traces.)
> >
> > On Mon, 06 Aug 2018 14:19:31 -0700 "Steve Johnson"
> > <scj(a)yaccman.com> wrote:
> >> I take a somewhat more relaxed view of what a spec should be:
> >> It should describe a program with enough completeness that a
> >> competent programmer could write it from the spec alone.
> >
> > I think this is a bit more relaxed than is currently accepted.
> >
> >> The formal systems I have seen would roll over and die when
> >> presented with even a simple compiler.
> >
> > I don't know what this means. If it is that there aren't
> > implementations of languages like pCiC, that's not true, see Coq.
> > If it means no one can formally specify a compiler, given that
> > formally verified compilers exist, that's also not true.
> >
> > The "final theorem" proving the correctness of CompCert depends on
> > having an operational semantics of both C and the target
> > architecture, and says (more or less) that the observed behavior
> > of the input program in C is the same as the observed behavior of
> > the output program (say in ARM machine language). This is a
> > serious piece of work, but it is also something that has actually
> > been done -- the tools are capable of the task.
> >
> >> Additionally, being able to specify that a particular function be
> >> carried out by a heapsort, for example, would require that the
> >> formalism could describe the heapsort and prove it correct.
> >> These don't grow on trees...
> >
> > Formally verifying a couple of sorting algorithms described in
> > Coq is an exercise for an intro level class on formal
> > verification. I've done it. Once you have the proper primitives
> > described, the specification for a sorting algorithm in Coq looks
> > like this:
> >
> > Definition is_a_sorting_algorithm (f: list nat -> list nat) :=
> > forall al, Permutation al (f al) /\ sorted (f al).
> >
> > That says "the property "is_a_sorting_algorithm" over a function
> > from lists of natural numbers to lists of natural numbers is that
> > the output is a permutation of the input in which all the
> > elements are in non-decreasing order." The definitions in
> > question are very precise. For example, one definition of sorted
> > (the property of being a non-decreasing list) is:
> >
> > Definition sorted (al: list nat) :=
> > forall i j, i < j < length al -> nth i al 0 <= nth j al 0.
> >
> > and the property of being a permutation, which is a relatively
> > complicated inductively defined property, is:
> >
> > Inductive Permutation {A : Type} : list A -> list A -> Prop :=
> > perm_nil : Permutation [] []
> > | perm_skip : forall (x : A) (l l' : list A),
> > Permutation l l' ->
> > Permutation (x :: l) (x :: l')
> > | perm_swap : forall (x y : A) (l : list A),
> > Permutation (y :: x :: l) (x :: y :: l)
> > | perm_trans : forall l l' l'' : list A,
> > Permutation l l' ->
> > Permutation l' l'' ->
> > Permutation l l''.
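For contrast with the proof-level statement above, the same property can
be transcribed as a runtime check. A minimal Python sketch (the helper
names are my own, not from any Coq development; unlike the Coq statement,
this only tests finitely many inputs and proves nothing):

```python
# Runtime analogue of the Coq spec above (a check, not a proof):
# a sorting function must return a non-decreasing permutation of its input.
from collections import Counter

def is_sorted(al):
    # mirrors: forall i j, i < j < length al -> nth i al 0 <= nth j al 0
    return all(al[i] <= al[j]
               for i in range(len(al)) for j in range(i + 1, len(al)))

def is_permutation(al, bl):
    # multiset equality stands in for the inductive Permutation relation
    return Counter(al) == Counter(bl)

def satisfies_sorting_spec(f, inputs):
    # checks the property only on the test inputs supplied;
    # the Coq version quantifies over *all* lists
    return all(is_permutation(al, f(al)) and is_sorted(f(al))
               for al in inputs)

print(satisfies_sorting_spec(sorted, [[3, 1, 2], [], [5, 5, 1]]))  # True
```

The gap between this and the Coq version is exactly Perry's point: a test
samples the property, a proof establishes it for every input.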
> >
> > Coq starts out with basically nothing defined, by the way. Notions
> > such as "natural number" and "list" are not built in. Peano
> > naturals are defined in the system thusly:
> >
> > Inductive nat : Type :=
> > | O : nat
> > | S : nat -> nat.
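That inductive style carries over almost mechanically to ordinary
languages. A Python sketch of the same Peano naturals (my own
transcription, for illustration only):

```python
# Peano naturals, mirroring Coq's:  Inductive nat := O | S (n : nat).
class O:
    """Zero: the base constructor."""
    pass

class S:
    """Successor: wraps another natural."""
    def __init__(self, n):
        self.n = n

def to_int(n):
    # count the S constructors wrapped around O
    return 0 if isinstance(n, O) else 1 + to_int(n.n)

def add(a, b):
    # structural recursion on the first argument, as in Coq's plus
    return b if isinstance(a, O) else S(add(a.n, b))

three = S(S(S(O())))
print(to_int(add(three, S(O()))))  # 4
```

The difference, of course, is that Coq lets you then *prove* facts such as
the commutativity of add by induction over this structure.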
> >
> > The underlying basis of Coq (i.e. the Predicative Calculus of
> > Inductive Constructions) is a dependently typed lambda calculus
> > that's astonishingly simple, and the checker for proofs in the
> > system is only a few hundred lines long -- the checker is the
> > only portion of the system which needs to be trusted.
> >
> > In recent years, I've noted that "old timers" (such as many of us,
> > myself included) seem to be unaware of the fact that systems like
> > Coq exist, or that it is now relatively (I emphasize
> > _relatively_) routine for substantial systems to be fully
> > formally specified and then fully formally verified.
> >
> >
> > Perry
> > --
> > Perry E. Metzger perry(a)piermont.com
>
--
Perry E. Metzger perry(a)piermont.com
----- End forwarded message -----
Forty years ago Motorola 680x0 CPUs powered most good Unix boxen, with
the exception of this upstart SPARC thing. And then they were gone.
I'm trying to remember why. Can anybody help me? I recall claims
that Moto didn't put enough effort into development, but was this
primarily a technical or a commercial issue?
Greg
--
Sent from my desktop computer.
Finger grog(a)lemis.com for PGP public key.
See complete headers for address and phone numbers.
This message is digitally signed. If your Microsoft mail program
reports problems, please read http://lemis.com/broken-MUA
On Tuesday, 7 August 2018 at 10:51:12 +1000, Dave Horsfall wrote:
> ...; you won't believe the number of "history" sites I've seen that
> directly contradict each other (even when taking UTC/US/local time
> into account, and I try and use local time wherever possible).
To further confuse the issue? With UTC we know where we are.
Greg
[Moved from TUHS to COFF]
On 2018-Aug-02 08:44:56 -0400, Doug McIlroy <doug(a)cs.dartmouth.edu> wrote:
>My collection of early computer manuals includes Brinch Hansen's manual
>for the RC 4000, which stands out for its precise description of the
>CPU logic--in Algol 60! It's the only manual I have seen that offers a
>good-to-the-last-bit formal description of the hardware.
The book "A Programming Language" by Kenneth Iverson included a formal
description of the IBM 7090 in Iverson Notation (now APL). I believe that
is the first formal description of any computer. The success of that led
IBM to include a formal description of the System/360 architecture in the
IBM Systems Journal issue introducing the S/360.
I've been told that IBM has since regretted that decision since it opened
the way for other manufacturers to clone the S/360.
--
Peter Jeremy
One of the first digital computers in the world[*], the Harvard Mk I, was
dedicated on this day in 1944; it was an enormous electromechanical beast.
[*]
Please, no discussion on what constitutes a "digital computer" as we've
pretty much done that to death; it all comes down to a matter of
definition to suit one's agenda.
-- Dave