On Wed, Jul 10, 2024 at 9:54 PM John R Levine <johnl@taugh.com> wrote:
> On Wed, 10 Jul 2024, Dan Cross wrote:
>> It's not clear to me why you suggest with such evident authority that
>> Knuth was referring only to serialized instruction emulation and not
>> something like JIT'ed code; true, he doesn't specify one way or the
>> other, but I find it specious to conclude that that implies the
>> technique wasn't already in use, or at least known.
>
> The code on pages 205 to 211 shows an instruction by instruction
> interpreter. I assume Knuth knew about JIT compiling since Lisp
> systems had been doing it since the 1960s, but that's not what this
> section of the book is about.
Sure. But we're trying to date the technique here. My point is that
JITing was well known, simulation was similarly well known, and we know
when work on those books started, so it doesn't seem that odd to me
that combining the two would have been known around that time as well.
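
Just to pin down terms, here is a minimal sketch of my own, in C, for a
made-up three-instruction toy machine (it is not the MIX code from the
book), showing the strict instruction-by-instruction style of
interpreter that section describes. The salient property is that every
guest instruction pays the fetch/decode/dispatch cost each time it
executes:

#include <stdio.h>

/* A made-up three-instruction guest machine, purely for illustration. */
enum { OP_LOADI, OP_ADD, OP_HALT };
struct insn { int op, dst, src, imm; };

/* Strict emulation: fetch, decode, and dispatch every guest
 * instruction each time it is executed. */
static void interpret(const struct insn *prog)
{
    int reg[4] = {0};
    for (int pc = 0; ; pc++) {
        const struct insn *i = &prog[pc];
        switch (i->op) {
        case OP_LOADI: reg[i->dst] = i->imm;        break;
        case OP_ADD:   reg[i->dst] += reg[i->src];  break;
        case OP_HALT:  printf("r0 = %d\n", reg[0]); return;
        }
    }
}

int main(void)
{
    const struct insn prog[] = {
        { OP_LOADI, 0, 0, 2  },  /* r0 = 2   */
        { OP_LOADI, 1, 0, 40 },  /* r1 = 40  */
        { OP_ADD,   0, 1, 0  },  /* r0 += r1 */
        { OP_HALT,  0, 0, 0  },
    };
    interpret(prog);             /* prints "r0 = 42" */
    return 0;
}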
> One of the later volumes of TAOCP was supposed to be about compiling,
> but it seems unlikely he'll have time to write it.
Yes; volumes 5, 6 and 7 are to cover parsing, languages, and compilers
(more or less respectively). Sadly, I suspect you are right that it's
unlikely he will have time to write them.
>>> We've been discussing batch or JIT translation of code which gives
>>> much better performance without a lot of hardware help.
>>
>> JIT'd performance of binary transliteration is certainly going to be
>> _better_ than strict emulation, but it is unlikely to be _as good_ as
>> native code.
>
> Well, sure, except in odd cases like the VAX compiler and reoptimizer
> someone mentioned a few messages back.
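
To be concrete about what batch or JIT translation buys over that, here
is the same made-up toy machine, but with the decode done once, up
front, rather than on every executed instruction. This is only an
illustration: it dispatches through C function pointers instead of
emitting native code, so it understates what a real binary translator
gains, but the shape (translate once, then run) is the point:

#include <stdio.h>

/* The same made-up guest machine as in the interpreter sketch. */
enum { OP_LOADI, OP_ADD, OP_HALT };
struct insn { int op, dst, src, imm; };
struct state { int reg[4]; int halted; };

/* One host routine per guest opcode.  A real translator would emit
 * native code; choosing C function pointers here is just a stand-in
 * for the "translate" step. */
typedef void step_fn(struct state *, const struct insn *);

static void do_loadi(struct state *s, const struct insn *i) { s->reg[i->dst] = i->imm; }
static void do_add(struct state *s, const struct insn *i)   { s->reg[i->dst] += s->reg[i->src]; }
static void do_halt(struct state *s, const struct insn *i)  { (void)i; s->halted = 1; }

int main(void)
{
    const struct insn prog[] = {
        { OP_LOADI, 0, 0, 2  },  /* r0 = 2   */
        { OP_LOADI, 1, 0, 40 },  /* r1 = 40  */
        { OP_ADD,   0, 1, 0  },  /* r0 += r1 */
        { OP_HALT,  0, 0, 0  },
    };
    enum { NPROG = sizeof prog / sizeof prog[0] };

    /* Batch translation pass: the decode cost is paid exactly once. */
    step_fn *translated[NPROG];
    for (int pc = 0; pc < NPROG; pc++) {
        switch (prog[pc].op) {
        case OP_LOADI: translated[pc] = do_loadi; break;
        case OP_ADD:   translated[pc] = do_add;   break;
        default:       translated[pc] = do_halt;  break;
        }
    }

    /* Execution: the per-instruction decode switch is gone; each step
     * is a single indirect call. */
    struct state s = { {0}, 0 };
    for (int pc = 0; !s.halted; pc++)
        translated[pc](&s, &prog[pc]);

    printf("r0 = %d\n", s.reg[0]);   /* prints "r0 = 42" */
    return 0;
}

Even then, the translated program is still banging on a simulated
machine state, which is a big part of why it doesn't catch up to code
compiled for the host directly.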
I think the point about the VAX compiler is that it's an actual
compiler: the VAX MACRO-32 _language_ is treated as a "high-level"
programming language rather than as a macro assembly language. That's
not binary->binary translation; it's source->binary compilation. It's
just that, in this case, the source language happens to look like
assembler for an obsolete computer.
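
To make that distinction concrete, here is a deliberately silly sketch
of my own (nothing like the real MACRO-32 compiler): the input is
source text that merely looks like VAX assembler, and the "register"
names are just identifiers in that source. The toy emits C instead of
native code, purely to show that no VAX binary appears anywhere in the
pipeline:

#include <stdio.h>
#include <string.h>

/* Toy illustration only, nothing like DEC's actual MACRO-32 compiler.
 * The input is source text that happens to look like VAX assembler;
 * "registers" are treated as ordinary variable names.  A real compiler
 * emits native code for its target; this sketch emits C text, purely
 * to show that no VAX binary is involved anywhere. */
static void compile_line(const char *line)
{
    char op[8], a[8], b[8];
    if (sscanf(line, "%7s %7[^,], %7s", op, a, b) != 3) {
        printf("/* could not parse: %s */\n", line);
        return;
    }
    if (strcmp(op, "MOVL") == 0)          /* MOVL src, dst */
        printf("%s = %s;    /* from: %s */\n", b, a, line);
    else if (strcmp(op, "ADDL2") == 0)    /* ADDL2 src, dst */
        printf("%s += %s;   /* from: %s */\n", b, a, line);
    else
        printf("/* unhandled op: %s */\n", line);
}

int main(void)
{
    /* "Source" that merely looks like assembler for a VAX. */
    compile_line("MOVL R1, R2");
    compile_line("ADDL2 R3, R2");
    return 0;
}

The real compiler obviously does far more than that, but the input is
still text, not a VAX image.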
- Dan C.