On Tue, Feb 11, 2020 at 11:03 AM Bakul Shah <bakul(a)bitblocks.com> wrote:
I call it automiscorrect.
I've been known to say something similar. Usually with a #$%^& before
it.
First, it is very easy to mistype on these touch-based interfaces,
and then they miscorrect using too large a vocabulary.
+1, amen brother Shah, amen.
At USC, back when I was a student, they started us off with PL/C, a subset
of PL/I. The PL/C compiler tried its level best to make sense of the
student programs it was given, with error messages such as “PL/C uses
....”. This was confusing to many students as they would do exactly what
PL/C said it used and yet their program didn’t work.
FWIW: I referenced both the PL/C and IBM PL/I compilers in my Quora answer.
In an interactive world, offering a note like Grammarly's underline seems
reasonable to me, because it forces me to explicitly accept the change.
Doing it for me automatically is what I dislike; as you said, on touch
interfaces it's twice as bad.
I remember having a conversation with Doug Cooper when we were all teaching
the intro to CS course and we were getting students turning in
'auto-corrected' code for assignments and wondering why the TAs were not
amused. I had thought that having the compiler tell you what was in error,
and then maybe offer a suggestion, might make sense, but there needed to
be some action on the student's part to accept >>and<< repair the code
before the compiler would produce something that 'ran.'
Anyway, I still think "*Damn Warren's Infernal Machine*" was always well
named.
Clem