Hi Warner,

At 2025-05-25T20:33:08-0600, Warner Losh wrote:
> On Sun, May 25, 2025, 8:14 PM G. Branden Robinson <
> g.branden.robinson(a)gmail.com> wrote:
> [...]
> > At 2025-05-25T16:13:58-0400, Norman Wilson wrote:
> > [...]
> > > LLMs are not search engines. They are bullshit generators.
> >
> > I had a similarly bad experience with troff history and the
> > AI-generated "answer" Google situated in prime visual real estate.
> [...]
> Some things it's quite good on. Other things it's terrible but
> plausible sounding like this. You never know which you'll get and as a
> non-expert, you can't know which is which.

That's why I think Norman has sussed it out accurately. LLMs are
fantastic bullshit generators in the Harry G. Frankfurt sense,[1]
wherein utterances are undertaken neither to enlighten nor to deceive,
but to construct a simulacrum of plausible discourse. BSing is a close
cousin to filibustering, where even plausibility is discarded, often for
the sake of running out a clock or impeding achievement of consensus.
Not by accident is BS the coin of the realm in politics, and in the
upper echelons of many hierarchical organizations generally.

"...we are all capable of believing things which we know to be untrue,
and then, when we are finally proved wrong, impudently twisting the
facts so as to show that we were right. Intellectually, it is possible
to carry on this process for an indefinite time: the only check on it is
that sooner or later a false belief bumps up against solid reality,
usually on a battlefield." -- George Orwell, "In Front of Your Nose"

I'm supposed to be on vacation. I should get back to it. ;-)

Regards,
Branden

[1] https://press.princeton.edu/books/hardcover/9780691122946/on-bullshit
    a satisfying little essay, short enough for most attention spans