On Mon, 26 May 2025 at 13:20, Jason Bowen <jbowen(a)infinitecactus.com> wrote:
May 26, 2025 11:57:18 Henry Bent <henry.r.bent(a)gmail.com>:
It's like Wikipedia.
No, Wikipedia has (at least historically) human editors who supposedly
have some knowledge of reality and history.
An LLM response is a sequence of tokens predicted, one at a time, from
probabilities learned from its training data. The output may correspond to
ground truth in the real world, but only because the training data happened
to contain that truth.
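
To make that concrete, here is a toy sketch in Python (just a bigram
counter, nowhere near a real LLM, and not anyone's actual implementation)
of what "tokens predicted from probabilities learned from training data"
means: the program can only ever emit continuations its tiny training text
made probable, with no check against reality.

import random
from collections import defaultdict, Counter

# Toy next-token predictor: count which token follows which in a small
# training text, then sample continuations from those counts. Anything it
# emits reflects the counts, not any notion of truth.
training_text = (
    "unix was written at bell labs . "
    "unix was written in c . "
    "plan 9 was written at bell labs ."
)

follows = defaultdict(Counter)
tokens = training_text.split()
for cur, nxt in zip(tokens, tokens[1:]):
    follows[cur][nxt] += 1

def sample_next(token):
    # Pick the next token in proportion to how often it followed `token`
    # in the training text.
    choices, weights = zip(*follows[token].items())
    return random.choices(choices, weights=weights)[0]

out = ["unix"]
for _ in range(6):
    nxt = sample_next(out[-1])
    out.append(nxt)
    if nxt == ".":
        break
print(" ".join(out))

A real model has vastly more parameters and context, but the same point
holds: the output is sampled from learned probabilities, not verified
against the world.
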
Assuming the sources it cites are real works, it seems fine as a search
engine, but the text it outputs should absolutely not be treated as if it
were arrived at by the same means as text written by supposedly
knowledgeable and well-intentioned humans.
An LLM can weigh sources, but it has to be taught to do that. A human can
weigh sources, but they have to be taught to do that, too.
-Henry