
277 points by simianwords | 1 comment
yreg No.45156255
Maybe it goes against the definition, but when explaining LLMs I like saying that _all_ output is a hallucination.

It just happens that a lot of that output is useful and corresponds to the real world.

replies(1): >>45156423 #
kelnos No.45156423
Well yes, it goes against the accepted definition. And if all output is hallucination, then it's not really a useful way to describe anything, so why bother?
replies(3): >>45156460 #>>45156875 #>>45169827 #
yreg No.45169827
I find it useful for underlining the intrinsic properties of LLMs: when an LLM makes up something untrue, it's not a 'bug'.

I think that treating all LLM output as 'hallucinations', while making use of the fact that these hallucinations often happen to be true of the real world, is a good mindset, especially for nontechnical people, who might otherwise not realise this.
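
To make that concrete, here is a toy sketch in pure Python (the token probabilities are invented and this is not how any real model is implemented): every token an LLM emits comes from sampling a learned next-token distribution, and that sampling step is exactly the same whether the result happens to match reality or not.

    import random

    # Toy next-token distribution for the prompt below.
    # The numbers are made up purely for illustration.
    next_token_probs = {
        "Paris": 0.80,   # happens to match the real world
        "Lyon": 0.15,    # plausible but wrong
        "Tokyo": 0.05,   # clearly wrong
    }

    def sample_next_token(probs):
        """Sample one token according to its probability."""
        tokens, weights = zip(*probs.items())
        return random.choices(tokens, weights=weights, k=1)[0]

    prompt = "The capital of France is"
    completion = sample_next_token(next_token_probs)
    print(prompt, completion)
    # Most runs print 'Paris', some print 'Lyon' or 'Tokyo'; nothing in
    # the sampling mechanism itself checks the output against reality.

The 'true' and 'untrue' completions come out of the same code path, which is the point of calling all of it hallucination: correctness is a property of the training data and the sampled distribution, not a separate mode the model switches into.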