
277 points | simianwords | 2 comments
fumeux_fume ◴[] No.45149658[source]
I like that OpenAI is drawing a clear line on what “hallucination” means, giving examples, and showing practical steps for addressing them. The post isn’t groundbreaking, but it helps set the tone for how we talk about hallucinations.

What bothers me about the hot takes is the claim that “all models do is hallucinate.” That collapses the distinction entirely. Yes, models are just predicting the next token—but that doesn’t mean all outputs are hallucinations. If that were true, it’d be pointless to even have the term, and it would ignore the fact that some models hallucinate much less than others because of scale, training, and fine-tuning.

That’s why a careful definition matters: not every generation is a hallucination, and having good definitions lets us talk about the real differences.

replies(9): >>45149764 #>>45151155 #>>45152383 #>>45154710 #>>45155176 #>>45156170 #>>45157195 #>>45166309 #>>45184453 #
druskacik ◴[] No.45157195[source]
I like this quote:

'Everything an LLM outputs is a hallucination. It's just that some of those hallucinations are true.'

replies(1): >>45166905 #
1. swores ◴[] No.45166905[source]
To me that seems as pointless as saying "everything a person sees is a hallucination, it's just some of those hallucinations are true". Sure, technically whenever we see anything it's actually our brain interpreting how light bounces off stuff and combining that with the mental models we have of the world to produce an image in our mind of what we're looking at... but if we start calling everything we see a hallucination, there's no longer any purpose in having that word.

So instead of being that pedantic, we decided that "hallucination" applies only when what our brain thinks we see does not match reality, which makes it an actually useful word. Likewise with LLMs: when people talk about hallucinations, part of the definition is that the output is incorrect in some way. If you go with your quote's way of thinking about it, the word loses all purpose and we could just scrap it, since it would mean exactly the same thing as "all LLM output".

replies(1): >>45167773 #
2. 1718627440 ◴[] No.45167773[source]
> everything a person sees is a hallucination, it's just some of those hallucinations are true

Except it's not. People can have hallucinations that happen to be true (dreams), but most perception isn't generated by your brain; it comes from the outside.