
277 points | simianwords | 1 comment
e3bc54b2 No.45149128
Hallucination is all an LLM does. That is its nature: to hallucinate.

We just happen to find some of these hallucinations useful.

Let's not pretend that hallucination is a byproduct. The usefulness is the byproduct. That is what surprised the original researchers about transformer performance, and that is why the 'Attention Is All You Need' paper remains such a phenomenon.

replies(2): >>45149484 #>>45149500 #
fumeux_fume No.45149484
> Hallucination is all an LLM does.

I wish people who take this stance would seriously reconsider how they define hallucination and how unhelpful it is to conflate hallucination with generation from a probability distribution. I appreciate OpenAI publishing articles like this because, while the parent comment and I may have to agree to disagree on how hallucinations are defined, I can at least appeal to OpenAI's authority to say that such arguments are not only unhelpful but also unsound.
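
For what it's worth, the distinction is easier to see in code. A minimal sketch of next-token sampling (the tokens and logits below are made up for illustration, not from any real model): the model assigns a probability to every candidate token and one is sampled, and whether the resulting sentence is true is a separate question from how it was produced.

    # Toy illustration of "generation from a probability distribution".
    # The logits and candidate tokens are hypothetical, not from a real model.
    import math
    import random

    def softmax(logits):
        # Convert raw scores into a probability distribution.
        exps = [math.exp(x) for x in logits]
        total = sum(exps)
        return [e / total for e in exps]

    # Hypothetical candidates for the next token after "The capital of France is"
    candidates = ["Paris", "Lyon", "Berlin"]
    logits = [3.2, 1.1, 0.4]

    probs = softmax(logits)
    choice = random.choices(candidates, weights=probs, k=1)[0]
    print(choice)  # usually "Paris", but sampling can still produce a wrong token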

replies(1): >>45150021 #
1. Zigurd No.45150021
You're going to get a lot of pushback on the idea of taking the definition of hallucination seriously. Calling fluently stated bunk "hallucination" feels cynical to begin with, and trying to weave a silk purse out of that sow's ear is difficult.