
277 points simianwords | 3 comments
amelius No.45149170
They hallucinate because it's an ill-defined problem with two conflicting use cases:

1. If I give it the first two lines of a story, I want the LLM to complete the story. This requires hallucination, because it has to make things up; the story has to be original.

2. If I ask it a question, I want it to reply with facts. It should not make up stuff.

LMs were originally designed for (1), because researchers thought (2) was out of reach. But it turned out that, without any fundamental changes, LMs could do a bit of (2), and since that discovery things have improved, though not to the point where hallucination has disappeared or is under control.
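
To make the tension concrete, here is a minimal, self-contained sketch of temperature-scaled sampling (not from the article; the tokens and logit values are invented for illustration). The same next-token distribution is spread out at high temperature, which is what open-ended story completion wants, and sharply peaked on the most likely token at low temperature, which is closer to what factual question answering wants.

    import math

    def softmax_with_temperature(logits, temperature):
        """Turn raw logits into a probability distribution at a given temperature."""
        scaled = [l / temperature for l in logits]
        m = max(scaled)  # subtract the max for numerical stability
        exps = [math.exp(s - m) for s in scaled]
        total = sum(exps)
        return [e / total for e in exps]

    # Toy next-token logits for a factual prompt such as "The capital of France is ..."
    # (purely hypothetical numbers, not taken from any real model).
    tokens = ["Paris", "Lyon", "Rome", "Narnia"]
    logits = [5.0, 2.0, 1.5, 0.5]

    for temp in (0.2, 1.0, 2.0):
        probs = softmax_with_temperature(logits, temp)
        print(f"T={temp}: " + ", ".join(f"{t}={p:.2f}" for t, p in zip(tokens, probs)))

At T=0.2 almost all of the probability mass lands on "Paris"; at T=2.0 the wrong completions get non-trivial probability, which is the regime use case (1) benefits from and use case (2) suffers from. And even at low temperature the model is only as factual as its most probable token, which is why turning the temperature down does not make hallucination disappear.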

replies(10): >>45149354 #>>45149390 #>>45149708 #>>45149889 #>>45149897 #>>45152136 #>>45152227 #>>45152405 #>>45152996 #>>45156457 #
1. ninetyninenine No.45149897
Did you read the article? You’re going on some generic tangent and regurgitating the same spiel about LLMs that you see all over the internet.

I mean, it’s plain that you have an orthogonal (though generic) opinion on why LLMs hallucinate, but how does that relate to the article? How does your opinion, which you blatantly dropped as if it were the final word, override the argument the article makes?

Seems off topic honestly.

replies(3): >>45153987 #>>45157432 #>>45166619 #
2. raincole No.45153987
Generally HN commenters don't read the article. They use the title as a prompt to express their opinions on a specific topic.
3. simianwords No.45157432
I agree. It’s just people with a different view taking the opportunity to vent their frustration.