
371 points ulrischa | 8 comments
1. throwaway314155 No.43235006
> The real risk from using LLMs for code is that they’ll make mistakes that aren’t instantly caught by the language compiler or interpreter. And these happen all the time!

Are these not considered hallucinations still?
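
To make the quoted distinction concrete, here's a hypothetical sketch (not from the article) of the kind of mistake the interpreter happily accepts, yet the answer is always wrong:

    from datetime import datetime

    def days_between(a: datetime, b: datetime) -> int:
        # Runs without complaint, but .seconds is only the leftover
        # 0-86399 component of the timedelta, so this always returns 0.
        # The correct field is (b - a).days.
        return (b - a).seconds // 86400

    print(days_between(datetime(2024, 1, 1), datetime(2024, 3, 1)))  # 0, not 60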

replies(3): >>43235072 #>>43235140 #>>43237891 #
2. dzaima No.43235072
Humans can hallucinate up some API they want to call in the same way that LLMs can, but you don't call all human mistakes hallucinations; classifying everything LLMs do wrong as hallucinations would seem rather pointless to me.
replies(2): >>43235190 #>>43235270 #
3. fweimer No.43235140
I don't think it's necessarily a hallucination if models accurately reproduce the code quality of their training data.
4. ForTheKidz No.43235190
Maybe we should stop referring to undesired output (confabulation? Bullshit? Making stuff up? Creativity?) as some kind of input delusion. Hallucination is already a meaningful word and this is just gibberish in that context.

As best I can tell, the only reason this term stuck is because early image generation looked super trippy.

5. thylacine222 No.43235270
Analogizing this to human hallucination is silly. In the instance you're talking about, the human isn't hallucinating, they're lying.
replies(1): >>43235333 #
6. dzaima No.43235333
I definitely wouldn't say I'm lying (...to... myself? what? or perhaps to others, for a quick untested response in a chatroom or something) whenever I write some code and it turns out that I misremembered the name of an API. "Hallucination" for that might be over-dramatic, but at least it's a somewhat sensible description.
7. simonw No.43237891
I think of hallucinations as instances where an LLM invents something that is entirely untrue - like a class or method that doesn't exist, or a fact about the world that simply isn't true.

I guess you could call bugs in LLM code "hallucinations", but they feel like a slightly different thing to me.
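
For example (a hypothetical snippet, with the method name invented to illustrate), a hallucinated API fails the instant it runs:

    import datetime

    # Hallucinated method: datetime objects have no to_unix(), so this
    # line raises AttributeError immediately - the interpreter catches it.
    ts = datetime.datetime.now().to_unix()

    # The real API is .timestamp():
    ts = datetime.datetime.now().timestamp()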

replies(1): >>43240004 #
8. throwaway314155 No.43240004
That's a great distinction actually. Thanks