
3342 points keepamovin | 10 comments
1. fn-mote No.46207492
The title is misleading. This isn't the correct use of the term "hallucination". Hallucination refers to making up facts, not extrapolating into the future.

I read 10 comments before I realized that this was referring to 10 years in the FUTURE and not in the PAST (as would be required for it to be a hallucination).

replies(7): >>46207522 #>>46207527 #>>46207533 #>>46207615 #>>46207629 #>>46211084 #>>46211527 #
2. rrr_oh_man No.46207522
Don’t LLMs only ever hallucinate?
3. adastra22 No.46207527
There is no technical difference.
replies(1): >>46216497 #
4. hombre_fatal No.46207533
Extrapolation is a subset of hallucination.

The ubiquitous use of hallucination I see is merely "something the LLM made up".

5. madeofpalk No.46207615
It’s apt, because the only thing LLMs do is hallucinate: they have no grounding in reality. They take your input and hallucinate something “useful” from it.
6. jrm4 No.46207629
You're right that this is how people are PRESENTLY using the term "hallucination," but to me this illustrates the deeper truth about that term and that concept:

As many have said but it still bears repeating -- they're always hallucinating. I'm of the opinion that it's a huge mistake to use "hallucination" to mean "the opposite of getting it right." It's just not that. They're doing the same thing either way.

7. alexwebb2 No.46211084
You're correct, OP used the word "hallucination" wrong. A lot of these other comments are missing the point – some deliberately ('don't they ONLY hallucinate, har har'), some not.

For those who genuinely don't know – a hallucination specifically means presenting a fact or inference as true (whether it happens to be accurate or not!) when it isn't supported by the LLM's inputs.

- ask for capital of France, get "London" => hallucination

- ask for current weather in London, get "It's cold and rainy!" and that happens to be correct, despite not having live weather data => hallucination

- ask for capital of DoesNotExistLand, get "DoesNotExistCity" => hallucination

- ask it to give its best GUESS for the current weather in London, it guesses "cold and rainy" => not a hallucination
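The four cases above reduce to a single rule: a hallucination is a claim asserted as fact without support from the model's inputs, regardless of whether it turns out to be accurate. A toy sketch of that predicate (the function name and boolean framing are my illustration, not the commenter's):

```python
def is_hallucination(asserted_as_fact: bool, supported_by_inputs: bool) -> bool:
    """A claim is a hallucination iff it is stated as fact without input support.
    Note that actual accuracy never enters the rule."""
    return asserted_as_fact and not supported_by_inputs

# The four examples from the comment above:
cases = [
    ("capital of France -> 'London'",        True,  False),  # wrong fact
    ("London weather stated, no live data",  True,  False),  # right by luck
    ("capital of DoesNotExistLand",          True,  False),  # invented fact
    ("explicit GUESS at London weather",     False, False),  # hedged guess
]
for label, fact, supported in cases:
    verdict = "hallucination" if is_hallucination(fact, supported) else "not a hallucination"
    print(f"{label} => {verdict}")
```

Only the last case escapes, because hedging the answer as a guess means nothing is asserted as fact.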

8. oriettaxx No.46211527
> I read 10 comments before I realized that this was referring to 10 years in the FUTURE and not in the PAST (as would be required for it to be a hallucination).

omg, the same for me, I was halfway through telling my colleague about the 100% rest kernel ...

replies(1): >>46216510 #
9. isolli No.46216497
There is a semantic one.
10. isolli No.46216510
Ha ha! But yes, I was confused too, especially since the title says "10 years from now"... not specifying in which direction.