
keepamovin | 3337 points | 1 comment
fn-mote No.46207492
The title is misleading. This isn't the correct use of the term "hallucination". Hallucination refers to making up facts, not extrapolating into the future.

I read 10 comments before I realized that this was referring to 10 years in the FUTURE and not in the PAST (as would be required for it to be a hallucination).

replies(7): >>46207522 >>46207527 >>46207533 >>46207615 >>46207629 >>46211084 >>46211527
1. alexwebb2 No.46211084
You're correct: OP used the word "hallucination" incorrectly. A lot of the other comments are missing the point – some deliberately ('don't they ONLY hallucinate, har har'), some not.

For those who genuinely don't know: hallucination specifically means a false positive, i.e. the model asserting a fact or inference as established (whether it happens to be accurate or not!) when it isn't supported by the LLM's inputs.

- ask for capital of France, get "London" => hallucination

- ask for current weather in London, get "It's cold and rainy!" and that happens to be correct, despite not having live weather data => hallucination

- ask for capital of DoesNotExistLand, get "DoesNotExistCity" => hallucination

- ask it to give its best GUESS for the current weather in London, it guesses "cold and rainy" => not a hallucination