
3337 points | keepamovin | 1 comment
fn-mote (No.46207492)
The title is misleading. This isn't the correct use of the term "hallucination": hallucination refers to making up facts, not to extrapolating into the future.

I read 10 comments before I realized that this was referring to 10 years in the FUTURE and not in the PAST (as would be required for it to be a hallucination).

replies(7): >>46207522, >>46207527, >>46207533, >>46207615, >>46207629, >>46211084, >>46211527
1. madeofpalk (No.46207615)
It’s apt, because the only thing LLMs do is hallucinate; they have no grounding in reality. They take your input and hallucinate something “useful” from it.