I read 10 comments before I realized that this was referring to 10 years in the FUTURE and not in the PAST (as would be required for it to be a hallucination).
The ubiquitous use of "hallucination" I see is merely to mean "something the LLM made up".
As many have said, but it still bears repeating: they're always hallucinating. I'm of the opinion that it's a huge mistake to use "hallucination" to mean "the opposite of getting it right." It's just not that. They're doing the same thing either way.
For those who genuinely don't know – hallucination specifically means asserting a fact or inference (accurate or not!) that isn't supported by the LLM's inputs.
- ask for capital of France, get "London" => hallucination
- ask for current weather in London, get "It's cold and rainy!" and that happens to be correct, despite not having live weather data => hallucination
- ask for capital of DoesNotExistLand, get "DoesNotExistCity" => hallucination
- ask it to give its best GUESS for the current weather in London, and it guesses "cold and rainy" => not a hallucination
omg, the same for me, I was halfway through telling my colleague about the 100% rest kernel ...