> hallucinations aren’t a bug of LLMs, they are a feature. Indeed they are the feature. All an LLM does is produce hallucinations, it’s just that we find some of them useful.
Nice.
replies(7):
This is different from human hallucinations, where the mind makes something up because something has gone wrong with it, rather than because of some underlying issue with the brain's architecture.
It's possible LLMs are lying, but my guess is that they really just can't tell the difference between what's true and what they've made up.