> hallucinations aren’t a bug of LLMs; they are a feature. Indeed, they are *the* feature. All an LLM does is produce hallucinations; it’s just that we find some of them useful.
Nice.
replies(7):
So the only real difference between a "perception" and a "hallucination" is whether it is supported by physical reality.