> hallucinations aren’t a bug of LLMs, they are a feature. Indeed, they are *the* feature. All an LLM does is produce hallucinations; it’s just that we find some of them useful.
Nice.
replies(7):
It implies that some parts of the output aren’t hallucinations, when in reality none of it has any thought behind it.