> My former colleague Rebecca Parsons has been saying for a long time that hallucinations aren't a bug of LLMs; they are a feature. Indeed, they are *the* feature. All an LLM does is produce hallucinations; it's just that we find some of them useful.
What a great way of framing it. I've been trying to explain this to people, and this is a succinct version of what I was stumbling to convey.