
416 points | floverfelt | 1 comment
sebnukem2 (No.45056066):
> hallucinations aren’t a bug of LLMs, they are a feature. Indeed they are the feature. All an LLM does is produce hallucinations, it’s just that we find some of them useful.

Nice.

anthem2025 (No.45057234):
Isn’t that why people argue against calling them hallucinations?

The term implies that some of the output isn't hallucinated, when in reality none of it has any thought behind it.
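
To make that point concrete, here is a minimal sketch in Python/NumPy of the one step an autoregressive LM repeats for every token. The vocabulary and logit values are made up for illustration and are not taken from any real model:

    import numpy as np

    # The single operation an autoregressive LM repeats: turn logits into a
    # probability distribution and sample one token. There is no separate
    # "factual" code path -- correct answers and hallucinations both come
    # out of this same step.
    rng = np.random.default_rng(0)

    vocab = ["Paris", "London", "Berlin", "Madrid"]
    # Hypothetical logits for the prompt "The capital of France is";
    # illustrative numbers only.
    logits = np.array([4.0, 1.5, 1.0, 0.5])

    def sample_next_token(logits, temperature=1.0):
        # Softmax with temperature, then a single random draw.
        probs = np.exp(logits / temperature)
        probs /= probs.sum()
        return rng.choice(len(probs), p=probs), probs

    idx, probs = sample_next_token(logits)
    print(f"sampled: {vocab[idx]} (p={probs[idx]:.2f})")
    # "Paris" is just the most probable draw; "London" can be sampled by
    # the identical mechanism. Usefulness is a property we assign to the
    # output, not a mode the model switches into.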