
418 points | floverfelt | 1 comment
sebnukem2 No.45056066
> hallucinations aren’t a bug of LLMs, they are a feature. Indeed, they are the feature. All an LLM does is produce hallucinations; it’s just that we find some of them useful.

Nice.

jama211 No.45061686
Personally, I find that a somewhat reductive way of looking at it.