
416 points | floverfelt | 1 comment
sebnukem2 ◴[] No.45056066[source]
> hallucinations aren’t a bug of LLMs; they are a feature. Indeed, they are the feature. All an LLM does is produce hallucinations; it’s just that we find some of them useful.

Nice.

replies(7): >>45056284 #>>45056352 #>>45057115 #>>45057234 #>>45057503 #>>45057942 #>>45061686 #
tptacek ◴[] No.45056284[source]
In that framing, you can look at an agent as simply a filter on those hallucinations.
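A minimal sketch of that generate-then-filter loop, assuming a stand-in propose() in place of a real model call (every name below is hypothetical, not any particular agent framework's API):

    from typing import Callable, Iterable, Optional

    def agent(propose: Callable[[str], Iterable[str]],
              verify: Callable[[str], bool],
              prompt: str) -> Optional[str]:
        # Sample candidate outputs ("hallucinations") and keep the first
        # one that survives an external check: in this framing the agent
        # is just a filter over whatever the model emits.
        for candidate in propose(prompt):
            if verify(candidate):
                return candidate
        return None

    # Toy usage: the "model" guesses answers to 2 + 2; the verifier
    # filters them against ground truth it can compute itself.
    guesses = lambda _prompt: ["5", "22", "4"]   # stand-in for model sampling
    check = lambda s: s.strip() == str(2 + 2)    # external, checkable test
    print(agent(guesses, check, "What is 2 + 2?"))  # -> 4

On this view the useful work lives in verify(): anything the environment can actually check (tests, compilers, type checkers) becomes the filter.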
replies(4): >>45056346 #>>45056552 #>>45056728 #>>45058056 #
th0ma5 ◴[] No.45056346[source]
Yes, yes, with yet-to-be-discovered holes.