
An LLM is a lossy encyclopedia

(simonwillison.net)
509 points by tosh | 1 comment

(the referenced HN thread starts at https://news.ycombinator.com/item?id=45060519)
1. sp1982 No.45103384
I did a similar experiment and found that GPT-5 hallucinates up to 20% of the time in domains like cricket stats, where there is too much info to memorize. Interestingly, though, the mini version refuses to answer most of the time, which is a better approach imho. https://kaamvaam.com/machine-learning-ai/llm-eval-hallucinat...
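
A minimal sketch of how such a hallucination eval might be scored, counting refusals separately from wrong answers (the `ask_model` hook, the refusal markers, and the tiny dataset are hypothetical stand-ins, not the commenter's actual harness):

```python
# Hypothetical hallucination eval: compare model answers against known
# ground-truth facts and count refusals separately from hallucinations.

def classify_answer(answer: str, expected: str) -> str:
    """Label a model answer as correct, refusal, or hallucination."""
    refusal_markers = ("i don't know", "i'm not sure", "cannot verify")
    normalized = answer.strip().lower()
    if any(marker in normalized for marker in refusal_markers):
        return "refusal"
    return "correct" if expected.lower() in normalized else "hallucination"

def run_eval(ask_model, dataset):
    """dataset: list of (question, expected_answer) pairs; ask_model: str -> str."""
    counts = {"correct": 0, "refusal": 0, "hallucination": 0}
    for question, expected in dataset:
        counts[classify_answer(ask_model(question), expected)] += 1
    total = sum(counts.values())
    return {label: n / total for label, n in counts.items()}

if __name__ == "__main__":
    # Toy ground-truth facts standing in for long-tail cricket stats.
    sample = [
        ("How many Test centuries did Sachin Tendulkar score?", "51"),
        ("Who took 800 Test wickets?", "Muttiah Muralitharan"),
    ]
    fake_model = lambda q: "I'm not sure."  # placeholder for a real model call
    print(run_eval(fake_model, sample))
```

Whether a refusal counts as a "better" outcome than a hallucination is a scoring choice; reporting the two rates separately, as above, lets you compare models on both axes.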