An LLM is a lossy encyclopedia

(simonwillison.net)
509 points by tosh | 3 comments

(the referenced HN thread starts at https://news.ycombinator.com/item?id=45060519)
latexr No.45101170
A lossy encyclopaedia should be missing information and be obvious about what is missing, not make things up without your knowledge and change its answer every time.
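
A minimal sketch of the “changing the answer every time” part, assuming typical sampling-based decoding (the logits below are made up for illustration): with a temperature above zero, the decoder samples from the token distribution rather than always taking the most likely token, so identical prompts can yield different outputs.

    import math, random

    def sample_token(logits, temperature=0.8):
        # Scale logits by temperature, then softmax into probabilities.
        scaled = [l / temperature for l in logits]
        m = max(scaled)  # subtract the max for numerical stability
        weights = [math.exp(s - m) for s in scaled]
        total = sum(weights)
        probs = [w / total for w in weights]
        # Pick an index in proportion to its probability.
        return random.choices(range(len(logits)), weights=probs, k=1)[0]

    # Hypothetical next-token logits; two candidates are nearly tied,
    # so repeated runs routinely pick different "answers".
    logits = [2.0, 1.9, 0.3]
    print([sample_token(logits) for _ in range(5)])

At temperature 0 the choice becomes greedy and repeatable, but a confidently wrong answer would then simply be repeated every time, which is the “making it up” half of the complaint.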

When you have a lossy piece of media, such as a compressed sound or image file, you can always see the resemblance to the original and note the degradation as it happens. You never take a clear JPEG of a lamp, compress it, and get a clear image of the Milky Way, then reopen it and get a clear image of a pile of dirt.
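
A minimal sketch of that graceful degradation, assuming Pillow is installed; lamp.jpg is a placeholder file name. An aggressive re-encode adds visible artifacts, but the result stays recognisably the same picture, and the process is repeatable:

    from PIL import Image, ImageChops  # pip install pillow

    original = Image.open("lamp.jpg").convert("RGB")
    original.save("lamp_q5.jpg", quality=5)  # aggressive lossy re-encode
    degraded = Image.open("lamp_q5.jpg").convert("RGB")

    # Per-pixel difference: artifacts appear, but the degraded copy stays
    # structurally close to the source rather than becoming another image.
    diff = ImageChops.difference(original, degraded)
    print("max per-channel error:", max(hi for _, hi in diff.getextrema()))

    # Unlike sampled LLM output, the degradation is repeatable: the same
    # encoder with the same settings on the same input yields identical bytes.
    original.save("lamp_q5_again.jpg", quality=5)
    print("deterministic:",
          open("lamp_q5.jpg", "rb").read() == open("lamp_q5_again.jpg", "rb").read())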

Furthermore, an encyclopaedia is something you can reference and learn from without a goal; it lets you peruse information you have no concept of. Not so with LLMs, which you have to query to get an answer.

replies(10): >>45101190 #>>45101267 #>>45101510 #>>45101793 #>>45101924 #>>45102219 #>>45102694 #>>45104357 #>>45108609 #>>45112011 #
mock-possum No.45104357
Yeah, an LLM is an unreliable librarian, if anything.
replies(1): >>45108783 #
1. latexr No.45108783
That’s a much better analogy. You have to specifically ask them for information, and they will happily retrieve it for you, but because they are unreliable they may get you the wrong thing. If you push back, they’ll apologise and try again (librarians try to be helpful), but they might again give you the wrong thing (you never know, because they are unreliable).
replies(1): >>45113740 #
2. vrighter No.45113740
There's a big difference between giving you correct information about the wrong thing and giving you incorrect information about the right thing.

A librarian might bring you the wrong book; that's the former. An LLM does the latter. They are not the same.

replies(1): >>45115190 #
3. latexr No.45115190
Fair. With the unreliable librarian you’d be at an advantage, because you’d immediately see “this is not what I asked for”, which is not the case with LLMs (and hence part of what makes them so problematic).