An LLM is a lossy encyclopedia

(simonwillison.net)
509 points by tosh | 1 comment

(the referenced HN thread starts at https://news.ycombinator.com/item?id=45060519)
kgeist ◴[] No.45101806[source]
I think an LLM can be used as a kind of lossy encyclopedia, but equating it directly to one isn't entirely accurate. The human mind is also, in a sense, a lossy encyclopedia.

I prefer to think of LLMs as lossy predictors. If you think about it, natural "intelligence" itself can be understood as another type of predictor: you build a world model to anticipate what will happen next so you can plan your actions accordingly and survive.

In the real world, with countless fuzzy factors, no predictor can ever be perfectly lossless. The only real difference, for me, is that LLMs are lossier predictors than human minds (for now). That's all there is to it.

Whatever analogy you use, it comes down to the realization that there's always some lossiness involved, whether you frame it as an encyclopedia or not.

replies(6): >>45102030 #>>45102068 #>>45102070 #>>45102175 #>>45102917 #>>45103645 #
1. NoMoreNicksLeft ◴[] No.45103645[source]
Imagine having the world's most comprehensive encyclopedia at your literal fingertips, 24 hours a day, yet being so lazy that you offload the hard work of thinking to software that pathologically lies to you, and then blindly accepting the non-answers it spits out rather than typing two or three keywords into Wikipedia and skimming the top paragraph.

>I prefer to think of LLMs as lossy predictors.

I've started to call them the Great Filter.

In the latest issue of the comic book, Lex Luthor attempts to exterminate humanity by hacking the LLM and having it inform people that they can hold their breath underwater for 17 hours.