
A non-anthropomorphized view of LLMs

(addxorrol.blogspot.com)
475 points by zdw | 1 comment
1. mikewarot No.44488918
I think of LLMs as an alien mind that is force-fed human text and required to guess the next token of that text. It gets zapped whenever it guesses wrong.

This goes on for a trillion trillion tokens, with the alien improving until it can predict the next token better than any human could.
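
Stripped of the metaphor, the "zap" is just a cross-entropy loss on the next-token guess followed by a gradient step. Here is a rough PyTorch-style sketch, assuming `model` is some autoregressive LM that maps token IDs to per-position logits over its vocabulary; the names and shapes are illustrative, not any particular lab's setup:

    import torch
    import torch.nn.functional as F

    def training_step(model, optimizer, token_ids):
        # token_ids: (batch, seq_len) integers; predict position t+1 from positions <= t
        inputs, targets = token_ids[:, :-1], token_ids[:, 1:]
        logits = model(inputs)                    # (batch, seq_len-1, vocab_size)
        loss = F.cross_entropy(                   # the "zap": penalty for bad guesses
            logits.reshape(-1, logits.size(-1)),
            targets.reshape(-1),
        )
        loss.backward()                           # work out how each weight should change
        optimizer.step()                          # nudge the model toward better guesses
        optimizer.zero_grad()
        return loss.item()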

At that point we flash-freeze it and use a copy of it, without giving it any way to learn anything new.
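
The "flash freezing" corresponds to serving a fixed checkpoint with gradients disabled: the deployed copy keeps guessing tokens but never updates its weights from anything it sees. A minimal sketch under the same illustrative assumptions as above:

    import torch

    def frozen_generate(model, token_ids, steps=50):
        model.eval()                              # inference mode: no dropout, no training behavior
        with torch.no_grad():                     # no gradients, so the copy cannot learn
            for _ in range(steps):
                logits = model(token_ids)         # (batch, seq_len, vocab_size)
                next_token = logits[:, -1].argmax(dim=-1, keepdim=True)
                token_ids = torch.cat([token_ids, next_token], dim=-1)
        return token_ids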

--

I see it as a category error to anthropomorphize it. The closest I would get is to think of it as an alien slave that's been lobotomized.