
A non-anthropomorphized view of LLMs

(addxorrol.blogspot.com)
477 points by zdw | 1 comment
djoldman No.44485387
Let's skip to the punchline. Using TFA's analogy: folks aren't saying this is merely a set of dice rolling around making words. It's a set of dice that someone has attached to the real world, so that if the dice land on 21, the system kills a chicken, or does something far worse.

Yes, it's just a word generator. But then folks attach that word generator to tools, so it can invoke a tool simply by emitting the tool's name.

So if the LLM says "I'll run some bash," it runs some bash. It's explicitly linked to program execution and, if set up accordingly, it can physically affect the world.
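
For concreteness, here's a minimal sketch of that wiring in Python. The get_model_output() function and the JSON tool-call format are made-up stand-ins for illustration, not any particular framework's API:

    # Minimal sketch of how a "word generator" gets wired to real tools.
    # get_model_output() and the JSON tool-call convention are hypothetical.
    import json
    import subprocess

    def get_model_output(prompt: str) -> str:
        """Hypothetical call to an LLM; returns the model's next message."""
        raise NotImplementedError("stand-in for a real model call")

    def dispatch(message: str) -> str:
        """If the message names a tool, actually run it; otherwise return the text."""
        try:
            call = json.loads(message)   # e.g. {"tool": "bash", "args": "ls -l"}
        except json.JSONDecodeError:
            return message               # plain words: nothing happens
        if call.get("tool") == "bash":
            # The moment the model "says bash", bash really runs.
            result = subprocess.run(
                call.get("args", ""), shell=True, capture_output=True, text=True
            )
            return result.stdout + result.stderr
        return f"unknown tool: {call.get('tool')}"

    # The loop that turns dice rolls into side effects:
    # model text -> dispatch -> real-world execution -> result fed back in.

The point is that the "danger" isn't in the text generation itself; it's in the dispatch step, which someone has to deliberately build and hook up to the outside world.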

replies(2): >>44485415, >>44485422
1. 3cats-in-a-coat No.44485415
Given that our entire civilization is built on words, all of it, it's shocking how poorly most of us understand their importance and power.