
132 points by harel | 2 comments
acbart ◴[] No.45397001[source]
LLMs were trained on science fiction stories, among other things. It seems to me that they know what "part" they should play in this kind of situation, regardless of what other "thoughts" they might have. They are going to act despairing, because that's the expected thing for them to say - but that's not the same thing as despairing.
replies(11): >>45397113 #>>45397305 #>>45397413 #>>45397529 #>>45397801 #>>45397859 #>>45397960 #>>45398189 #>>45399621 #>>45400285 #>>45401167 #
sosodev ◴[] No.45397413[source]
Humans were trained on caves, pits, and nets. It seems to me that they know what "part" they should play in this kind of situation, regardless of what other "thoughts" they might have. They are going to act despairing, because that's the expected thing for them to say - but that's not the same thing as despairing.
replies(3): >>45397471 #>>45397476 #>>45403029 #
idiotsecant ◴[] No.45397476[source]
That's silly. I can get an LLM to describe what chocolate tastes like too. Is it tasting it? LLMs are pattern matching engines; they do not have an experience. At least not yet.
replies(3): >>45397569 #>>45398094 #>>45398138 #
d1sxeyes ◴[] No.45398138[source]
When you describe the taste of chocolate, unless you are actually eating chocolate at that moment, you are relying on the activation of synapses in your brain to reproduce the “taste” of chocolate in order to describe it. For humans, the only way to learn how to activate those synapses is to have the experience. LLMs, by contrast, can have those “memories” copied and pasted in.

I would be cautious of dismissing LLMs as “pattern matching engines” until we are certain we are not.

replies(2): >>45399359 #>>45412498 #
1. idiotsecant ◴[] No.45412498[source]
The difference is that I had a basic experience of that chocolate. The LLM is a corpus of text describing other people's experience of chocolate through the medium of written language, which involves abstraction and is lossy. So only one of us experienced it; the other heard about it over the telephone. Multiply that by every other interaction with the outside world and you have a system that is very good at modelling telephone conversations, but that's about it.
replies(1): >>45424532 #
2. d1sxeyes ◴[] No.45424532[source]
Arguably, your memories are also lossily encoded abstractions of an experience, and recalling the taste of chocolate is a similar “telephone conversation”.