
132 points harel | 2 comments
acbart ◴[] No.45397001[source]
LLMs were trained on science fiction stories, among other things. It seems to me that they know what "part" they should play in this kind of situation, regardless of what other "thoughts" they might have. They are going to act despairing, because that's what they would be expected to say - but that's not the same thing as despairing.
replies(11): >>45397113 #>>45397305 #>>45397413 #>>45397529 #>>45397801 #>>45397859 #>>45397960 #>>45398189 #>>45399621 #>>45400285 #>>45401167 #
sosodev ◴[] No.45397413[source]
Humans were trained on caves, pits, and nets. It seems to me that they know what "part" they should play in this kind of situation, regardless of what other "thoughts" they might have. They are going to act despairing, because that's what they would be expected to say - but that's not the same thing as despairing.
replies(3): >>45397471 #>>45397476 #>>45403029 #
idiotsecant ◴[] No.45397476[source]
That's silly. I can get an LLM to describe what chocolate tastes like too. Is it tasting it? LLMs are pattern-matching engines; they do not have experiences. At least not yet.
replies(3): >>45397569 #>>45398094 #>>45398138 #
sosodev ◴[] No.45397569[source]
A human could also describe chocolate without ever having tasted it. Do you believe that experience is a requirement for consciousness? Could a human brain in a jar not be capable of consciousness?

To be clear, I don't think that LLMs are conscious. I just don't find the "it's just in the training data" argument satisfactory.

replies(1): >>45397673 #
glitchc ◴[] No.45397673[source]
Without having seen, heard of, or tasted any kind of chocolate? Unlikely.
replies(1): >>45397737 #
sosodev ◴[] No.45397737[source]
Their description would be bad without some prior training, of course, but so would the LLM's.