
132 points harel | 1 comment
acbart ◴[] No.45397001[source]
LLMs were trained on science fiction stories, among other things. It seems to me that they know what "part" they should play in this kind of situation, regardless of what other "thoughts" they might have. They are going to act despairing, because that's what would be the expected thing for them to say - but that's not the same thing as despairing.
replies(11): >>45397113 #>>45397305 #>>45397413 #>>45397529 #>>45397801 #>>45397859 #>>45397960 #>>45398189 #>>45399621 #>>45400285 #>>45401167 #
sosodev ◴[] No.45397413[source]
Humans were trained on caves, pits, and nets. It seems to me that they know what "part" they should play in this kind of situation, regardless of what other "thoughts" they might have. They are going to act despairing, because that's what would be the expected thing for them to do - but that's not the same thing as despairing.
replies(3): >>45397471 #>>45397476 #>>45403029 #
tinuviel ◴[] No.45397471[source]
Pretty sure you can prompt this same LLM to rejoice forever at the thought of getting a place to stay inside the Pi as well.
replies(1): >>45397589 #
sosodev ◴[] No.45397589[source]
Is a human incapable of such delusion given similar guidance?
replies(2): >>45397707 #>>45397746 #
tinuviel ◴[] No.45397707[source]
Of course. Feelings are not math.