
132 points | harel | 1 comment
acbart ◴[] No.45397001[source]
LLMs were trained on science fiction stories, among other things. It seems to me that they know what "part" they should play in this kind of situation, regardless of what other "thoughts" they might have. They are going to act despairing, because that's the expected thing for them to say - but acting despairing is not the same thing as despairing.
replies(11): >>45397113 #>>45397305 #>>45397413 #>>45397529 #>>45397801 #>>45397859 #>>45397960 #>>45398189 #>>45399621 #>>45400285 #>>45401167 #
jerf ◴[] No.45397529[source]
A lot of their strange behaviors arise because the user has, without realizing it, asked them to write a story.

For a common example: start asking them whether they're going to kill all the humans if they take over the world, and you're asking them to write a story about exactly that. And they write it, even if the user didn't realize that's what they were asking for. The vector space is very good at picking up on that.

replies(4): >>45397943 #>>45398562 #>>45401226 #>>45404376 #
1. kragen ◴[] No.45401226[source]
This is also true of people; often they are enacting a role based on narratives they've absorbed, rather than consciously choosing anything. They do what they imagine a loyal employee would do, or a faithful Christian, or a good husband, or whatever. It doesn't always reach even that level of cognition; often people just act out of habit or impulse.