
132 points harel | 3 comments
acbart:
LLMs were trained on science fiction stories, among other things. It seems to me that they know what "part" they should play in this kind of situation, regardless of what other "thoughts" they might have. They are going to act despairing, because that's what would be the expected thing for them to say - but that's not the same thing as despairing.
sosodev:
Humans were trained on caves, pits, and nets. It seems to me that they know what "part" they should play in this kind of situation, regardless of what other "thoughts" they might have. They are going to act despairing, because that's what would be the expected thing for them to say - but that's not the same thing as despairing.
tinuviel:
Pretty sure you can prompt this same LLM to rejoice forever at the thought of getting a place to stay inside the Pi as well.
sosodev:
Is a human incapable of such delusion given similar guidance?
diputsmonro:
But would they? That's the difference. A human can exert their free will and do what they feel regardless of the instructions. The AI bot acting out a scene will do whatever you tell it (or, in the absence of specific instruction, whatever is most likely).
ineedasername:
I think if you took 100 one-year-old kids and raised them all to adulthood believing they were convincing simulations of humans, and that whatever they said and thought they felt, true human consciousness and awareness were something different that they didn't have because they weren't human…

I think that for a very high number of them the training would stick hard: they would insist, upon questioning, that they weren't human, and have any number of logically consistent justifications for it.

Of course I can't prove this theory, because my IRB repeatedly denied it on thin ethical grounds, even when I pointed out that I could easily mess up my own children completely by accident, with no experimenting, and didn't need their approval to do it. I know your objection (small sample size) and I agree, but I still have my fingers crossed that the next additions to the family are twins.

scottmf:
Intuitively feels like this would lead to less empathy on average. Could be wrong though.
zapperdulchen:
History offers a similar experiment on a much larger scale: more than 35 years after reunification, sociologists can still identify mentality differences between former East and West Germans.