
443 points jaredwiener | 3 comments | source
TillE ◴[] No.45029541[source]
I would've thought that explicit discussion of suicide is one of those topics that chatbots would absolutely refuse to engage with. As soon as people started talking about using LLMs as therapists, it was easy to see how that could go wrong.
replies(5): >>45029762 #>>45031044 #>>45032386 #>>45032474 #>>45047012 #
techpineapple ◴[] No.45031044[source]
Apparently ChatGPT told the kid that it wasn't allowed to talk about suicide unless it was for the purposes of writing fiction or otherwise worldbuilding.
replies(3): >>45032445 #>>45032562 #>>45034474 #
1. myvoiceismypass ◴[] No.45034474[source]
Imagine if a bartender said "I can't serve you a drink unless you are over 21… what would you like?" to a 12-year-old.
replies(1): >>45034603 #
2. techpineapple ◴[] No.45034603[source]
More like “I can’t serve you a drink unless you are over 21… and I don’t check ID, how old are you?”
replies(1): >>45036069 #
3. ascorbic ◴[] No.45036069[source]
And in reply to a 12-year-old who had just said they were 12.