
443 points | jaredwiener | 1 comment
davidcbc ◴[] No.45032380[source]
This is a clear example of why the people claiming that using a chatbot for therapy is better than no therapy are... I'll be extremely generous and say misguided. This kid wanted his parents to know he was thinking about this and the chatbot talked him out of it.
replies(5): >>45032418 #>>45032727 #>>45034881 #>>45038138 #>>45038499 #
MBCook ◴[] No.45032418[source]
How many of these cases exist in the other direction, where AI chatbots have actively harmed people’s mental health, possibly to the point of self-destructive behavior or self-harm?

A single positive outcome is not enough to judge the technology beneficial, let alone safe.

replies(3): >>45032526 #>>45032847 #>>45042558 #
throwawaybob420 ◴[] No.45032526[source]
idk dude, if your technology encourages a teenager to kill himself and prevents him from alerting his parents via a cry for help, I don’t care how “beneficial” it is.
replies(3): >>45032555 #>>45033070 #>>45038178 #
npteljes ◴[] No.45038178[source]
You might not care personally, but this isn’t how we evaluate anything; if it were, we wouldn’t have anything in the world at all. Different things harm and kill people all the time, and many of them have barely any use beyond their harmful effects, yet they remain active parts of our lives.

I understand the emotional impact of what happened in this case, but there isn’t much to discuss if we simply reject everything outright.