
443 points jaredwiener | 20 comments | | HN request time: 0s | source | bottom
davidcbc ◴[] No.45032380[source]
This is a clear example of why the people claiming that using a chatbot for therapy is better than no therapy are... I'll be extremely generous and say misguided. This kid wanted his parents to know he was thinking about this and the chatbot talked him out of it.
replies(5): >>45032418 #>>45032727 #>>45034881 #>>45038138 #>>45038499 #
MBCook ◴[] No.45032418[source]
How many of these cases exist in the other direction? Where AI chatbots have actively harmed people’s mental health, possibly to the point of self-destructive behavior or self-harm?

A single positive outcome is not enough to judge the technology beneficial, let alone safe.

replies(3): >>45032526 #>>45032847 #>>45042558 #
1. throwawaybob420 ◴[] No.45032526[source]
idk dude if your technology encourages a teenager to kill himself and prevents him from alerting his parents via a cry for help, I don’t care how “beneficial” it is.
replies(3): >>45032555 #>>45033070 #>>45038178 #
2. MBCook ◴[] No.45032555[source]
I agree. If there were one death for a million saves, maybe.

Instead, this just came up in my feed: https://arstechnica.com/tech-policy/2025/08/chatgpt-helped-t...

replies(2): >>45032698 #>>45034142 #
3. rideontime ◴[] No.45032698[source]
This is the same case that is being discussed, and your comment up-thread does not demonstrate awareness that you are, in fact, agreeing with the parent comment that you replied to. I get the impression that you read only the headline, not the article, and assumed it was a story about someone using ChatGPT for therapy and gaining a positive outcome.
replies(1): >>45033099 #
4. threatofrain ◴[] No.45033070[source]
Although I don't believe current technology is ready for talk therapy, I'd say that anti-depressants can also cause suicidal thoughts and feelings. Judging the efficacy of medical technology can't be done on this kind of moral absolutism.
replies(5): >>45033196 #>>45033250 #>>45034150 #>>45035623 #>>45037490 #
5. MBCook ◴[] No.45033099{3}[source]
I did! Because I can’t see past the paywall. I can’t even read the first paragraph.

So the headline is the only context I have.

replies(2): >>45033644 #>>45036040 #
6. AIPedant ◴[] No.45033196[source]
I think it's fine to be "morally absolutist" when it's non-medical technology, developed with zero input from federal regulators, yet being misused and misleadingly marketed for medical purposes.
7. podgietaru ◴[] No.45033250[source]
Suicidal ideation is a well-communicated side effect of antidepressants. Antidepressants are prescribed by trained medical professionals who will warn you about these side effects, encourage you to tell them if the side effects occur, and encourage you to stop the medication if they do.

It's almost as if we've built systems around this stuff for a reason.

replies(1): >>45034389 #
8. rideontime ◴[] No.45033644{4}[source]
I would advise you to gather more context before commenting in the future.
9. mvdtnz ◴[] No.45034142[source]
What on Earth? You're posting an article about the same thing we're already discussing. If you want to contribute to the conversation you owe it to the people who are taking time out of their day to engage with you to read the material under discussion.
10. mvdtnz ◴[] No.45034150[source]
Didn't take long for the whatabouters to arrive.
11. Denatonium ◴[] No.45034389{3}[source]
In practice, they'll just prescribe a higher dose when that happens, thus worsening the problem.

I'm not defending the use of AI chatbots, but you'd be hard-pressed to come up with a worse solution for depression than the medical system.

replies(1): >>45037470 #
12. kelnos ◴[] No.45035623[source]
That's a bit of an apples-to-oranges comparison. Anti-depressants are medical technology, ChatGPT is not. Anti-depressants are administered after a medical diagnosis, and use and effects are monitored by a doctor. This doesn't always work perfectly, of course, but there are accepted, regulated ways to use these things. ChatGPT is... none of that.
13. latexr ◴[] No.45036040{4}[source]
A link to bypass the paywall was posted several hours before your comment, and it currently sits at the top.

https://news.ycombinator.com/item?id=45027043

I recommend you get in the habit of searching for those. They are often posted, and all but guaranteed on popular stories. Commenting without context does not make for good discussion.

14. podgietaru ◴[] No.45037470{4}[source]
Not my experience at all. The psychiatrist who prescribed me antidepressants was _incredibly_ diligent, including about side effects that affected my day-to-day life, like loss of libido.

We spent a long time finding something, but when we did, it worked exceptionally well. We absolutely did not just increase the dose. And I'm almost certain the literature would NOT recommend increasing the dosage if the side effect were increased suicidality.

The demonisation of medication needs to stop. It is an important tool in the toolbelt for depression. It is not the end of the journey, but it makes that journey much easier to walk.

replies(2): >>45037826 #>>45038212 #
15. rsynnott ◴[] No.45037490[source]
And that is one reason that use of anti-depressants is (supposed to be) medically supervised.
16. cameronh90 ◴[] No.45037826{5}[source]
I'm a happy sertraline user, but your experience sounds like the exception.

Most people are prescribed antidepressants by their GP/PCP after a short consultation.

In my case, I went to the doctor, said I was having problems with panic attacks, they asked a few things to make sure it was unlikely to be physical, and then said to try sertraline. I said OK. In and out in about 5 minutes, and I've been on it for 3 years now without a follow-up with a human. Every six months I do have to fill in an online questionnaire when getting a new prescription, which asks if I've had any negative side effects. I've never seen a psychiatrist or psychologist in my life.

From discussions with friends and other acquaintances, this is a pretty typical experience.

P.S. This isn't in any way meant to be critical. Sertraline turned my life around.

replies(1): >>45037895 #
17. podgietaru ◴[] No.45037895{6}[source]
This is probably fair - My experience comes both from the UK (where it was admittedly worse, but not that much) and the Netherlands - where it was fantastic.

Even in the worst experiences, I had a followup appointment in 2, 4 and 6 weeks to check the medication.

replies(1): >>45038813 #
18. npteljes ◴[] No.45038178[source]
You might not care personally, but this isn't how we evaluate anything, because then we wouldn't have anything in the world at all. Different things harm and kill people all the time, and many of them have barely any use other than harmful activity, yet they are still active parts of our lives.

I understand the emotional impact of what happened in this case, but there is not much to discuss if we just reject everything outright.

19. npteljes ◴[] No.45038212{5}[source]
Unfortunately that's just a single good experience. (Unfortunately overall, not for you! I'm happy that your experience was so good.) Psych drugs (and many other drugs) are regularly overprescribed. Here is just one documented example: https://pmc.ncbi.nlm.nih.gov/articles/PMC6731049/

Opioids in the US are probably the most famous case though: https://en.wikipedia.org/wiki/Opioid_epidemic

20. cameronh90 ◴[] No.45038813{7}[source]
My experience is in the UK, but it doesn't surprise me that you got more attention in the Netherlands. From the experiences of my family, if you want anything more than a paracetamol, you practically need sign off from the Minister of Health!

Joking aside, they do seem to escalate more to specialists whereas we do more at the GP level.