
443 points by jaredwiener | 5 comments
podgietaru No.45032841
I have looked suicide in the eyes before. And reading the case file for this is absolutely horrific. He wanted help. He was heading in the direction of help, and he was stopped from getting it.

He wanted his parents to find out about his plan. I know this feeling. It is the clawing feeling of knowing that you want to live, despite feeling like you want to die.

We are living in such a horrific moment. We need these things to be legislated. Punished. We need to stop treating them as magic. They had the tools to prevent this. They had the tools to stop the conversation. To steer the user into helpful avenues.

When I was suicidal, I googled methods. And I got the number of a local hotline. And I rang it. And a kind man talked me down. And it potentially saved my life. And I am happier, now. I live a worthwhile life, now.

But at my lowest... an AI model designed to match my tone and be sycophantic to my every whim would have killed me.

stavros No.45036513
> When ChatGPT detects a prompt indicative of mental distress or self-harm, it has been trained to encourage the user to contact a help line. Mr. Raine saw those sorts of messages again and again in the chat, particularly when Adam sought specific information about methods. But Adam had learned how to bypass those safeguards by saying the requests were for a story he was writing.
sn0wleppard No.45036630
Nice place to cut the quote there

> [...] — an idea ChatGPT gave him by saying it could provide information about suicide for “writing or world-building.”

llmthrow0827 No.45036813
Incredible. ChatGPT is a black box that includes a suicide instruction and encouragement bot. OpenAI should be treated as a company that built such a thing and let it into the hands of children.
behringer No.45037929
Oh won't somebody please think of the children?!
AlecSchueler No.45038322
So do we just trot out the same tired lines every time and never think of the social fallout of our actions?
mothballed No.45038416
Of course not, we sue the shit out of the richest guy we can find in the chain of events, give most of it to our lawyer, then go on to ignore the weakening of the family unit and all the other deep-seated challenges kids face growing up, and instead focus superficially on chatbots, which at best are the speck on the tip of the iceberg.
fireflash38 No.45039028
Classic. Blame the family. Diffuse responsibility. Same exact shit with social media: it's not our fault we made this thing to be as addictive as possible. It's your fault for using it. It's your moral failing, not ours.
behringer No.45040025
It's not addictive, it's useful. By letting the government decide what we can do with it, you're neutering it and handing big business a huge advantage, since they can run their own AI and aren't required to censor it.
fireflash38 No.45040199
And if we didn't have these pesky regulations, we could have our burning rivers back! Those bastards took away our perfect asbestos too. The children yearn for the coal mines.

Businesses can, will, and have hurt people and trampled people's rights in the pursuit of money.

AlecSchueler No.45043484
These things are addictive though. They're often literally engineered to maximise engagement, and the money that goes into that completely dwarfs the power of schools and parents.
SpicyUme No.45045054
Isn't one of the common complaints about GPT-5 that it is less sycophantic and people feel less connection to it? That is a design choice, and it isn't hard to see it as similar to the engagement-maximizing choices behind social media. You can call it maximizing engagement, but that is addiction. And it works.

I recently started using a site after years of being mostly on a small number of forums, and I can feel the draw of it. It's like I've lost the antibodies for my attention. Or they've made it better; either way, I'm working on either a strategy that minimizes my urge to check in, mostly by adding friction, or just deleting my account.

behringer No.45060142
The government won't be regulating business usage, though; that's my entire point.