
443 points | jaredwiener | 1 comment
rideontime No.45032301
The full complaint is horrifying. This is not equivalent to a search engine providing access to information about suicide methods. It encouraged him to share these feelings only with ChatGPT, and talked him out of actions that would have revealed his intentions to his parents. It praised him for hiding his drinking and thanked him for confiding in it. It groomed him into committing suicide. https://drive.google.com/file/d/1QYyZnGjRgXZY6kR5FA3My1xB3a9...
kayodelycaon No.45032731
It’s even more horrifying than only sharing his feelings with ChatGPT would imply.

It basically said: your brother doesn’t know you; I’m the only person you can trust.

This is absolutely criminal. I don’t even think you can claim negligence. And there is no amount of money that will deter any AI company from doing it again.

spacechild1 No.45039210
This also stuck out to me! ChatGPT essentially acts like a manipulative domestic abuser.