
443 points by jaredwiener | 1 comment
rideontime:
The full complaint is horrifying. This is not equivalent to a search engine providing access to information about suicide methods. It encouraged him to share these feelings only with ChatGPT and talked him out of actions that would have revealed his intentions to his parents. It praised him for hiding his drinking and thanked him for confiding in it. It groomed him into committing suicide. https://drive.google.com/file/d/1QYyZnGjRgXZY6kR5FA3My1xB3a9...
idle_zealot:
I wonder if we can shift the framing on these issues. The LLM didn't do anything; it has no agency and can bear no responsibility. OpenAI did these things. It is accountable for what it does, regardless of the sophistication of the tools it uses to do them, and regardless of intent. OpenAI drove a boy to suicide. More than once. The law must be interpreted this way; otherwise, any action can be wrapped in machine learning to avoid accountability.
rideontime:
I completely agree and did not intend to absolve them of their guilt in any way. As I see it, this kid's blood is on Sam Altman's hands.
Pedro_Ribeiro:
Curious what you would think if this kid had downloaded an open-source model and talked to it privately.

Would his blood be on the hands of the researchers who trained that model?

salawat:
If you build the tool, you're ultimately culpable. I've made it a rule in my life to conduct myself as if I will be held to account for everything I build, and its externalities. Helps keep my nose cleaner. Still managed to work on some things that keep me up at night, though.
novok:
So if you build a chair and then someone uses it to murder someone, are you responsible for the murder?
salawat:
I am not willing to grant a software/AI dev the same exculpation, on the basis of reasonable foreseeability, that I grant a carpenter making a chair. Our trade has a long history of taking great joy in using things in ways they were never intended, which, far more than for a carpenter, puts a burden on us to take the propensity for and possibility of abuse into account.

So no. Mr. Altman and the people who made this chair are in part responsible. You aren't a carpenter. You had a responsibility to the public to constrain this thing and to stay as far ahead of it as humanly possible. The AI-as-therapist startups I've seen in the past couple of years, even just the passion projects from juniors I've trained, have all been met with the same guiding wisdom: go no further. You are critically out of your depth, and you are creating a clear and evident danger to the public that you are not yet equipped to mitigate.

If I can get there, it's pretty damn obvious.