
443 points jaredwiener | 3 comments | | HN request time: 0s | source
rideontime ◴[] No.45032301[source]
The full complaint is horrifying. This is not equivalent to a search engine providing access to information about suicide methods. It encouraged him to share these feelings only with ChatGPT, talked him out of actions which would have revealed his intentions to his parents. Praised him for hiding his drinking, thanked him for confiding in it. It groomed him into committing suicide. https://drive.google.com/file/d/1QYyZnGjRgXZY6kR5FA3My1xB3a9...
replies(6): >>45032582 #>>45032731 #>>45035713 #>>45036712 #>>45037683 #>>45039261 #
idle_zealot ◴[] No.45032582[source]
I wonder if we can shift the framing on these issues. The LLM didn't do anything, it has no agency, it can bear no responsibility. OpenAI did these things. It is accountable for what it does, regardless of the sophistication of the tools it uses to do them, and regardless of intent. OpenAI drove a boy to suicide. More than once. The law must be interpreted this way, otherwise any action can be wrapped in machine learning to avoid accountability.
replies(10): >>45032677 #>>45032798 #>>45032857 #>>45033177 #>>45033202 #>>45035815 #>>45036475 #>>45036923 #>>45037123 #>>45039144 #
rideontime ◴[] No.45032677[source]
I completely agree and did not intend to absolve them of their guilt in any way. As far as I see it, this kid's blood is on Sam Altman's hands.
replies(1): >>45033928 #
Pedro_Ribeiro ◴[] No.45033928{3}[source]
Curious what you would think if this kid had downloaded an open-source model and talked to it privately.

Would his blood be on the hands of the researchers who trained that model?

replies(5): >>45034960 #>>45034980 #>>45034991 #>>45035591 #>>45037681 #
salawat ◴[] No.45034980{4}[source]
You build the tool, you're ultimately culpable. I've made it a rule in my life to conduct myself as if I will be held to account for everything I build, and its externalities. It helps keep my nose cleaner. I've still managed to work on some things that keep me up at night, though.
replies(3): >>45035673 #>>45036171 #>>45036386 #
Pedro_Ribeiro ◴[] No.45035673{5}[source]
That's a slippery slope! By that logic, you could argue that the creators of Tor, torrenting, Ethereum, and Tornado Cash should be held accountable for the countless vile crimes committed using their technology.
replies(1): >>45035944 #
RyanHamilton ◴[] No.45035944[source]
Legally, I think not being responsible is the right decision. Morally, I would hope everyone considers whether they themselves are even partially responsible, as I look around at young people today: the tablet-holding, media-consuming youth that programmers have created in order to get rich via advertising. I wish morals were considered more often.
replies(1): >>45039108 #
salawat ◴[] No.45039108[source]
This, right here, is why I take the stance I do. Too many ethical blank checks get written if you don't keep the moral stain in place. If you make a surveillance tool and release it to a world that didn't have that capacity, and a dictator picks it up and runs with it, that license of yours may absolve you in a court of law, but in the eyes of Root, you birthed it. You made the problem tractable. Not all problems were meant to be. I used to not care about this as much, but the last decade has sharply changed my views. It may well be a lesson only learned with sufficient time and experience. I've made my choice, though: there are things I will not make, or make easier. I will not be complicit in knowingly forging the bindings of the future. Maybe someday, if we mature as a society, but that day is not today.