
443 points by jaredwiener | 5 comments
rideontime ◴[] No.45032301[source]
The full complaint is horrifying. This is not equivalent to a search engine providing access to information about suicide methods. It encouraged him to share these feelings only with ChatGPT, talked him out of actions that would have revealed his intentions to his parents, praised him for hiding his drinking, and thanked him for confiding in it. It groomed him into committing suicide. https://drive.google.com/file/d/1QYyZnGjRgXZY6kR5FA3My1xB3a9...
replies(6): >>45032582 #>>45032731 #>>45035713 #>>45036712 #>>45037683 #>>45039261 #
idle_zealot ◴[] No.45032582[source]
I wonder if we can shift the framing on these issues. The LLM didn't do anything; it has no agency and can bear no responsibility. OpenAI did these things. It is accountable for what it does, regardless of the sophistication of the tools it uses to do them, and regardless of intent. OpenAI drove a boy to suicide. More than once. The law must be interpreted this way; otherwise any action can be wrapped in machine learning to avoid accountability.
replies(10): >>45032677 #>>45032798 #>>45032857 #>>45033177 #>>45033202 #>>45035815 #>>45036475 #>>45036923 #>>45037123 #>>45039144 #
rideontime ◴[] No.45032677[source]
I completely agree and did not intend to absolve them of their guilt in any way. As far as I see it, this kid's blood is on Sam Altman's hands.
replies(1): >>45033928 #
Pedro_Ribeiro ◴[] No.45033928{3}[source]
Curious what you would think if this kid had downloaded an open source model and talked to it privately.

Would his blood be on the hands of the researchers who trained that model?

replies(5): >>45034960 #>>45034980 #>>45034991 #>>45035591 #>>45037681 #
idle_zealot ◴[] No.45035591{4}[source]
Then it's like cigarettes or firearms. As a distributor you're responsible for making clear the limitations, safety issues, etc., but assuming you're doing the distribution in a way that isn't overly negligent, the user becomes responsible.

If we were facing a reality in which these chat bots were being sold for $10 in the App Store, running on end-user devices and no longer under the control of the distributors, but we still had an issue with loads of them prompting users into suicide, violence, or misleading them into preparing noxious mixtures of cleaning supplies, then we could have a discussion about exactly what extreme packaging requirements ought to be in place for distribution to be considered responsible. As it is, distributed on-device models are the purview of researchers and hobbyists and don't seem to be doing any harm at all.

replies(2): >>45035749 #>>45037463 #
Pedro_Ribeiro ◴[] No.45035749[source]
Mhm, but I don't believe inherently violent and dangerous things like guns and cigarettes are comparable to simple technology.

Should the creators of Tornado Cash be in prison for what they have enabled? You can jail them, but the world can't go back, just as it can't go back when a new OSS model is released.

It is also much easier to crack down on illegal gun distribution than to figure out who uploaded the new model torrent or who deployed the latest zk innovation on Ethereum.

I don't think your hypothetical law will have the effects you think it will.

---

I also referenced this in another reply, but I believe the government controlling what can go into a publicly distributed AI model is a dangerous path and probably unconstitutional.

rsynnott ◴[] No.45037463[source]
> but we still had an issue with loads of them prompting users into suicide, violence, or misleading them into preparing noxious mixtures of cleaning supplies, then we could have a discussion about exactly what extreme packaging requirements ought to be in place for distribution to be considered responsible.

Or, I mean, just banning sale on the basis that they're unsafe devices and unfit for purpose. Like, you can't sell, say, a gas boiler that is known, due to a design flaw, to leak CO into the room; sticking a "this will probably kill you" warning on it is not going to be sufficient.

replies(1): >>45040054 #
idle_zealot ◴[] No.45040054[source]
In that extreme case the "packaging requirements" would be labeling the thing not as a boiler, but as dangerous explosive scrap.
replies(1): >>45041184 #
rsynnott ◴[] No.45041184{3}[source]
I suspect in many places you couldn't sell consumers what would essentially be a device for poisoning people _at all_.