
443 points jaredwiener | 3 comments
broker354690 ◴[] No.45033596[source]
Why isn't OpenAI criminally liable for this?

Last I checked:

-Signals emitted by a machine at the behest of a legal person intended to be read/heard by another legal person are legally classified as 'speech'.

-ChatGPT is just a program like Microsoft Word and not a legal person. OpenAI is a legal person, though.

-The servers running ChatGPT are owned by OpenAI.

-OpenAI willingly did business with this teenager, letting him set up an account in exchange for money. This business is a service under the control of OpenAI, not a product like a knife or gun. OpenAI intended to transmit speech to this teenager.

-A person can be liable (civilly? criminally?) for inciting another person's suicide. It is not protected speech to persuade someone into suicide.

-OpenAI produced some illegal speech and sent it to a suicidal teenager, who then committed suicide.

If Sam Altman stabbed the kid to death, it wouldn't matter if he did it by accident. Sam Altman would be at fault. You wouldn't sue or arrest the knife he used to do the deed.

Any lawyers here who can correct me, seeing as I am not one? It seems clear as day to me that OpenAI/Sam Altman directly encouraged a child to kill themselves.

replies(6): >>45033677 #>>45035753 #>>45036119 #>>45036667 #>>45036842 #>>45038959 #
VirusNewbie ◴[] No.45036119[source]
Is Google responsible if someone searches for a way to kill themselves, finds the means, and does it?

What about the ISP, which actually transferred the bits?

What about the forum, which didn't take down the post?

replies(4): >>45036290 #>>45036627 #>>45036733 #>>45047026 #
Towaway69 ◴[] No.45036733[source]
What if Google is responsible?

What if the tech industry, instead of just “disrupting” various industries, also took responsibility for those disruptions?

After all, if I asked my doctor for methods of killing myself, that doctor would most certainly have a moral, if not legal, responsibility. But if the doctor is a machine running software, there isn't the same responsibility? Why?

replies(3): >>45036846 #>>45037001 #>>45047825 #
1. Levitz ◴[] No.45036846[source]
Because it is a machine and has no agency.

It's the same reason that if you ask someone to stab you and they do it, they are liable, but if you stab yourself, you don't get to blame the knife manufacturer.

replies(2): >>45037549 #>>45038224 #
2. lewiscollard ◴[] No.45037549[source]
At every step there is human agency involved. People came up with the idea, people wrote the code, people deployed the code, people saw the consequences and were like "this is fine".

This is why people hate us. It's like Schrödinger's Code: we don't want responsibility for the code we write, yet we very much do want to make a pile of money from it as if we were responsible for it. Which of those applies depends on whether the observer is someone who notices the code's bad consequences or our bank account.

This is more like building an autonomous vehicle, "MEGA MASHERBOT 5000", with a dozen twenty-foot-wide, razor-sharp spinning blades weighing fifty tons each, setting it loose on a city street, watching it obliterate people into bloody chunks and houses into rubble, and saying "well, nobody could have seen that coming" - two seconds before we go collect piles of notes from the smashed ATMs.

3. _Algernon_ ◴[] No.45038224[source]
>[B]ureaucrats can be expected to embrace a technology that helps to create the illusion that decisions are not under their control. Because of its seeming intelligence and impartiality, a computer has an almost magical tendency to direct attention away from the people in charge of bureaucratic functions and toward itself, as if the computer were the true source of authority. A bureaucrat armed with a computer is the unacknowledged legislator of our age, and a terrible burden to bear. We cannot dismiss the possibility that, if Adolf Eichmann had been able to say that it was not he but a battery of computers that directed the Jews to the appropriate crematoria, he might never have been asked to answer for his actions.

- Neil Postman, Technopoly

Entities shouldn't be able to outsource liability for their decisions or actions — including the action of releasing stochastic parrots on society at large — to computers. We have precedent that occupations making important decisions that put lives at risk (doctors, air traffic controllers, and engineers, for example) can be held accountable for the consequences of their negligence. Maybe it's time to include computer engineers in that group.

They've been allowed to move fast and break things for way too long.