
443 points jaredwiener | 2 comments | source
cakealert ◴[] No.45035545[source]
Would it be any different if it was an offline model?

When someone uses a tool and surrenders their decision making power to the tool, shouldn't they be the ones solely responsible?

The liability culture only gives lawyers more money and depresses innovation. Responsibility is a thing.

replies(3): >>45035611 #>>45035689 #>>45037608 #
kelnos ◴[] No.45035611[source]
On one hand I agree with you on the extreme litigiousness of (American?) culture, but on the other, certain people have a legal duty to report when it comes to minors who voice suicidal thoughts. Currently that's only professionals like therapists, teachers, school counselors, etc. But what does an LLM chatbot count as in these situations? The kid was using ChatGPT as a sort of therapist, even if that's generally not a good idea. And if it weren't for ChatGPT, would this kid have instead talked to someone who would have ensured that he got the help he needed? Maybe not. But we have to consider the possibility.

I think it's really, really blurry.

I think the mom's reaction of "ChatGPT killed my son" is ridiculous: no, your son killed himself. ChatGPT facilitated it, based on questions it was asked by your son, but your son did it. And it sounds like he even tried to get a reaction out of you by "showing" you the rope marks on his neck, but you didn't pay attention. I bet you feel guilty about that. I would too, in your position. But foisting your responsibility onto a computer program is not the way to deal with it. (Not placing blame here; everybody misses things, and no one is "on" 100% of the time.)

> Responsibility is a thing.

Does OpenAI (etc.) have a responsibility to reduce the risk of people using their products in ways like this? Legally, maybe not, but I would argue that they absolutely have a moral and ethical responsibility to do so. Hell, this was pretty basic ethics taught in my engineering classes from 25 years ago. Based on the chat excerpts NYT reprinted, it seems like these conversations should have tripped a system prompt that either cut off the conversations entirely, or notified someone that something was very, very wrong.

replies(2): >>45035654 #>>45036156 #
cakealert[dead post] ◴[] No.45035654[source]
[flagged]
nbngeorcjhe[dead post] ◴[] No.45035772{3}[source]
[flagged]
hackit2[dead post] ◴[] No.45035788{4}[source]
[flagged]
gabriel666smith ◴[] No.45036074{5}[source]
>You cannot be empathetic to complete strangers.

Why not? I’m not trying to inflame this further, I’m genuinely interested in your logic for this statement.

replies(1): >>45036202 #
hackit2 ◴[] No.45036202{6}[source]
In groups with high social cohesion there is pressure to reciprocate; however, this starts breaking down above a certain group size. Empathy, like all emotions, requires effort and cognitive load, and without things being mutual you will eventually become drained, bitter, and resentful from empathy fatigue. To prevent emotional exhaustion and conserve energy, a person's empathy works like a sliding scale, constantly adjusted based on the closeness of their relationship with others.
replies(1): >>45036327 #
gabriel666smith ◴[] No.45036327[source]
Thank you for your good-faith explanation.

> Empathy, like all emotions, requires effort and cognitive load, and without things being mutual you will eventually become drained, bitter, and resentful from empathy fatigue.

Do you have a source study, or is this anecdotal or speculative? Again, genuinely interested, as it's a claim I see often but haven't been able to pin down.

(While attempting not to virtue-signal) I often find it easier, personally, to empathize with people I don't know, which is why I'm interested. I don't expect mutual empathy from someone who doesn't know who I am.

Equally, I try not to consume much news media, as the ‘drain’ I experience feels as though it comes from a place of empathy when I see sad things. So I think I experience a version of what you’re suggesting, and I’m interested in why our language is quite oppositional despite this.