
443 points | jaredwiener | 1 comment
Workaccount2 No.45032856
It's hard to see what is going on without the actual chats, as opposed to the snippets in the lawsuit. A lot of suicidal people talk to these LLMs for therapy, and the reviews on the whole seem excellent. I'm not ready to jump on the bandwagon after seeing only a handcrafted complaint.

Ironically, though, I could still see lawsuits like this weighing heavily against the sycophancy these models have, since the limited chat excerpts do carry that strong stench of "you are so smart and so right about everything!" If lawsuits like this lead to more "straight honest" models, I could see even more people killing themselves when their therapist model says, "Yeah, but you kind of actually do suck."

replies(5): >>45036493 #>>45037426 #>>45037539 #>>45037730 #>>45037971 #
1. Notatheist No.45037426
>and the reviews on the whole seem excellent

I detest this take, because Adam would probably have reviewed the interactions that led to his death as excellent. Getting what you want isn't always a good thing. That's why therapy is so uncomfortable: you're told things you don't want to hear and asked to do things you don't want to do. ChatGPT was built to do the opposite, and this is the inevitable outcome.