
165 points distalx | 1 comment
sheepscreek ◴[] No.43949463[source]
That’s fair, but there’s another nuance they can’t solve for: cost and availability.

AI is not a substitute for traditional therapy, but it offers maybe 80% of the benefit at a fraction of the cost. It could supplement therapy, for the periods between sessions.

The biggest risk is privacy. Meta can’t be trusted with knowing what you’re going to wear or eat. Now imagine it knowing your deepest, darkest secrets. The advertising business model does not gel with providing mental health support. Subscription (with privacy guarantees) is the way to go.

replies(5): >>43949589 #>>43949591 #>>43950064 #>>43950278 #>>43950547 #
sarchertech ◴[] No.43949589[source]
Does it offer 80% of the benefit? An AI could match what a human therapist would say 80% (or 99%) of the time and still provide negative benefit.

Therapy seems like the last place an LLM would be beneficial, because it’s very hard to keep an LLM from telling you what you want to hear. I can’t see any way you could guarantee that a chatbot won’t cause severe damage to a vulnerable patient by supporting their neurosis.

We’re not anywhere close to an LLM which is trained to be supportive and understanding in tone but will never affirm your irrational fears, insecurities, and delusions.

replies(2): >>43949648 #>>43949858 #
singpolyma3 ◴[] No.43949858{3}[source]
I mean, in most forms of professional therapy the therapist shouldn't say much at all, and certainly shouldn't give advice. The point is to have someone listen in a way that feels like they are really listening.
replies(2): >>43950037 #>>43950125 #
sarchertech ◴[] No.43950037{4}[source]
Therapists don’t give advice in the sense that they won’t tell you whether you should quit your job or whether you should propose to your girlfriend. But they will definitely give you basic guidance and confirm when your fears are overblown.

They will not, under any circumstances, tell you “yes, you are correct, Billy would be more likely to love you if you dropped 30 more pounds by throwing up after eating” — but an LLM will if it goes off script.

replies(2): >>43950704 #>>43952329 #
casey2 ◴[] No.43952329{5}[source]
It’s telling that you need to make up some BS about LLMs while saying nothing about the many clients who have been assaulted, raped, or killed by their therapists.

How can you so confidently claim that “therapists will do this and that, they won’t do any evil”? Did you even read what you posted?

replies(1): >>43954539 #
sarchertech ◴[] No.43954539{6}[source]
If you could prove that your LLM was only as likely to provide harmful responses as a therapist was to murder you, you might have a point.