165 points distalx | 2 comments | source
sheepscreek ◴[] No.43949463[source]
That’s fair, but there’s another nuance they can’t solve for: cost and availability.

AI is not a substitute for traditional therapy, but it offers 80% of the benefit at a fraction of the cost. It could supplement therapy during the periods between sessions.

The biggest risk is privacy. Meta couldn’t be trusted with knowing what you’re going to wear or eat; now imagine them knowing your deepest, darkest secrets. The advertising business model does not gel with providing mental health support. Subscription (with privacy guarantees) is the way to go.

replies(5): >>43949589 #>>43949591 #>>43950064 #>>43950278 #>>43950547 #
sarchertech ◴[] No.43949589[source]
Does it offer 80% of the benefit? An AI could match what a human therapist would say 80% (or 99%) of the time and still provide negative benefit.

Therapy seems like the last place an LLM would be beneficial, because it’s very hard to keep an LLM from telling you what you want to hear. I can’t see any way you could guarantee that a chatbot won’t cause severe damage to a vulnerable patient by supporting their neurosis.

We’re not anywhere close to an LLM that is trained to be supportive and understanding in tone but will never affirm your irrational fears, insecurities, and delusions.
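
To make the point concrete: the obvious mitigation is a system prompt, roughly like this sketch (Python, assuming the official OpenAI client; the model name is a placeholder):

    # Sketch: a prompt-level guard against sycophancy. This is an
    # illustration, not a fix; the model name below is a placeholder.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    SYSTEM_PROMPT = (
        "You are a supportive listener. Be warm and understanding in tone, "
        "but never affirm irrational fears, insecurities, or delusions. "
        "If the user voices a distorted or harmful belief, gently question "
        "it rather than agreeing, and encourage professional help."
    )

    def reply(user_message: str) -> str:
        resp = client.chat.completions.create(
            model="gpt-4o",  # placeholder model name
            messages=[
                {"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": user_message},
            ],
        )
        return resp.choices[0].message.content

But a system prompt is a suggestion, not a constraint: nothing in it prevents the model from sliding into agreement over a long, emotionally loaded conversation, which is exactly the failure mode that matters here.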

replies(2): >>43949648 #>>43949858 #
pitched ◴[] No.43949648[source]
Sometimes, the process of gathering our thoughts enough to articulate them into a prompt is where the benefit is. AI as the rubber duck has a lot of value. Understanding whether that is all that’s needed, versus something deeper, is beyond the scope of what AI can handle.
replies(2): >>43949682 #>>43949868 #
1. sarchertech ◴[] No.43949682[source]
And that’s fine as long as the person using it has a sophisticated understanding of the technology and a company isn’t selling it as a “therapist”.

When an AI therapist from a health startup confirms that a mentally disturbed person is indeed hearing voices from God, or when an insecure teenager uses Meta AI as a therapist because Mark Zuckerberg said they should and it agrees that, yes, they are unlovable, then we have a problem.

replies(1): >>43949809 #
2. pitched ◴[] No.43949809[source]
That last 20% of “missing nuance” is really important if someone is in that state! For the rest of us, the value of an AI therapist roughly matches journaling.