
443 points by jaredwiener | 1 comment
lvl155 No.45032596
Clearly ChatGPT should not be used for this purpose, but I will say this industry (counseling) is also deeply flawed. It is also misincentivized in many parts of the world. And if ChatGPT is basing its interactions on the same scripted content these “professionals” use, that’s just not right.

I really wish people in the AI space would stop the nonsense and communicate more clearly what these LLMs are designed to do. They’re not some magical AGI. They’re token prediction machines. That’s literally how they should frame it, so the general public knows exactly what they’re getting.

replies(5): >>45032792 #>>45032867 #>>45036104 #>>45037595 #>>45041668 #
lawlessone No.45032792
>And if ChatGPT is basing its interactions on the same scripted content these “professionals” use, that’s just not right

Where did it say they're doing that? I can't imagine any mental health professional telling a kid how to hide a noose.

replies(1): >>45034116 #
lvl155 No.45034116
ChatGPT is loosely drawing on these materials when it generates these troubling texts.