
443 points | jaredwiener | 1 comment
lvl155
Clearly ChatGPT should not be used for this purpose, but I will say that this industry (counseling) is also deeply flawed, and mis-incentivized in many parts of the world. And if ChatGPT is basing its interactions on the same scripted content these “professionals” use, that’s just not right.

I really wish people in the AI space would stop the nonsense and communicate clearly what these LLMs are designed to do. They’re not some magical AGI; they’re token-prediction machines. That’s literally how they should be framed, so the general public knows exactly what it’s getting.
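
For anyone who hasn't seen it spelled out, "token-prediction machine" means exactly this: a causal language model scores every possible next token, the sampler picks one, and the loop repeats. A minimal sketch using GPT-2 via Hugging Face transformers (the model choice is just for illustration; any causal LM works the same way):

    # Minimal sketch of next-token prediction (GPT-2 is an illustrative
    # choice; any causal LM behaves the same way). The model just scores
    # every vocabulary token and we append the most likely one, repeatedly.
    import torch
    from transformers import GPT2LMHeadModel, GPT2TokenizerFast

    tok = GPT2TokenizerFast.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    ids = tok("The therapist said", return_tensors="pt").input_ids
    for _ in range(10):
        logits = model(ids).logits        # scores for every token in the vocab
        next_id = logits[0, -1].argmax()  # greedy: single most likely next token
        ids = torch.cat([ids, next_id.view(1, 1)], dim=1)
    print(tok.decode(ids[0]))             # a plausible continuation, nothing more

There is no goal or understanding in that loop, which is the point: it will continue any prompt, including a dangerous one, unless something outside the loop intervenes.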

podgietaru
Counseling is (or should be) heavily regulated, and if a counselor had given advice about the logistics of whether a noose would hold a person's weight, they'd probably be prosecuted.

They allowed this. They could easily stop conversations about suicide. They have the technology to do that.
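
The screening technology is not hypothetical. OpenAI itself ships a moderation endpoint with dedicated self-harm categories. A minimal sketch of gating a chat on it (the endpoint and its category names are real; the crisis-routing around the call is my assumption about how a chat product might wire it up, not OpenAI's actual pipeline):

    # Minimal sketch: run an incoming message through OpenAI's moderation
    # endpoint before the chat model ever sees it. The endpoint and its
    # self-harm categories exist; the routing below is illustrative.
    from openai import OpenAI

    client = OpenAI()

    def needs_crisis_response(user_message: str) -> bool:
        result = client.moderations.create(
            model="omni-moderation-latest",
            input=user_message,
        ).results[0]
        cats = result.categories
        return bool(cats.self_harm or cats.self_harm_intent
                    or cats.self_harm_instructions)

    if needs_crisis_response("how much weight can a noose hold"):
        print("Route to crisis resources instead of answering.")

One classifier call per message is cheap relative to the chat completion itself, so cost is not a plausible excuse for skipping it.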

fatbird
Counseling is a very heavily regulated field. Counselors are considered health care professionals, they're subject to malpractice liability, and they're certified by professional bodies (certification is legally required, and insurance coverage usually depends on licensing status).