
225 points martinald | 2 comments
ryao ◴[] No.44538755[source]
Am I the only one who thinks the mention of “safety tests” for LLMs is a marketing scheme? Cars, planes, and elevators have safety tests; LLMs don’t. Nobody is going to die if an LLM gives an output its creators do not like. When they say “safety tests”, they mean they are checking the extent to which the LLM will say things they do not like.
replies(12): >>44538785 #>>44538805 #>>44538808 #>>44538903 #>>44538929 #>>44539030 #>>44539924 #>>44540225 #>>44540905 #>>44542283 #>>44542952 #>>44543574 #
halfjoking ◴[] No.44539924[source]
It's overblown. Elon shipped Hitler Grok straight to prod.

Nobody died.

replies(1): >>44540609 #
1. pona-a ◴[] No.44540609[source]
Playing devil's advocate, what if it was more subtle?

Prolonged use of conversational programs does reliably induce certain mental states in vulnerable populations. When ChatGPT got a bit too agreeable, that was enough for a man to kill himself in a psychotic episode [1]. I don't think this magnitude of delusion was possible with ELIZA, even if the fundamental effect remains the same.

Could this psychosis be politically weaponized by biasing the model to include certain elements in its responses? We know this rhetoric works: cults have been using love-bombing, apocalypticism, us-vs-them dynamics, assigned special missions, and isolation from external support systems to great success. What we haven't seen is what happens when everyone has a cult recruiter in their pocket, waiting for a critical moment to offer support.

ChatGPT has an estimated 800 million weekly active users [2]. How many of them would be vulnerable to indoctrination? About 3% of the general population has been involved in a cult [3], but that might be a reflection of conversion efficiency, not vulnerability. Even assuming 5% are vulnerable, that's still 40 million people ready to sacrifice their time, possessions, or even their lives in their delusion.

[1] https://www.rollingstone.com/culture/culture-features/chatgp...

[2] https://www.forbes.com/sites/martineparis/2025/04/12/chatgpt...

[3] https://www.peopleleavecults.com/post/statistics-on-cults
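The back-of-envelope estimate above is easy to check. A minimal sketch; the user count is the cited estimate [2] and the 5% vulnerability rate is the comment's own assumption, not measured data:

```python
# Reproduce the comment's back-of-envelope estimate.
weekly_active_users = 800_000_000  # estimated ChatGPT WAU [2]
vulnerable_fraction = 0.05         # assumed share susceptible (comment's guess)

vulnerable_users = int(weekly_active_users * vulnerable_fraction)
print(f"{vulnerable_users:,}")  # 40,000,000
```

Note the 3% cult-involvement figure [3] is treated as a floor on vulnerability, which is why the comment rounds up to 5% rather than down.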

replies(1): >>44543610 #
2. stogot ◴[] No.44543610[source]
You’re worried about indoctrination by an LLM, but it starts much earlier than that. The school system is indoctrination of our youngest minds, both in today's West and in its Prussian origins.

https://today.ucsd.edu/story/education-systems-were-first-de...

We should fix both systems. I don’t want Altman’s or Musk’s opinions doing the indoctrinating.