371 points ulrischa | 2 comments
t_mann No.43235506
Hallucinations themselves are not even the greatest risk posed by LLMs. A much greater risk (in simple terms of probability times severity) I'd say is that chatbots can talk humans into harming themselves or others. Both have already happened, btw [0,1]. Still not sure if I'd call that the greatest overall risk, but my ideas for what could be even more dangerous I don't even want to share here.

[0] https://www.qut.edu.au/news/realfocus/deaths-linked-to-chatb...

[1] https://www.theguardian.com/uk-news/2023/jul/06/ai-chatbot-e...

replies(4): >>43235623 >>43236225 >>43238379 >>43238746
tombert No.43235623
I don't know if the model changed in the last six months, or maybe the wow factor has worn off a bit, but it also feels like ChatGPT has become a lot more "people-pleasy" than it was before.

I'll ask it opinionated questions, and it will just reaffirm whatever I said, even when I voice contrary opinions later in the same chat.

I personally find it annoying (I don't really get along with human people-pleasers either), but I could see someone using it as a tool to justify doing bad stuff, including self-harm; it doesn't really ever push back on what I say.

replies(3): >>43236142 >>43236909 >>43239921
1. taneq No.43239921
I haven't played with it too much, and maybe it's changed recently or the paid version is different, but last week I found it irritatingly obtuse.

> Me: Hi! Could you please help me find the problem with some code?

> ChatGPT: Of course! Show me the code and I'll take a look!

> Me: [bunch o' code]

> ChatGPT: OK, it looks like you're trying to [do thing]. What did you want help with?

> Me: I'm trying to find a problem with this code.

> ChatGPT: Sure, just show me the code and I'll try to help!

> Me: I just pasted it.

> ChatGPT: I can't see it.
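
One mundane explanation for the "I can't see it" failure: the chat API underneath is stateless, so the client has to resend the entire conversation on every turn; if a long paste gets dropped or truncated against the context window, the model genuinely never received it. Below is a minimal sketch of that round trip, assuming the `openai` Python client; the model name and the `history` contents are illustrative, not from the thread:

    # Sketch: the API is stateless; the model only "sees" what is in `messages`.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    history = [
        {"role": "user", "content": "Hi! Could you please help me find the problem with some code?"},
        {"role": "assistant", "content": "Of course! Show me the code and I'll take a look!"},
        # If the client drops or truncates this turn (e.g., a paste that blows
        # past the context window), the model never receives the code at all:
        {"role": "user", "content": "def mean(xs):\n    return sum(xs) / len(xs)"},
        {"role": "user", "content": "I'm trying to find a problem with this code."},
    ]

    response = client.chat.completions.create(model="gpt-4o", messages=history)
    print(response.choices[0].message.content)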

replies(1): >>43241585
2. MrMcCall No.43241585
Maybe they taught it (a) that it doesn't work, and (b) not to tell anyone.

Lying will goad a person into trying again; the brutally honest truth will stop them like a brick wall.