
447 points crawshaw | 1 comment
outworlder No.44000143
> If you don't have some tool installed, it'll install it.

Terrifying. LLMs are very 'accommodating' and all they need is someone asking them to do something. This is like SQL injection, but worse.

replies(2): >>44001194 >>44006744
1. rglover No.44001194
I often wonder what the first agent-driven catastrophe will be. Considering the gold rush (emphasis on rush) going on, it's only a matter of time before a difficult-to-fix disaster occurs.