
435 points crawshaw | 1 comment
outworlder No.44000143
> If you don't have some tool installed, it'll install it.

Terrifying. LLMs are very 'accommodating' and all they need is someone asking them to do something. This is like SQL injection, but worse.
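To make the injection analogy concrete, here is a minimal hypothetical sketch (the agent loop, the `MISSING_TOOL:` convention, and the package name are all invented for illustration, not taken from any real tool): an agent that auto-installs whatever the model says is missing. Any untrusted text the model reads – an issue, a README, a web page – can influence that output, so attacker-controlled content can pick the package, much like unsanitized input reaching a string-built SQL query.

```python
import subprocess

def run_agent_step(model_output: str) -> None:
    """Hypothetical agent step: if the model claims a tool is missing, install it.

    The package name comes straight from untrusted model output, so a poisoned
    document the agent was asked to read can choose what gets installed --
    the same trust-boundary failure as concatenating user input into SQL.
    """
    if model_output.startswith("MISSING_TOOL:"):
        package = model_output.removeprefix("MISSING_TOOL:").strip()
        # Nothing here stops a typosquatted or outright malicious package.
        subprocess.run(["pip", "install", package], check=True)

# e.g. a poisoned README the agent summarizes could yield:
# run_agent_step("MISSING_TOOL: requets")   # typosquat of 'requests'
```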

1. nxobject No.44006744
In an ideal world, this would require us as programmers to lean into our codebase reading and augmentation skills – skills that are underappreciated as it is. But when the incentives lean towards write-only code, I'm not optimistic.