
586 points by mizzao | 1 comment
vasco ◴[] No.40666684[source]
> "As an AI assistant, I cannot help you." While this safety feature is crucial for preventing misuse,

What is the safety added by this? What is unsafe about a computer giving you answers?

replies(11): >>40666709 #>>40666828 #>>40666835 #>>40666890 #>>40666984 #>>40666992 #>>40667025 #>>40667243 #>>40667633 #>>40669842 #>>40670809 #
mschuster91 ◴[] No.40666828[source]
For one, the corporate safety of the host / model creator. No one wants their name associated with racial slurs or with generating material visually identical to CSAM - the latter might even carry criminal liability in some jurisdictions (e.g. Germany, which has absurdly strict laws on that matter, even banning literature).

Another huge issue is public safety. During training, an AI ingests a lot of unreviewed material, including (very) detailed descriptions of how to make dangerous stuff like bombs. So in theory a well-trained model knows how to synthesize explosive compounds or drugs just from reading Wikipedia, chemistry magazines and transcripts of NileRed videos. That knowledge is hard to comprehend and distill into a recipe if you're not a trained chemist, but an AI model can do it with ease.

The problem is now two-fold. For one, even an untrained idiot can ask how to make a bomb and get something that works. But the other part is much more critical: if you manage to persuade a chemist to tell you how the synthesis of a compound works, they will also tell you where it is easy to fuck up and cause a disaster (e.g. only add a compound drop-wise, make sure all glassware is thoroughly washed with a specific solvent). An AI might not do that, because the scientific paper it was trained on omits those steps (the author assumes common prior knowledge), and so the bomb-maker blows themselves up. Or the AI hallucinates something dangerous (e.g. compounds one Just Fucking Should Not Mix), doesn't realize it, and the bomb-maker blows themselves up or generates nerve gas in their basement.

replies(3): >>40666981 #>>40667067 #>>40667634 #
vasco ◴[] No.40666981[source]
Bomb-making instructions are plentifully available, both on the internet and in books, often with step-by-step instructions. People don't "not make bombs" for lack of instructions. https://en.m.wikipedia.org/wiki/Bomb-making_instructions_on_...

Here, if you want to make a quick chemical weapon: get a bucket, vinegar and bleach. Dump the bleach into the bucket. Dump the vinegar into the bucket. If you breathe it in, you die. An LLM doesn't change this.

replies(1): >>40667398 #
mschuster91 ◴[] No.40667398[source]
Oh, they are available, no doubt, but people have been dragged through the courts for simple possession of such instructions [1]. While the legal situation has generally been settled, it's nevertheless wiser for companies to do their best not to end up prosecuted on terrorism charges.

[1] https://theintercept.com/2017/10/28/josh-walker-anarchist-co...