
225 points martinald | 2 comments
ryao ◴[] No.44538755[source]
Am I the only one who thinks the mention of “safety tests” for LLMs is a marketing scheme? Cars, planes and elevators have safety tests. LLMs don’t. Nobody is going to die if an LLM gives an output that its creators do not like; when they say “safety tests”, they mean that they are checking to what extent the LLM will say things they do not like.
replies(12): >>44538785 #>>44538805 #>>44538808 #>>44538903 #>>44538929 #>>44539030 #>>44539924 #>>44540225 #>>44540905 #>>44542283 #>>44542952 #>>44543574 #
natrius ◴[] No.44538808[source]
An LLM can trivially instruct someone to take medications with adverse interactions, steer a mental health crisis toward suicide, or make a compelling case that a particular ethnic group is the cause of your society's biggest problem so they should be eliminated. Words can't kill people, but words can definitely lead to deaths.

That's not even considering tool use!

replies(12): >>44538847 #>>44538877 #>>44538896 #>>44538914 #>>44539109 #>>44539685 #>>44539785 #>>44539805 #>>44540111 #>>44542360 #>>44542401 #>>44542586 #
selfhoster11 ◴[] No.44539785[source]
Yes, and a table saw can take your hand. As can a whole variety of power tools. That does not render them illegal to sell to adults.
replies(3): >>44540109 #>>44540134 #>>44543174 #
vntok ◴[] No.44540134[source]
An interesting comparison.

Table saws sold all over the world are inspected and certified by trusted third parties to ensure they operate safely. They are illegal to sell without the approval seal.

Moreover, table saws sold in the United States & EU (at least) have at least 3 safety features (riving knife, blade guard, antikickback device) designed to prevent personal injury while operating the machine. They are illegal to sell without these features.

Then of course there are additional devices like SawStop, but that is not mandatory yet as far as I'm aware. Should be in a few years though.

LLMs have none of those certification labels or safety features, so I'm not sure what your point was exactly?

replies(2): >>44540355 #>>44542429 #
1. andsoitis ◴[] No.44542429[source]
An LLM is not gonna chop off your limb. You can’t use it to attack someone.
replies(1): >>44544104 #
2. vntok ◴[] No.44544104[source]
An LLM is gonna convince you to treat your wound with quack remedies instead of seeing a doctor, which will eventually result in the limb being chopped off to save you from gangrene.

You can absolutely use an LLM to attack someone. Your sentence is strange, as it reads like a denial of things that have been happening for months and are ramping up. Examples abound: generating scam letters, finding security flaws in a codebase, extracting personal information from publicly-available-yet-not-previously-known locations, generating attack software customized for particular targets, generating untraceable hit offers and then posting them on anonymized Internet services on your behalf, etc.