
617 points jbegley | 8 comments
tmnvdb No.42940589
Good, this idea that all weapons are evil is an insane luxury belief.
replies(10): >>42940656 >>42940666 >>42940969 >>42940977 >>42941357 >>42941474 >>42941623 >>42941755 >>42941872 >>42944147
ckrapu No.42940656
There is a wide range of moral and practical opinions between the statement “all weapons are evil” and “global corporations ought not to develop autonomous weapons”.
replies(3): >>42940711 >>42940795 >>42941137
cortesoft No.42941137
Who should develop autonomous weapons?
replies(2): >>42941302 >>42942035
1. IIAOPSW No.42941302
Who should develop biological weapons? Chemical weapons? Nuclear weapons?

Ideally no one, and if the cost / expertise is so niche that only a handful of sophisticated actors could possibly actually do it, then in fact (by way of enforceable treaty) no one.

replies(3): >>42941546 >>42941549 >>42947911
2. aydyn No.42941546
So, in other words, cede military superiority to your enemies? Come on, you already know the rational solution to the prisoner's dilemma, MAD, etc.

> enforceable treaty

How would you enforce it after you get nuked?

replies(1): >>42941650
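The prisoner's-dilemma framing invoked above can be sketched as a one-shot game; the payoff numbers below are illustrative assumptions, not anything from the thread:

```python
# Minimal one-shot prisoner's dilemma, illustrating why unilateral
# disarmament is unstable: whatever the rival does, "arm" pays more.
# Payoff values are made up for illustration.

PAYOFFS = {  # (my_move, their_move) -> my payoff
    ("disarm", "disarm"): 3,   # mutual restraint
    ("disarm", "arm"):    0,   # ceding military superiority
    ("arm",    "disarm"): 5,   # gaining an edge
    ("arm",    "arm"):    1,   # costly arms race
}

def best_response(their_move):
    """Return the move that maximizes my payoff against a fixed rival move."""
    return max(("disarm", "arm"), key=lambda m: PAYOFFS[(m, their_move)])

# "arm" dominates: it is the best response to either rival move, so
# (arm, arm) is the equilibrium even though (disarm, disarm) is better
# for both sides -- the tension the comment is pointing at.
assert best_response("disarm") == "arm"
assert best_response("arm") == "arm"
```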
3. cakealert No.42941549
> Who should develop biological weapons? Chemical weapons? Nuclear weapons?

Anyone who wants to establish deterrence against superiors or peers, and open up options for handling weaker opponents.

> enforceable treaty

Such a thing does not exist. International affairs are and will always be in a state of anarchy. If at some point they aren't, then there is no "international" anymore.

4. lmm No.42941650
> in other words, cede military superiority to your enemies?

We're talking about making war slightly more expensive for yourself to preserve the things that matter, which is a trade-off we make all the time. Even in war you don't have to race to the bottom for every marginal fraction-of-a-percent edge. We've managed to e.g. ban antipersonnel landmines; this is an extremely similar case.

> How would you enforce it after you get nuked?

And yet we've somehow managed to avoid getting into nuclear wars.

replies(3): >>42941717 >>42943347 >>42945807
5. pixl97 No.42941717
Because after proliferation the cost would be too great, and nukes aren't that useful for much besides wiping out cities.

AI, on the other hand, seems to be very multi-purpose.

6. Sabinus No.42943347
Refusal to make or use AI-enabled weapons is not "making war slightly more expensive for yourself"; it's like giving up on the Manhattan Project because the product is dangerous.

Feels good but will lead to disaster in the long run.

7. aydyn No.42945807
> And yet we've somehow managed to avoid getting into nuclear wars.

Yes, through a massive programme of nuclear armament. In the case of AI, we should therefore...?

8. kelsey98765431 No.42947911
If we hadn't developed nuclear weapons, we would still be burning coal and probably even closer to death from global warming. The answer here is that government contractors should be developing these various types of weapons, as they are; people just don't think of Google as a government contractor for some reason.