
617 points by jbegley | 9 comments
tmnvdb No.42940589
Good, this idea that all weapons are evil is an insane luxury belief.
ckrapu No.42940656
There is a wide range of moral and practical opinions between the statement “all weapons are evil” and “global corporations ought not to develop autonomous weapons”.
vasco No.42940795
Palantir exists; this would just be competition. It's not like Google is the only company capable of creating autonomous weapons, so if they abstain the world is saved. They just want a piece of the pie. The problem is the pie comes with dead babies, but if you forget that part it's alright.
tmnvdb No.42940958
With or without autonomous weapons, war is always a sordid business with "dead babies"; that fact alone doesn't tell us which weapons systems to develop.
darth_avocado No.42940992
Yet there are boundaries on which weapons we can and cannot develop: nuclear, chemical, biological, etc.
tmnvdb No.42941178
Indeed. Usually weapons are banned if the damage is high and indiscriminate while the military usefulness is low.

There is at this moment little evidence that autonomous weapons will cause more collateral damage than artillery shells and regular air strikes. Their military usefulness, on the other hand, seems to be very high and increasing.

bluefirebrand No.42941661
This seems like the sort of thing we should avoid without demanding evidence first, though.

Like skydiving without a parachute: I think we can accept it's a bad idea without needing a double-blind study.

tmnvdb No.42942086
The risks need to be weighed against the downside of not deploying a capable system against your enemies.
_bin_ No.42942113
Those lines are mostly drawn based on how difficult it is to manage a weapon's effects. Chemical weapons are hard to target; nukes are too (unless the yield is dialed down so far there's little point) and they make land unusable for years; and biological weapons can't really be contained to military targets.

We have, of course, developed all three. They have gone a long way toward keeping us safe over the past century.

int_19h No.42942554
It's a bit too late for that, since Ukraine and Russia are both already using AI-controlled drones in combat.
vasco No.42945249
Not all of it is bad: it's preferable to have autonomous systems killing each other rather than killing humans. If it gets prevalent enough, you could even reach a point where war is just simulated war games. Why have an AI-piloted F-35 fight an AI-piloted J-36? Just do it on the computer. That's at least one or two fewer pilots who die.