617 points jbegley | 39 comments
tmnvdb ◴[] No.42940589[source]
Good, this idea that all weapons are evil is an insane luxury belief.
replies(10): >>42940656 #>>42940666 #>>42940969 #>>42940977 #>>42941357 #>>42941474 #>>42941623 #>>42941755 #>>42941872 #>>42944147 #
1. ckrapu ◴[] No.42940656[source]
There is a wide range of moral and practical opinions between the statement “all weapons are evil” and “global corporations ought not to develop autonomous weapons”.
replies(3): >>42940711 #>>42940795 #>>42941137 #
2. CamperBob2 ◴[] No.42940711[source]
Tell Putin. He will entertain no such inhibitions.
replies(2): >>42940920 #>>42941459 #
3. vasco ◴[] No.42940795[source]
Palantir exists; this would just be competition. It's not like Google is the only company capable of creating autonomous weapons, so it's not as if the world is saved if they abstain. They just want a piece of the pie. The problem is the pie comes with dead babies, but if you forget that part it's alright.
replies(2): >>42940917 #>>42940958 #
4. astrange ◴[] No.42940917[source]
Palantir doesn't make autonomous weapons; they sell SQL queries and have an evil-sounding name because it helps them recruit juniors who think the name is cool.

Might be thinking of Anduril.

replies(2): >>42941219 #>>42941550 #
5. ignoramous ◴[] No.42940920[source]
> no such inhibitions

Propping up an evil figure/regime/ideology (Bolsheviks/Communists) to justify remorseless evil (concentration camps/the nuclear bomb) is neither new nor unique, but it is particularly predictable.

replies(2): >>42941070 #>>42942016 #
6. tmnvdb ◴[] No.42940958[source]
With or without autonomous weapons, war is always a sordid business with 'dead babies'; that fact alone doesn't tell us which weapons systems to develop.
replies(1): >>42940992 #
7. darth_avocado ◴[] No.42940992{3}[source]
Yet there are boundaries on which weapons we can and cannot develop: Nuclear, Chemical, Biological etc.
replies(2): >>42941178 #>>42942113 #
8. CamperBob2 ◴[] No.42941070{3}[source]
Sadly, attempts at equating evil figures/regimes/ideologies with those who fight back against them are equally predictable.
9. cortesoft ◴[] No.42941137[source]
Who should develop autonomous weapons?
replies(2): >>42941302 #>>42942035 #
10. tmnvdb ◴[] No.42941178{4}[source]
Indeed. Usually weapons are banned if the damage is high and indiscriminate while the military usefulness is low.

There is at this moment little evidence that autonomous weapons will cause more collateral damage than artillery shells and regular air strikes. Their military usefulness, on the other hand, seems to be very high and increasing.

replies(2): >>42941661 #>>42946405 #
11. trhway ◴[] No.42941219{3}[source]
Palantir provides a combat management system in Ukraine. That system collects and analyzes intelligence, including drone video streams, and identifies targets. Right now people are still in the loop, though I think that will naturally go away in the near future.
12. IIAOPSW ◴[] No.42941302[source]
Who should develop biological weapons? Chemical weapons? Nuclear weapons?

Ideally no one, and if the cost / expertise is so niche that only a handful of sophisticated actors could possibly actually do it, then in fact (by way of enforceable treaty) no one.

replies(3): >>42941546 #>>42941549 #>>42947911 #
13. vkou ◴[] No.42941459[source]
We have Putin at home, he spent the past weekend making populist noises about annexing his neighbours on bullshit pretexts.

I'm sure this sounds like a big nothingburger from the perspective of, you know, people he isn't threatening.

How can you excuse that behaviour? How can you think someone like that can be trusted with any weapons? How naive and morally bankrupt do you have to be to build a gun for that kind of person, and think that it won't be used irresponsibly?

replies(2): >>42941589 #>>42942150 #
14. aydyn ◴[] No.42941546{3}[source]
So in other words, cede military superiority to your enemies? Come on, you already know the rational solution to the prisoner's dilemma, MAD, etc.

> enforceable treaty

How would you enforce it after you get nuked?

replies(1): >>42941650 #
15. cakealert ◴[] No.42941549{3}[source]
> Who should develop biological weapons? Chemical weapons? Nuclear weapons?

Anyone who wants to establish deterrence against superiors or peers, and open up options for handling weaker opponents.

> enforceable treaty

Such a thing does not exist. International affairs are and will always be in a state of anarchy. If at some point they aren't, then there is no "international" anymore.

16. cookiengineer ◴[] No.42941550{3}[source]
Palantir literally developed Lavender, which has been used for autonomous targeting in the bombardment of the Gaza Strip.

Look it up.

replies(1): >>42941846 #
17. tmnvdb ◴[] No.42941589{3}[source]
I understand the sentiment but the logical conclusion of that argument is that the US should disarm and cease existing.
replies(1): >>42941627 #
18. vkou ◴[] No.42941627{4}[source]
The better logical conclusion of that argument is that the US needs to remove him, and replace him with someone who isn't threatening innocent people.

That it won't is a mixture of cowardice, cynical opportunism, and complicity with unprovoked aggression.

In which case, I posit that yes, if you're fine with threatening or inflicting violence on innocent people, you don't have a moral right to 'self-defense'. It makes you a predator, and arming a predator is a mistake.

You lose any moral ground you have when you are an unprovoked aggressor.

replies(2): >>42941749 #>>42942120 #
19. lmm ◴[] No.42941650{4}[source]
> in other words, cede military superiority to your enemies?

We're talking about making war slightly more expensive for yourself to preserve the things that matter, which is a trade-off we make all the time. Even in war you don't have to race to the bottom for every marginal fraction-of-a-percent edge. We've managed to ban antipersonnel landmines, for example, and this is an extremely similar case.

> How would you enforce it after you get nuked?

And yet we've somehow managed to avoid getting into nuclear wars.

replies(3): >>42941717 #>>42943347 #>>42945807 #
20. bluefirebrand ◴[] No.42941661{5}[source]
It seems like the sort of thing we shouldn't need evidence of before avoiding it, though.

Like skydiving without a parachute, I think we should accept it's a bad idea without needing a double-blind study.

replies(3): >>42942086 #>>42942554 #>>42945249 #
21. pixl97 ◴[] No.42941717{5}[source]
Because after proliferation the cost would be too great, and nukes aren't that useful for anything other than wiping out cities.

AI, on the other hand, seems to be very multi-purpose.

22. pixl97 ◴[] No.42941749{5}[source]
Ya go poke people with nukes and see how that works out
replies(1): >>42941969 #
23. pbiggar ◴[] No.42941846{4}[source]
https://www.972mag.com/lavender-ai-israeli-army-gaza/
24. vkou ◴[] No.42941969{6}[source]
You are making an excellent argument for nuclear proliferation.
25. gosub100 ◴[] No.42942016{3}[source]
Nukes saved countless US lives from being lost to a regime that brought us into the war. And it's incalculable how many wars they have prevented.
26. ◴[] No.42942035[source]
27. tmnvdb ◴[] No.42942086{6}[source]
The risks need to be weighed against the downside of not deploying a capable system against your enemies.
28. _bin_ ◴[] No.42942113{4}[source]
Those are mostly drawn on how difficult it is to manage their effects. Chemical weapons are hard to target; nukes are too (unless one dials the yield down enough that there's little point) and make land unusable for years; and biological weapons can't really be contained to military targets.

We have, of course, developed all three. They have gone a long way towards keeping us safe over the past century.

29. tmnvdb ◴[] No.42942120{5}[source]
I'm not a fan of Trump, but I also feel he has not been so bad that surrendering the world order to Russia and China would be a rational action that minimizes suffering. That seems to be an argument more about signalling that you really dislike Trump than about a rational consideration of all the options available to us.
replies(2): >>42942192 #>>42945768 #
30. vkou ◴[] No.42942192{6}[source]
It's not a shallow, dismissable, just-your-opinion-maaan 'dislike' to observe that he is being an aggressor. Just like it's not a 'dislike' to observe that Putin is being one.

There are more options than arming an aggressor and capitulating to foreign powers. It's a false dichotomy to suggest it.

31. int_19h ◴[] No.42942554{6}[source]
It's a bit too late for that, since Ukraine and Russia are both already using AI-controlled drones in combat.
32. Sabinus ◴[] No.42943347{5}[source]
Refusal to make or use AI-enabled weapons is not "making war slightly more expensive for yourself"; it's like giving up on the Manhattan Project because the product is dangerous.

It feels good but will lead to disaster in the long run.

33. CamperBob2 ◴[] No.42943613{4}[source]
TBF, vkou's post disagrees with mine, but I don't disagree with it. If pressed to offer a forecast, I think the moral dilemmas we're about to face as Americans will be both disturbing and intimidating, with a 50% chance of horrifying.
34. vasco ◴[] No.42945249{6}[source]
Not all is bad; it's preferable to have autonomous systems killing each other rather than killing humans. If it gets very prevalent, you could even get to a point where war is just simulated war games. Why have an AI-piloted F-35 fight an AI-piloted J-36? Just do it on the computer. That's at least one or two fewer pilots who die.
35. kombine ◴[] No.42945768{6}[source]
> I'm not a fan of Trump but I also feel he has not been so bad

He literally threatened a peaceful nation (also an ally) with invasion and annexation. How much worse can it get?

replies(1): >>42964650 #
36. aydyn ◴[] No.42945807{5}[source]
> And yet we've somehow managed to avoid getting into nuclear wars.

Yes, through a massive programme of nuclear armament. In the case of AI, we should therefore...?

37. ◴[] No.42946405{5}[source]
38. kelsey98765431 ◴[] No.42947911{3}[source]
If we hadn't developed nuclear weapons we would still be burning coal and probably even closer to death from global warming. The answer here is that government contractors should be developing the various types of weapons, as they are; people just don't think of Google as a government contractor for some reason.
39. tmnvdb ◴[] No.42964650{7}[source]
If he actually did it that would be far worse.