617 points jbegley | 76 comments
1. tmnvdb ◴[] No.42940589[source]
Good, this idea that all weapons are evil is an insane luxury belief.
replies(10): >>42940656 #>>42940666 #>>42940969 #>>42940977 #>>42941357 #>>42941474 #>>42941623 #>>42941755 #>>42941872 #>>42944147 #
2. ckrapu ◴[] No.42940656[source]
There is a wide range of moral and practical opinions between the statement “all weapons are evil” and “global corporations ought not to develop autonomous weapons”.
replies(3): >>42940711 #>>42940795 #>>42941137 #
3. ziddoap ◴[] No.42940666[source]
>all weapons are evil

That wasn't the quote that was removed. Not even close, really.

replies(1): >>42940904 #
4. CamperBob2 ◴[] No.42940711[source]
Tell Putin. He will entertain no such inhibitions.
replies(2): >>42940920 #>>42941459 #
5. vasco ◴[] No.42940795[source]
Palantir exists, this would just be competition. It's not like Google is the only company capable of creating autonomous weapons so if they abstain the world is saved. They just want a piece of the pie. The problem is the pie comes with dead babies, but if you forget that part it's alright.
replies(2): >>42940917 #>>42940958 #
6. astrange ◴[] No.42940904[source]
It's definitely an opinion Google employees had in the last decade.

Actually I think a lot of people have it - just yesterday I saw someone on reddit claim Google was evil because it was secretly founded by the US military. And they were American. That's their military!

replies(2): >>42941091 #>>42942051 #
7. astrange ◴[] No.42940917{3}[source]
Palantir doesn't make autonomous weapons; they sell SQL queries and have an evil-sounding name because they recruit juniors who think the name is cool.

Might be thinking of Anduril.

replies(2): >>42941219 #>>42941550 #
8. ignoramous ◴[] No.42940920{3}[source]
> no such inhibitions

Propping up an evil figure/regime/ideology (Bolsheviks/Communists) to justify remorseless evil (concentration camps/the nuclear bomb) isn't new or unique, but it is particularly predictable.

replies(2): >>42941070 #>>42942016 #
9. tmnvdb ◴[] No.42940958{3}[source]
With or without autonomous weapons, war is always a sordid business with 'dead babies'; that is not in itself a fact that tells us which weapons systems to develop.
replies(1): >>42940992 #
10. ignoramous ◴[] No.42940969[source]
> all weapons are evil is an insane luxury belief

It isn't that belief that's insane; what's insane is a total belief in the purity of weapons.

replies(1): >>42941205 #
11. darth_avocado ◴[] No.42940977[source]
Weapons inherently aren’t evil, which is why everyone has kitchen knives. People use weapons to do evil.

The problem with building AI weapons is that eventually it will be in the hands of people who are morally bankrupt and therefore will use them to do evil.

replies(7): >>42941036 #>>42941185 #>>42941187 #>>42941379 #>>42941414 #>>42941732 #>>42942559 #
12. darth_avocado ◴[] No.42940992{4}[source]
Yet there are boundaries on which weapons we can and cannot develop: Nuclear, Chemical, Biological etc.
replies(2): >>42941178 #>>42942113 #
13. Dalewyn ◴[] No.42941036[source]
Much as it is the case with guns, why is the "problem" the tools or provider of the tools and not the user of the tools?
replies(1): >>42941763 #
14. CamperBob2 ◴[] No.42941070{4}[source]
Sadly, attempts at equating evil figures/regimes/ideologies with those who fight back against them are equally predictable.
15. jjj123 ◴[] No.42941091{3}[source]
It’s my military too and I believe the US military does many, many evil things that I want no part of.
replies(2): >>42941331 #>>42942082 #
16. cortesoft ◴[] No.42941137[source]
Who should develop autonomous weapons?
replies(2): >>42941302 #>>42942035 #
17. tmnvdb ◴[] No.42941178{5}[source]
Indeed. Usually weapons are banned if the damage is high and indiscriminate while the military usefulness is low.

There is at this moment little evidence that autonomous weapons will cause more collateral damage than artillery shells and regular air strikes. The military usefulness, on the other hand, seems to be very high and increasing.

replies(2): >>42941661 #>>42946405 #
18. pyinstallwoes ◴[] No.42941185[source]
“…so which is it then? Is it really robots that are wired to kill people, or the humans wiring them?”
19. psunavy03 ◴[] No.42941187[source]
Then we should be encouraging their development by the governments of liberal democratic nations as opposed to authoritarian regimes.
replies(1): >>42941612 #
20. psunavy03 ◴[] No.42941205[source]
You don't have to have a "total belief in the purity of weapons" to recognize that military tech is a regrettable but necessary thing for a nation to pursue.
replies(1): >>42943230 #
21. trhway ◴[] No.42941219{4}[source]
Palantir provides a combat management system in Ukraine. That system collects and analyzes intelligence, including drone video streams, and identifies targets. Right now people are still in the loop, though I think that will naturally go away in the near future.
22. IIAOPSW ◴[] No.42941302{3}[source]
Who should develop biological weapons? Chemical weapons? Nuclear weapons?

Ideally no one, and if the cost / expertise is so niche that only a handful of sophisticated actors could possibly actually do it, then in fact (by way of enforceable treaty) no one.

replies(3): >>42941546 #>>42941549 #>>42947911 #
23. dark_glass ◴[] No.42941331{4}[source]
"We sleep safely at night because rough men stand ready to visit violence on those who would harm us"
replies(1): >>42941594 #
24. aprilthird2021 ◴[] No.42941357[source]
It's a luxury belief to think you won't one day be scanned by an AI to determine if you're killable or not
25. gerdesj ◴[] No.42941379[source]
Who is to say a wielder of a kitchen knife is not "morally bankrupt" - whatever that means.

In my garage, I have some pretty nasty "weapons" - notably a couple of chainsaws, some drills, chisels, lump/sledge/etc hammers and a fencing maul! The rest are merely: mildly malevolent.

You don't need an AI (whatever that means) to get medieval on someone. On the bright side the current state of AI (whatever that means) is largely bollocks.

Sadly, LLMs have been and will be wired up to drones, and the results will be unpredictable.

26. leptons ◴[] No.42941414[source]
A kitchen knife is a tool. It can be used as a weapon.

A car is a tool. It can be used as a weapon.

Even water and air can be used as a weapon if you try hard enough. There is probably nothing on this planet that couldn't be used as a weapon.

That said, I do not think AI weapons are a reasonable thing to build for any war, for any country, for any reason - even if the enemy has them.

replies(1): >>42941923 #
27. vkou ◴[] No.42941459{3}[source]
We have Putin at home, he spent the past weekend making populist noises about annexing his neighbours over bullshit pretenses.

I'm sure this sounds like a big nothingburger from the perspective of, you know, people he isn't threatening.

How can you excuse that behaviour? How can you think someone like that can be trusted with any weapons? How naive and morally bankrupt do you have to be to build a gun for that kind of person, and think that it won't be used irresponsibly?

replies(2): >>42941589 #>>42942150 #
28. bbqfog ◴[] No.42941474[source]
The US is not under any kind of credible threat; in fact it is the aggressor across the globe and a perpetrator of crimes against humanity at scale. This is not a recent phenomenon and has been going on as long as this country has existed.
replies(1): >>42942299 #
29. aydyn ◴[] No.42941546{4}[source]
So in other words, cede military superiority to your enemies? Come on, you already know the rational solution to the prisoner's dilemma, MAD, etc.

> enforceable treaty

How would you enforce it after you get nuked?

replies(1): >>42941650 #
30. cakealert ◴[] No.42941549{4}[source]
> Who should develop biological weapons? Chemical weapons? Nuclear weapons?

Anyone who wants to establish deterrence against superiors or peers, and open up options for handling weaker opponents.

> enforceable treaty

Such a thing does not exist. International affairs are and will always be in a state of anarchy. If at some point they aren't, then there is no "international" anymore.

31. cookiengineer ◴[] No.42941550{4}[source]
Palantir literally developed Lavender, which has been used for autonomous targeting in the bombardment of the Gaza Strip.

Look it up.

replies(1): >>42941846 #
32. tmnvdb ◴[] No.42941589{4}[source]
I understand the sentiment but the logical conclusion of that argument is that the US should disarm and cease existing.
replies(1): >>42941627 #
33. switchbak ◴[] No.42941594{5}[source]
And these same organizations fuel conflicts that actively make the USA less safe. These organizations can both do great things (hostage rescues) and terrible things (initiating coups), and it’s upon the citizenry to ensure that these forces are put to use only where justified. That is to say almost never.
replies(1): >>42941901 #
34. burningChrome ◴[] No.42941612{3}[source]
Serious question.

How would we go about doing that?

Every nefarious way of keeping the truth at bay is on the table in authoritarian regimes: from cracking the iPhones of journalists covering them, to snooping on email, to using AI to do the same. It's all the same thing, just with updated and improved tools.

Just like Kevin Mitnick selling zero day exploits to the highest bidder, I have a hard time seeing how these get developed and somehow stay out of reach of the regimes you speak of.

35. captainbland ◴[] No.42941623[source]
Whatever your feelings on that are, it's hardly unreasonable to have misgivings about your search and YouTube watches going to fund sloppy AI weapons programmes that probably won't even kill the right people.
36. vkou ◴[] No.42941627{5}[source]
The better logical conclusion of that argument is that the US needs to remove him, and replace him with someone who isn't threatening innocent people.

That it won't is a mixture of cowardice, cynical opportunism, and complicity with unprovoked aggression.

In which case, I posit that yes, if you're fine with threatening or inflicting violence on innocent people, you don't have a moral right to 'self-defense'. It makes you a predator, and arming a predator is a mistake.

You lose any moral ground you have when you are an unprovoked aggressor.

replies(2): >>42941749 #>>42942120 #
37. lmm ◴[] No.42941650{5}[source]
> in other words, cede military superiority to your enemies?

We're talking about making war slightly more expensive for yourself to preserve the things that matter, which is a trade-off we make all the time. Even in war you don't have to race to the bottom for every marginal fraction-of-a-percent edge. We've managed to e.g. ban antipersonnel landmines; this is an extremely similar case.

> How would you enforce it after you get nuked?

And yet we've somehow managed to avoid getting into nuclear wars.

replies(3): >>42941717 #>>42943347 #>>42945807 #
38. bluefirebrand ◴[] No.42941661{6}[source]
It seems like the sort of thing we shouldn't be wanting evidence of in order to avoid, though

Like skydiving without a parachute, I think we should accept it is a bad idea without needing a double blind study

replies(3): >>42942086 #>>42942554 #>>42945249 #
39. pixl97 ◴[] No.42941717{6}[source]
Because after proliferation the cost would be too great, and nukes aren't that useful for anything other than wiping out cities.

AI on the other hand seems to be very multi purpose

40. osmsucks ◴[] No.42941732[source]
The difference there is that a knife has some obvious, benign use cases. Smart weapons targeting has only one use case: doing harm to others.
replies(1): >>42942004 #
41. pixl97 ◴[] No.42941749{6}[source]
Ya go poke people with nukes and see how that works out
replies(1): >>42941969 #
42. siltcakes ◴[] No.42941755[source]
Do you see nothing wrong with the same company that makes YouTube Kids making killer AI? I think creating weapons is often evil. I think companies that have consumer brands should never make weapons; at the very least it's whitewashing what's really going on. At worst, they can leverage their media properties for propaganda purposes, spy on your Gmail and Maps usage, and act as a vector for the most nefarious cyber terrorism imaginable.
replies(2): >>42942013 #>>42942068 #
43. pixl97 ◴[] No.42941763{3}[source]
Depends on whether next year's gun gets up and shoots you in the head of its own accord.
44. pbiggar ◴[] No.42941846{5}[source]
https://www.972mag.com/lavender-ai-israeli-army-gaza/
45. sangnoir ◴[] No.42941872[source]
It's not a luxury belief for a multinational tech company that intends to remain in business in countries that are not allied to the US. Being seen as independent of the military has a dollar value, but that may be smaller than value of defense contracts Google hopes to get.
46. astrange ◴[] No.42941901{6}[source]
We've stopped South American coups more recently than we've initiated them. (in the last few years, in Brazil and Bolivia)
replies(1): >>42955052 #
47. gizmondo ◴[] No.42941923{3}[source]
> That said, I do not think AI weapons are a reasonable thing to build for any war, for any country, for any reason - even if the enemy has them.

So you're in favor of losing a war and becoming a subject of the enemy? While it's certainly tempting to think that unilateralism can work, I can hardly see how.

replies(1): >>42943880 #
48. vkou ◴[] No.42941969{7}[source]
You are making an excellent argument for nuclear proliferation.
49. xdennis ◴[] No.42942004{3}[source]
AI weapons do have benign use cases: harming enemies.

When China attacks with AI weapons do you expect the free world to fight back armed with moral superiority? No. We need even more lethal AI weapons.

Mutual assured destruction has worked so far for nukes.

replies(1): >>42945114 #
50. greenavocado ◴[] No.42942013[source]
The same company that brings you cute cartoons for kids might also develop technologies with military applications, but that doesn't make them inherently "evil." It just makes them a microcosm of humanity's duality: the same species that created the Mona Lisa also invented napalm.

Should companies with consumer brands never make weapons? Sure, and while we're at it, let's ban knives because they can be used for both chopping vegetables and stabbing people. The issue isn't the technology itself. It's how it's regulated, controlled, and used. And as for cyber terrorism? That's a problem with bad actors, not with the tools themselves.

So, by all means, keep pointing out the hypocrisy of a company that makes both YouTube Kids and killer AI. Just don't pretend you're not benefiting from the same duality every time you use a smartphone or the internet, which, don't forget, is a technology born, ironically, of military research.

51. gosub100 ◴[] No.42942016{4}[source]
Nukes saved countless US lives from being lost to a regime that brought us into the war. And it's incalculable how many wars they have prevented.
52. ◴[] No.42942035{3}[source]
53. gosub100 ◴[] No.42942051{3}[source]
they have no problems heavily censoring law-abiding gun youtubers. Even changing the rules and giving them strikes retroactively. I guess it's "weapons for me, but not for thee".
54. jcgrillo ◴[] No.42942068[source]
It sounds like they're distracted, tbh. It's hard to imagine how a company that specializes in getting children addicted to unboxing videos can possibly be good at killing people.. oh, wait, maybe not after all..
55. astrange ◴[] No.42942082{4}[source]
I think the thing to remember is, however bad it is, it could always get worse.

A world without the US navy is one without sea shipping because pirates will come back.

56. tmnvdb ◴[] No.42942086{7}[source]
The risks need to be weighed against the downside of not deploying a capable system against your enemies.
57. _bin_ ◴[] No.42942113{5}[source]
those are mostly drawn on how difficult it is to manage their effects. chemical weapons are hard to target, nukes are too (unless one dials the yield down enough that there's little point) and make land unusable for years, and biological weapons can't really be contained to military targets.

we have, of course, developed all three. they have gone a long way towards keeping us safe over the past century.

58. tmnvdb ◴[] No.42942120{6}[source]
I'm not a fan of Trump, but I also feel he has not been so bad that surrendering the world order to Russia and China becomes a rational action that minimizes suffering. That seems to be an argument that is more about signalling that you really dislike Trump than about a rational consideration of all options available to us.
replies(2): >>42942192 #>>42945768 #
59. vkou ◴[] No.42942192{7}[source]
It's not a shallow, dismissable, just-your-opinion-maaan 'dislike' to observe that he is being an aggressor. Just like it's not a 'dislike' to observe that Putin is being one.

There are more options than arming an aggressor and capitulating to foreign powers. It's a false dichotomy to suggest it.

60. tmnvdb ◴[] No.42942299[source]
The US mainland is not currently under threat but the US world system is.
replies(1): >>42949199 #
61. int_19h ◴[] No.42942554{7}[source]
It's a bit too late for that, since Ukraine and Russia are both already using AI-controlled drones in combat.
62. int_19h ◴[] No.42942559[source]
That's the problem with all weapons.

The concern with AI weapons specifically is that if something goes wrong, they might not even be in the hands of the people at all, but pursue their own objective.

63. ignoramous ◴[] No.42943230{3}[source]
> You don't have to have "total belief in the purity of weapons"...

Of course. My point was, it is insane for those who do.

64. Sabinus ◴[] No.42943347{6}[source]
Refusal to make or use AI-enabled weapons is not "making war slightly more expensive for yourself"; it's like giving up on the Manhattan Project because the product is dangerous.

Feels good but will lead to disaster in the long run.

65. CamperBob2 ◴[] No.42943613{5}[source]
TBF, vkou's post disagrees with mine, but I don't disagree with it. If pressed to offer a forecast, I think the moral dilemmas we're about to face as Americans will be both disturbing and intimidating, with a 50% chance of horrifying.
66. leptons ◴[] No.42943880{4}[source]
>So you're in favor of losing a war and becoming a subject of the enemy?

I never said that. Please don't reply to comments you made up in your head.

Using AI doesn't automagically equate to winning a war. Using AI could mean the AI kills all your own soldiers by mistake. AI is stupid, it just is. It "hallucinates" and often leads to wrong outcomes. And it has never won a war, and there's no guarantee that it would help to win any war.

67. PessimalDecimal ◴[] No.42944147[source]
You're either misdirecting the discussion, or have missed the point. The statement isn't about weapons, but the means of _control_ of weapons.

It's legitimate to worry about scaled, automated control of weapons, since it could allow a very small number of people to harm a much larger number of people. That removes one of the best checks we have against the misuse of weaponry. If you have to muster a whole army to go kill a bunch of people, they can collectively revolt. (It's not always _easy_ but it's possible.)

Automated weapons are a lot like nuclear weapons in some ways. Once the hard parts are done (refining raw ore), the ability of a small number of people to harm a vast number of others is serious. People are right to worry about it.

68. osmsucks ◴[] No.42945114{4}[source]
Use of weapons is only benign to you if you're not on the receiving end. Imagine your family being blown up by a rocket because an AI system hallucinated that they're part of a dangerous terror cell.

My point though is that this is the only use case for such systems. The common comparisons to things like knives are invalid for this reason.

69. vasco ◴[] No.42945249{7}[source]
Not all of it is bad; it's preferable to have autonomous systems killing each other rather than killing humans. If it gets very prevalent, you could even get to a point where war is just simulated war games. Why have an AI-piloted F-35 fight an AI-piloted J-36? Just do it on the computer. That's at least one or two fewer pilots who die.
70. kombine ◴[] No.42945768{7}[source]
> I'm not a fan of Trump but I also feel he has not been so bad

He literally threatened a peaceful nation (and an ally) with invasion and annexation. How much worse can it get?

replies(1): >>42964650 #
71. aydyn ◴[] No.42945807{6}[source]
> And yet we've somehow managed to avoid getting into nuclear wars.

Yes, through a massive programme of nuclear armament. In the case of AI, we should therefore...?

72. ◴[] No.42946405{6}[source]
73. kelsey98765431 ◴[] No.42947911{4}[source]
If we hadn't developed nuclear weapons we would still be burning coal and probably even closer to death from global warming. The answer here is that government contractors should be developing these various types of weapons, as they already do; people just do not think of Google as a government contractor for some reason.
74. bbqfog ◴[] No.42949199{3}[source]
That's absolutely no reason to attack anyone.
75. switchbak ◴[] No.42955052{7}[source]
Truly, the last few minutes of American history are not material to the argument I'm making here.

And I don’t doubt there’s still a lot of subterfuge happening as we speak, most of which we’ll never hear about until something goes very wrong.

76. tmnvdb ◴[] No.42964650{8}[source]
If he actually did it that would be far worse.