
617 points by jbegley | 7 comments
uejfiweun No.42940754
A day late and a dollar short. The future of the US tech industry belongs to those who weren't interested in performative woke nonsense like this during the last decade.
aprilthird2021 No.42941376
How is it woke nonsense to not want to create a weapon that probabilistically determines whether a civilian looks close enough to a bad guy to justify a missile strike?
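To make that base-rate objection concrete, here is a toy Python sketch of how a "probabilistic" target classifier behaves when actual combatants are rare; every number in it is a hypothetical placeholder, not a claim about any real system:

    # Back-of-envelope false-positive arithmetic for a probabilistic
    # target classifier. All numbers are hypothetical illustrations.
    population = 1_000_000      # people scanned by the system
    combatant_rate = 0.001      # 0.1% are actual combatants
    sensitivity = 0.90          # P(flagged | combatant)
    false_positive_rate = 0.01  # P(flagged | civilian)

    combatants = population * combatant_rate
    civilians = population - combatants

    true_positives = combatants * sensitivity          # 900 combatants flagged
    false_positives = civilians * false_positive_rate  # ~9,990 civilians flagged

    precision = true_positives / (true_positives + false_positives)
    print(f"P(actual combatant | flagged) = {precision:.1%}")  # ~8.3%

Even with 90% sensitivity and a 1% false-positive rate, most flagged people in this toy example are civilians, which is the core of the objection.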
1. tmnvdb No.42941760
This sentiment ignores the reality on the ground in favor of performative ideological purity. Civilians are already getting blown up all the time by systems that do not even attempt to make any distinction between civilians and soldiers: artillery shells, mortars, landmines, rockets, etc.
2. pixl97 No.42941816
And indiscriminate weapons have costs that can slow people from using them.

Imagine you have a weapon that can find and kill all the 'bad guys'. Would you not be in a morally compromised position if you didn't use it? You're letting innocents die every moment you don't.

* Warning: definitions of 'bad guys' may differ, leading to further conflict.

3. siltcakes No.42941874
The reality on the ground is that one of the very first uses of AI weapons was to target civilians in Gaza:

> Moreover, the Israeli army systematically attacked the targeted individuals while they were in their homes — usually at night while their whole families were present — rather than during the course of military activity.

https://www.972mag.com/lavender-ai-israeli-army-gaza/

4. tmnvdb No.42941886
The logic of this argument implies we should develop weapons with maximal collateral damage to deter their usage.
5. pixl97 No.42941929
Which raises the question of whether we can escape the Red Queen hypothesis.

Personally I don't think we can as a species.

6. tmnvdb No.42942015
Lavender is not an autonomous weapon. But if you want to seriously consider whether Lavender is a good thing (I am undecided), you need to compare the effect of this operation with the Lavender system against the effect of the same operation without it. Otherwise you run the risk of making arguments that ultimately boil down to 'weapons bad'.
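One way to structure that with/without comparison is as expected civilian harm under each policy. A minimal Python sketch, where every figure is a made-up placeholder and the point is the shape of the comparison, not the values:

    # Hypothetical expected-harm comparison between two targeting policies.
    # All inputs are invented placeholders for illustration only.
    def expected_civilian_harm(strikes, p_misidentify, civilians_per_error):
        """Expected civilian casualties from misidentified strikes."""
        return strikes * p_misidentify * civilians_per_error

    with_system = expected_civilian_harm(strikes=100, p_misidentify=0.10,
                                         civilians_per_error=4)
    without_system = expected_civilian_harm(strikes=100, p_misidentify=0.25,
                                            civilians_per_error=4)

    print(f"with system:    {with_system:.0f} expected civilian casualties")
    print(f"without system: {without_system:.0f} expected civilian casualties")

The argument only goes through if you can actually estimate both misidentification rates; with only one side measured, the comparison is undefined.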
7. aprilthird2021 No.42944581
> civilians are already getting blown up all the time by systems that do not even attempt to make any distinction between civilians and soldiers: artillery shells, mortars, landmines, rockets, etc.

Right, and every time that happens because of miscalculations by our government, it loses the very real and important public license to continue. Ultimately, modern wars led by democracies are won by the public's desire to continue them. The American public can turn against waging war very quickly if we unleash Minority Report on the world for revenge.