
617 points | jbegley | 1 comment
uejfiweun ◴[] No.42940754[source]
A day late and a dollar short. The future of the US tech industry belongs to those who weren't interested in performative woke nonsense like this during the last decade.
replies(4): >>42941043 #>>42941376 #>>42941529 #>>42941636 #
aprilthird2021 ◴[] No.42941376[source]
How is it woke nonsense to not want to create a weapon that probabilistically decides whether a civilian looks enough like a bad guy to missile-strike them?
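A toy sketch of the base-rate problem with any such probabilistic classifier (every number here is hypothetical, chosen purely to illustrate the math):

    # Hypothetical base-rate calculation for a probabilistic target classifier.
    population = 1_000_000      # people scanned by the system (made up)
    actual_targets = 1_000      # true combatants among them (made up)
    sensitivity = 0.90          # P(flagged | combatant) (made up)
    false_positive_rate = 0.01  # P(flagged | civilian) (made up)

    true_positives = actual_targets * sensitivity
    false_positives = (population - actual_targets) * false_positive_rate

    # Probability that a flagged person is actually a combatant:
    precision = true_positives / (true_positives + false_positives)
    print(f"civilians flagged: {false_positives:,.0f}")   # ~9,990
    print(f"P(combatant | flagged): {precision:.1%}")     # ~8.3%

Even with a 1% false-positive rate, flagged civilians outnumber flagged combatants more than ten to one, because civilians vastly outnumber combatants in the first place.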
replies(3): >>42941543 #>>42941760 #>>42942262 #
tmnvdb ◴[] No.42941760[source]
This sentiment ignores the reality on the ground in favor of performative ideological purity. Civilians are already being blown up all the time by systems that make no attempt to distinguish civilians from soldiers: artillery shells, mortars, landmines, rockets, etc.
replies(3): >>42941816 #>>42941874 #>>42944581 #
siltcakes ◴[] No.42941874[source]
The reality on the ground is that one of the very first uses of AI weapons was to target civilians in Gaza:

> Moreover, the Israeli army systematically attacked the targeted individuals while they were in their homes — usually at night while their whole families were present — rather than during the course of military activity.

https://www.972mag.com/lavender-ai-israeli-army-gaza/

replies(1): >>42942015 #
tmnvdb ◴[] No.42942015[source]
Lavender is not an autonomous weapon. But if you want to seriously consider whether Lavender is a good thing (I am undecided), you need to compare the effect of the operation with Lavender against the effect of the same operation conducted without it. Otherwise you run the risk of making arguments that in the end just boil down to 'weapons bad'.
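A minimal sketch of the comparison being proposed (every figure below is invented, just to show the shape of the argument):

    # Hypothetical counterfactual comparison; none of these figures are real.
    strikes = 100                        # size of the operation (made up)
    casualties_per_strike_with = 1.5     # civilian toll per strike, with the system (made up)
    casualties_per_strike_without = 2.0  # civilian toll per strike, without it (made up)

    harm_with = strikes * casualties_per_strike_with
    harm_without = strikes * casualties_per_strike_without
    print(f"expected civilian harm with the system:    {harm_with:.0f}")    # 150
    print(f"expected civilian harm without the system: {harm_without:.0f}") # 200

The comparison only favors the system if its per-strike toll is genuinely lower and it does not change how many strikes are carried out; both of those are empirical questions, not givens.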