
449 points by lemper | 1 comment
napolux No.45036831
The most deadly bug in history. If you know any other deadly bug, please share! I love these stories!
1. A1kmm No.45037179
Not even close. Israel apparently has AI target-intelligence and selection systems for bombing, called Gospel and Lavender - https://www.theguardian.com/world/2024/apr/03/israel-gaza-ai.... The claims are that these systems have a selectivity of 90% per strike, and that up to 20 civilians were accepted per person classified by the system as a Hamas member. So, assuming that is true, 90% of the time they kill one Hamas member and up to 20 innocents, and 10% of the time they kill up to 21 innocents and no Hamas members.

Killing 20 innocents and one Hamas member is not a bug - it is callous, but that is a policy decision and the software working as intended. But when it is a false positive (10% of the time), due to inadequate or outdated data and inadequate models, that could reasonably be classified as a bug - so all 21 deaths in each of those bombings would count as deaths caused by a bug. Apparently at least earlier versions of Gospel were trained on positive examples of what marks someone as a member of Hamas, but not on negative examples; other problems could stem from, for example, insufficient data, or interpolation outside the valid range (e.g. using pre-war data on how quickly cell phones are traded or how people move, when behaviour is different during the war).

I'd therefore estimate that deaths due to classification errors in those systems are likely in the thousands (out of the 60k+ Palestinian deaths in the conflict). For comparison, the Therac-25 bugs caused 6 deaths.
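
To make the arithmetic explicit, here is a minimal back-of-envelope sketch in Python using only the figures claimed in the comment above (90% selectivity, up to 20 civilians accepted per strike); the strike count at the end is a made-up illustrative number, not sourced data.

    # Back-of-envelope arithmetic using only the figures quoted above.
    # These are the comment's assumptions, not verified casualty data.
    selectivity = 0.90            # claimed fraction of correctly classified targets
    false_positive_rate = 1 - selectivity
    civilians_per_strike = 20     # claimed upper bound on civilians per targeted person

    # Expected innocents per strike:
    #   true positive (90%):  up to 20 civilians + 1 actual Hamas member
    #   false positive (10%): up to 21 innocents, no Hamas member
    expected_innocents = (selectivity * civilians_per_strike
                          + false_positive_rate * (civilians_per_strike + 1))
    print(expected_innocents)     # ~20.1 innocents per strike (upper bound)

    # Deaths attributable purely to classification error, for a hypothetical
    # number of strikes (the strike count is illustrative, not sourced):
    strikes = 3_000
    error_deaths = strikes * false_positive_rate * (civilians_per_strike + 1)
    print(error_deaths)           # ~6,300 - "in the thousands", as estimated above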