
Accountability sinks

(aworkinglibrary.com)
493 points by l0b0 | source
alilleybrinker ◴[] No.41892299[source]
Cathy O'Neil's "Weapons of Math Destruction" (2016, Penguin Random House) is a good companion to this concept, covering the "accountability sink" from the other side: that of the people constructing or overseeing such systems.

Cathy argues that the use of algorithms in some contexts permits a new scale of harmful and unaccountable systems that ought to be reined in.

https://www.penguinrandomhouse.com/books/241363/weapons-of-m...

replies(4): >>41892714 #>>41892736 #>>41892843 #>>41900231 #
spencerchubb ◴[] No.41892736[source]
It's much easier to hold an algorithm accountable than an organization of humans. You can reprogram an algorithm. But good luck influencing an organization to change.
replies(2): >>41892817 #>>41893364 #
conradolandia ◴[] No.41892817[source]
That is not accountability. Can the algorithm be sent to jail if it commits crimes?
replies(3): >>41892911 #>>41892927 #>>41893184 #
Timwi ◴[] No.41892911[source]
Yes. Not literally, of course, but it can be deleted or decommissioned, which is even more effective than temporary imprisonment (it's equivalent to the death penalty, but obviously without the moral component).
replies(1): >>41892953 #
hammock ◴[] No.41892953[source]
Why should it be obvious that the moral component is absent? Removing an algorithm is like reducing the set of choices available to society… roughly equivalent to a law or regulation, or worse, a destructive act of coercion. There are moral implications of laws even though laws are not human.