
Accountability sinks

(aworkinglibrary.com)
493 points | l0b0 | 1 comment
xg15 | No.41893729
My suspicion is that one of the major appeals of automation, and especially "app-ification", for management and C-suite types is specifically its ability to break accountability links.

A lot of corporations now seem to have a structure where the org chart contains the following pattern:

- a "management layer" (or several of them), consisting of product managers, software developers, ops people, etc. The main task of this group is to maintain the "software layer", i.e. the company's in-house IT infrastructure, and implement new features for it.

Working here feels very much like working in a tech company.

- a "software layer": This part is fully automated and consists of a massive software and hardware infrastructure that runs the day-to-day business of the company. The software layer has "interfaces" in the shape of specialized apps or devices that monitor and control the people in the "workers' layer".

- a "workers' layer": This group is fully human again. It consists of low-paid, high-turnover staff who perform most of the actual physical work that the business requires (and that can't be automated away yet) - think Uber drivers, delivery drivers, Amazon warehouse workers, etc.

They have no contact at all with the management layer and little contact, if any, with human higher-ups. They get almost all their instructions through the apps and other interfaces of the software layer. Companies frequently dispute that those people technically belong to the company at all.

Whether or not those people are classified as employees, the important point (from the management's POV) is that the software layer serves as a sort of "accountability firewall" between the other two layers.

Management sets only the high-level goals for how the software should perform; the actual day-to-day interaction with the workers is handled exclusively by the software itself.

The result is that any complaints from the workers' layer cannot go up past the software - and any exploitative behavior towards the workers can be chalked up as an unfortunate software error.

tomaskafka | No.41893751
That’s what @vgr observed some time ago - people split into those "above the AI" and those "below the AI", and the AI slowly moves up the stack.