A bit OT, but what a gorgeous whale of a sentence! As always, the literary prowess of NTSB writers does not disappoint.
In the context of a summary, I just expect the core sentence to take events in order from the headline failure ("in-flight exit door plug separation") and then work back to the root cause.
Yes - zooming out is important, and it's ultimately where actionable remediation can be applied - but blame is due where blame is due: somebody fucked up at work and it almost brought down a plane.
That's why these reports tend to suggest corrective actions for the parts of the system that didn't work properly. Even in a perfectly functioning safety culture, an employee can make a mistake and forget to install the bolts. A functioning safety system has safeguards in place to ensure that mistake is found and corrected.
In aviation and other safety-critical fields, we use a just culture approach — not to avoid accountability, but to ensure that learning and prevention come first.
And a relatively straightforward corollary of that reality is that, when somebody fucks up, putting too much personal blame on them is pointless. If it weren't them, it would have been somebody else.
In other words, this "blame is due where blame is due" framing is mostly useful as a cop-out that helps incompetent managers who've been skimping on quality controls and failsafes shift the blame away from where it really belongs.
Doesn't this mean it should happen a lot more?
The system allowed the human to take the incorrect action. If your intern destroys your prod database, it's because you failed to restrict access to the prod database. The remediation for "my intern is capable of destroying my prod database" is not "fire the intern"; it's "restrict access to the prod db".
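For the sake of concreteness, here's a minimal sketch of that remediation, assuming a Postgres database and the psycopg2 driver (the "prod", "public", and "intern_ro" names are hypothetical): give the intern a read-only role, so the worst-case error is a bad query rather than a dropped table.

    # Sketch: enforce least privilege so the intern cannot destroy prod.
    # Assumes Postgres + psycopg2; names are hypothetical for illustration.
    import psycopg2

    conn = psycopg2.connect("dbname=prod user=admin")
    conn.autocommit = True  # apply role and grant changes immediately
    with conn.cursor() as cur:
        # A login role with no ability to write, drop, or truncate anything.
        cur.execute("CREATE ROLE intern_ro LOGIN PASSWORD 'change-me'")
        cur.execute("GRANT CONNECT ON DATABASE prod TO intern_ro")
        cur.execute("GRANT USAGE ON SCHEMA public TO intern_ro")
        cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA public TO intern_ro")
        # Keep future tables read-only for the role as well.
        cur.execute(
            "ALTER DEFAULT PRIVILEGES IN SCHEMA public "
            "GRANT SELECT ON TABLES TO intern_ro"
        )
    conn.close()

The specific SQL isn't the point; the point is that the safeguard lives in the system, not in the human's vigilance.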
Even the best-trained humans will make errors, and they will make them stochastically. Your systemic safety checks will guard against those errors becoming problems. If your safety culture requires all humans to be flawless 100% of the time, your safety culture sucks.
So no, this isn't a fault with a human. Because this was a possible error, it was inevitable that at some point a human would make it; humans never operate error-free for extended periods of time.
In particular, the original formulation of Murphy's Law. The folk version has morphed into "anything that can go wrong, will go wrong". But the original was "If there are two or more ways to do something and one of those results in a catastrophe, then someone will do it that way".
The problem with a culture that prioritizes "blame is due where blame is due" is that it can cause people not to report near-misses and other gaps, and even to cover up actual mistakes. The shift in the U.S. from blaming (and penalizing) occasional pilot lapses to a more 'blameless' default mode was controversial, but it has now clearly demonstrated that it nets better overall safety.
I don’t see it that way. It’s designed for consumption by educated readers. A press release can dumb it down to middle-school reading level so the media can dumb it down to grade-school level for the masses.
vs.
Figure out who to sue.