
367 points DustinEchoes | 3 comments
ugh123 ◴[] No.45909860[source]
>my dad is dead, because his family members were too naive to know that the thing they were instructed to do by the state was a false thing.

We're told a lot of things by "officials" not because they're correct, but because they carry the least legal liability for the official parties involved, especially in anything involving healthcare. These officials also sometimes include doctors, who work to protect themselves and the system first, and then patients.

replies(5): >>45909909 #>>45909954 #>>45909988 #>>45910116 #>>45910233 #
1. godelski ◴[] No.45910233[source]

  > We're told a lot of things by "officials" not because they're correct
Often these rules are in place because they are statistically correct.

What needs to be understood is that no rule can be so well written that there are no exceptions. Rules are guides. With that in mind, we can see why certain guidelines are created: they are likely the right response 9/10 times. This is especially important in high-stress, low-information settings.

BUT being statistically correct does not mean being correct. For example, if the operator had information about the ETA of the ambulance (we don't know this!), then the correct answer would have been to tell them not to wait. But if the operator had no information, then the correct decision was to tell them to wait.
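To make that conditional concrete, here's a toy sketch of the decision logic in Python. It's purely illustrative and my own addition: the function name, the inputs, and the numbers are assumptions for the example, not anything a real dispatch protocol specifies.

  from typing import Optional

  # Toy illustration of "statistically correct" vs. correct for this call.
  # All names and numbers here are made up for the example.
  def advise(ambulance_eta_min: Optional[float], drive_time_min: float) -> str:
      if ambulance_eta_min is None:
          # No local information: fall back to the rule that is right 9/10 times.
          return "wait for the ambulance"
      # Local information available: use it instead of the aggregate default.
      if ambulance_eta_min > drive_time_min:
          return "don't wait; drive to the hospital"
      return "wait for the ambulance"

  print(advise(None, 15))    # wait for the ambulance (the statistical default)
  print(advise(45.0, 15))    # don't wait (the edge case the default gets wrong)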

The world is full of edge cases. This is a major contributor to Moravec's paradox, and to why bureaucracies often feel like they are doing idiotic things: you are likely working in a much more information-rich environment than the robot was designed for, or than the bureaucratic rules were written for.

The lesson is that our great advantage as humans is flexibility. Trust in people. Train them properly, but also empower them to make judgement calls. It won't work out every time, but doing this tends to beat the statistical base rate. The reason comes down to "boots on the ground" knowledge: you can't predict every situation, and there are too many edge cases. So trust the people you're already putting trust in, and recognize that in the real world there's more information available for formulating decisions. You can't rule from a spreadsheet any more than you can hike up a mountain with only a map. The map is important, but it isn't enough.

replies(1): >>45911191 #
2. kelnos ◴[] No.45911191[source]
This was exactly what I was thinking (though less eruditely) when I was reading the blog post. In this particular case, waiting for the ambulance led to a worse outcome, but I would not be surprised if, statistically, you're better off waiting for the ambulance than trying to get to the hospital by other means.

But unfortunately:

> if the operator had information about the ETA of the ambulance (we don't know this!), then the correct answer would have been to tell them not to wait. But if the operator had no information, then the correct decision was to tell them to wait.

I expect the operator just is not allowed to give advice like that, even if they did have information on the ambulance ETA. There could be liability if someone is advised to drive to the hospital and something bad happens, even if that bad thing would have happened regardless. I think that's a bad reason to do the situation-dependent incorrect thing, but that's unfortunately how the world works sometimes.

replies(1): >>45911340 #
3. godelski ◴[] No.45911340[source]

  > I expect the operator just is not allowed to give advice like that
Maybe, but that's why I tried to stress that last part about empowering the workers: empowering your "people on the ground" and recognizing that you can't rule from a spreadsheet.

I also want to say that I'm giving this advice as someone who loves math, data, and statistics; someone who's taken and studied much more math than the average STEM major. It baffles me how people claim to be data-oriented yet do not recognize how critical noise is. Noise is literally a measurement of uncertainty. We should strive to reduce noise, but eliminating it entirely is quite literally impossible. It must be accounted for rather than ignored.

So that's why I'm giving this advice: it's how you strategize based on the data. All data needs to be interpreted, scrutinized, and questioned, and constantly, because we're not in a static world. The only way to deal with that unavoidable noise is to have adaptable mechanisms that can handle the details and nuances that get fuzzy when you do large aggregations. In the real world, the tails of distributions are long and heavy.
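To illustrate what heavy tails do to aggregates, here's a toy simulation I'm adding (the distributions and parameters are arbitrary choices of mine): with a heavy-tailed distribution, a tiny fraction of cases carries a large share of the total, so a policy tuned to the "typical" case misses exactly the cases that matter most.

  import random

  random.seed(0)
  n = 100_000

  # Thin-tailed vs. heavy-tailed samples (parameters are arbitrary).
  light = [random.gauss(10, 2) for _ in range(n)]
  heavy = [random.paretovariate(1.5) * 10 for _ in range(n)]

  def tail_share(xs, q=0.99):
      # Fraction of the total contributed by the top (1 - q) share of samples.
      xs = sorted(xs)
      cut = int(len(xs) * q)
      return sum(xs[cut:]) / sum(xs)

  print(f"thin tails:  top 1% of samples = {tail_share(light):.1%} of the total")
  print(f"heavy tails: top 1% of samples = {tail_share(heavy):.1%} of the total")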

A rigid structure is brittle and weak. The strongest structures are flexible, even if they appear stiff for the most part. It doesn't matter if you're building a skyscraper, a bridge, a business, or an empire. This is a universal truth because we'll never be omniscient, and as long as we're not omniscient there will be noise, and you have to deal with it.