
410 points by jjulius | 2 comments
AlchemistCamp (No.41889077):
The interesting question is how good self-driving has to be before people tolerate it.

It's clear that a casualty rate per distance traveled that's half the median human driver's isn't acceptable. How about a quarter? Or a tenth? Accidents caused by human drivers are among the largest causes of injury and death, yet they aren't newsworthy the way an accident involving automated driving is. It's all too easy to see a potential future where many people die needlessly because technology that could save lives is regulated into a greatly reduced role.
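
To put rough numbers on the stakes, here's a back-of-the-envelope Python sketch. Both inputs are my assumptions, not from the thread: a baseline of roughly 40,000 annual US road deaths, and (unrealistically) automated driving fully replacing human driving at the stated casualty-rate ratio.

    # Back-of-the-envelope sketch. Assumptions (mine, not the thread's):
    # ~40,000 annual US road deaths; automated driving fully replaces
    # human driving at the stated casualty-rate ratio.
    BASELINE_DEATHS = 40_000

    for factor, label in [(2, "half"), (4, "a quarter"), (10, "a tenth")]:
        deaths = BASELINE_DEATHS / factor
        avoided = BASELINE_DEATHS - deaths
        print(f"At {label} the human casualty rate: "
              f"~{deaths:,.0f} deaths/year (~{avoided:,.0f} avoided)")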

1. triyambakam (No.41889120):
Hesitation around self-driving technology is not just about the raw accident rate but also about the nature of the accidents. Self-driving failures often involve highly visible mistakes that seem trivially avoidable by a human (e.g., failing to stop for an obvious obstacle). People find such incidents harder to tolerate because they feel fundamentally different from human error.
2. crazygringo (No.41889173):
Exactly -- it's not just the overall accident rate, but the rate per accident type.

Imagine if self-driving is 10x safer on freeways, but 3x more likely to run over your dog in the driveway.

Or it's 5x safer on city streets overall, but actually 2x worse in rain and ice.

We're fundamentally wired for loss aversion, so I'd say it's less about the total improvement rate and more about whether there are identifiable categories of scenarios where it's still worse than a human.
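
To make the category point concrete, here's a minimal Python sketch. Every rate, mileage share, and multiplier below is hypothetical, chosen only to mirror the ratios in this comment:

    # Minimal sketch; all numbers are hypothetical illustrations.
    human_rate = {"freeway": 0.010, "city": 0.020, "rain_ice": 0.040}  # casualties per million miles
    exposure = {"freeway": 0.60, "city": 0.30, "rain_ice": 0.10}       # share of miles driven

    # 10x safer on freeways, 5x safer on city streets, 2x worse in rain/ice.
    av_multiplier = {"freeway": 0.1, "city": 0.2, "rain_ice": 2.0}
    av_rate = {k: human_rate[k] * av_multiplier[k] for k in human_rate}

    def overall(rates):
        """Exposure-weighted casualty rate across all scenarios."""
        return sum(rates[k] * exposure[k] for k in rates)

    print(f"human overall: {overall(human_rate):.4f}")
    print(f"automated overall: {overall(av_rate):.4f}")
    for k in human_rate:
        verdict = "worse" if av_rate[k] > human_rate[k] else "better"
        print(f"  {k}: automated is {verdict} than human")

With these made-up inputs the automated driver comes out about 39% better overall yet strictly worse in the rain/ice bucket, and that one regressed category is exactly what people will fixate on.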