
410 points by jjulius | 1 comment
AlchemistCamp
The interesting question is how good self-driving has to be before people tolerate it.

It's clear that half the casualty rate per distance traveled of the median human driver isn't acceptable. How about a quarter? Or a tenth? Accidents caused by human drivers are among the largest causes of injury and death, but they aren't newsworthy the way accidents involving automated driving are. It's all too easy to see a potential future where many people die needlessly because technology that could save lives is regulated into a greatly reduced role.
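To put numbers on those thresholds, here's a back-of-envelope sketch. The human baseline of ~1.3 fatalities per 100M vehicle-miles is a rough US ballpark, not a measured figure; the rest is arithmetic:

    # Back-of-envelope: what "half / a quarter / a tenth of the human
    # casualty rate" means per distance traveled.
    HUMAN_FATALITIES_PER_100M_MILES = 1.3  # rough US ballpark, assumed

    for frac in (0.5, 0.25, 0.1):
        rate = HUMAN_FATALITIES_PER_100M_MILES * frac
        print(f"AV at {frac:.0%} of the human rate: "
              f"{rate:.3f} fatalities per 100M miles")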

aithrowawaycomm
Many people don't (and shouldn't) take the "half the casualty rate" figure at face value. My biggest concern is that Waymo and Tesla are juking the stats to make self-driving cars seem safer than they really are. I believe this is largely an unintentional consequence of bad actuarial science resting on shaky qualitative judgments; the worst kind of lying with numbers is lying to yourself.

The biggest gap in these studies: I have yet to see a comparison with human drivers that filters out DUIs, reckless speeding, or mechanical failures (a sketch of that filtering follows this list). Without doing this it is simply not a fair comparison, because:

1) Self-driving cars won't end drunk driving unless manual driving is outlawed or ignition is tied to a breathalyzer. Many people will continue to make the dumb decision to drive themselves home because they are drunk and driving is fun. This needs regulation, not technology. And DUIs need to be filtered from the crash statistics when comparing with Waymo.

2) A self-driving car that speeds and runs red lights may well be more dangerous than a similarly reckless human, but the data says nothing about this since Waymo is currently on its best behavior. Yet Tesla's own behavior and its customers prove there is demand for reckless self-driving, and manufacturers will meet that demand unless the law steps in. Imagine a Waymo competitor that promises Uber-level ETAs for people in a hurry. Technology could solve this in theory, but in practice the market could make things worse for decades until the next research breakthrough. Human accidents caused by distraction are a fair comparison to Waymo, but speeding and aggression should be filtered out. The difficulty of doing so is one of the many reasons I'm skeptical of these stats.

3) Mechanical failures are a hornets' nest of ML edge cases that might work in the lab but fail miserably on the road. Currently it's not a big deal because the cars are shiny and new. Eventually we'll have self-driving clunkers owned by people who don't want to pay for maintenance.
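Here's a minimal sketch of the filtering described above, assuming a hypothetical crash table with a contributing-factor field (real sources like NHTSA FARS encode this differently):

    # Sketch: filter a human-crash table before comparing with AV stats.
    # The schema, rows, and mileage are all hypothetical.
    crashes = [
        {"factors": {"dui"},         "fatalities": 1},
        {"factors": {"speeding"},    "fatalities": 1},
        {"factors": {"distraction"}, "fatalities": 0},
        {"factors": {"mechanical"},  "fatalities": 1},
    ]
    # Causes a sober, law-abiding, well-maintained robotaxi shouldn't inherit.
    EXCLUDED = {"dui", "speeding", "mechanical"}

    comparable = [c for c in crashes if not (c["factors"] & EXCLUDED)]
    filtered_miles = 1e8  # hypothetical miles driven sober, at legal speeds
    rate = sum(c["fatalities"] for c in comparable) / filtered_miles * 1e8
    print(f"filtered human rate: {rate:.2f} fatalities per 100M miles")

Note the denominator problem: miles driven drunk or speeding have to come out of the exposure estimate too, and nobody logs those reliably. That's one more reason the published comparisons are shaky.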

And that's not even mentioning that Waymos are not fully self-driving; they rely on close remote oversight to guide the AI through the many billions of common-sense problems that computers will not be able to solve for at least the next decade, probably much longer. True self-driving cars will continue to make inexplicably stupid decisions: these machines are still much dumber than lizards. Stories like "the Tesla slammed into an overturned tractor trailer because the AI wasn't trained on overturned trucks" are a huge problem, and society will not let Tesla launder it away with statistics.

Self-driving cars might end up saving lives. But would they save more lives than adding mandatory breathalyzers and GPS-based speed limits? And if market competition overtakes business ethics, would they cost more lives than they save? The stats say very little about this.
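That comparison can at least be framed arithmetically. A sketch with loudly illustrative inputs (the shares are rough US figures; the effectiveness numbers are pure assumptions, not study results):

    # Which intervention saves more lives per year? All inputs illustrative.
    ANNUAL_FATALITIES = 40_000               # rough US order of magnitude
    DUI_SHARE, SPEEDING_SHARE = 0.30, 0.29   # rough shares of fatal crashes

    def lives_saved(share, effectiveness):
        return ANNUAL_FATALITIES * share * effectiveness

    interlocks = lives_saved(DUI_SHARE, 0.70)       # assume 70% effective
    governors  = lives_saved(SPEEDING_SHARE, 0.50)  # assume 50% effective
    print(f"mandatory interlocks: ~{interlocks:,.0f} lives/yr")
    print(f"GPS speed limiting:   ~{governors:,.0f} lives/yr")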

kelnos
> My biggest concern is that Waymo and Tesla are juking the stats to make self-driving cars seem safer than they really are

Even setting intentional juking aside, you can't really compare the two.

Waymo cars drive completely autonomously, without a supervising driver in the car. If it does something unsafe, there's no one there to correct it, and it may get into a crash, in the same way a human driver doing that same unsafe thing might.

With Tesla FSD, we have no idea how good it really is. We know that a human is supervising it, and despite all the reports of people doing super irresponsible things while "driving" a Tesla (like taking a nap), I imagine most Tesla FSD users are attentively supervising most of the time. If all FSD users stopped supervising and started taking naps, I suspect the crash and fatality rates would start to look like those of the worst drivers on the road, or even worse.
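One way to see why the published number can't be trusted: the observed crash rate under supervision is the system's true rate times the fraction of failures the human doesn't catch. A toy model with assumed parameters:

    # Toy model: a supervised system's observed crash rate. All numbers
    # are illustrative assumptions, not Tesla data.
    unsupervised_rate = 10.0  # hypothetical crashes per million miles
    for caught in (0.0, 0.90, 0.99):
        observed = unsupervised_rate * (1 - caught)
        print(f"supervisors catch {caught:.0%} -> observed "
              f"{observed:.2f} crashes per million miles")

The same observed rate is consistent with a great system and inattentive drivers, or a terrible system and very attentive ones.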

So it's not that they're juking their stats (although they may be); it's that they don't actually have all the stats that matter. Waymo has those stats because its trained test drivers reported whenever the car did something unsafe and they had to take over. Tesla FSD users don't report when they have to do that. The data is just not there.
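What's missing is essentially a safety-critical disengagement rate, the stat trained test drivers can produce and ordinary customers don't report. A sketch of the computation, assuming a made-up log format (not Waymo's actual reporting schema):

    # Miles per safety-critical takeover from hypothetical drive logs.
    logs = [
        {"miles": 120.0, "safety_takeovers": 0},
        {"miles":  85.5, "safety_takeovers": 1},
        {"miles": 200.0, "safety_takeovers": 0},
    ]
    total_miles = sum(d["miles"] for d in logs)
    takeovers = sum(d["safety_takeovers"] for d in logs)
    mpt = total_miles / takeovers if takeovers else float("inf")
    print(f"{mpt:.1f} miles per safety-critical takeover")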