
Waymos crash less than human drivers

(www.understandingai.org)
345 points by rbanffy | 4 comments
mjburgess No.43487426
Waymos choose the routes, right?

The issues with self-driving are (1) how it generalises across novel environments without "highly-available route data" and provider-chosen routes, and (2) how failures are correlated across machines.

In safe human driving, failures are uncorrelated and safety procedures generalise. We do not yet know whether very wide use of self-driving will lead to conditions in which a handful of correlated incidents kill more people than were ever hypothetically saved.
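To make the correlated-failure worry concrete, here is a toy simulation (all numbers invented) contrasting two regimes with the same long-run average incident count: one where failures arrive independently, and one where a rare fleet-wide bad update makes them arrive in a burst.

    import numpy as np

    rng = np.random.default_rng(0)
    n_periods = 100_000  # simulated reporting periods, purely illustrative

    # Regime A: failures uncorrelated across vehicles -> Poisson around a mean of 70.
    independent = rng.poisson(70, n_periods)

    # Regime B: same mean of 70, but 1% of periods contain a bad fleet-wide update
    # that produces a burst of incidents; the other 99% are slightly safer.
    bad_update = rng.random(n_periods) < 0.01
    correlated = np.where(bad_update,
                          rng.poisson(2050, n_periods),
                          rng.poisson(50, n_periods))

    for name, x in (("independent", independent), ("correlated", correlated)):
        print(f"{name:12s} mean={x.mean():6.1f}  99.9th pct={np.percentile(x, 99.9):7.1f}")

Both regimes look identical in a per-mile average comparison; the difference only shows up in the tail, which is exactly what a headline rate doesn't report.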

Here, without any confidence intervals, we're told ~70 airbag-deployment incidents were avoided over 20 million miles. A single bad update to the fleet would easily eclipse that impact.
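For a sense of how wide that uncertainty is, here is a back-of-envelope exact Poisson interval. The 90/20 split of the counts is hypothetical (only the ~70 difference comes from the claim above), and the human benchmark is treated as exact:

    from scipy import stats

    expected_human = 90   # hypothetical benchmark crashes over ~20M miles
    observed_waymo = 20   # hypothetical observed crashes over the same miles

    # Exact (Garwood) 95% Poisson interval for the observed count.
    lo = stats.chi2.ppf(0.025, 2 * observed_waymo) / 2
    hi = stats.chi2.ppf(0.975, 2 * (observed_waymo + 1)) / 2

    print(f"observed {observed_waymo}, 95% CI ({lo:.1f}, {hi:.1f})")
    print(f"implied crashes avoided: {expected_human - hi:.1f} to {expected_human - lo:.1f}")

Even with these invented counts, the implied "avoided" figure already spans roughly 59 to 78, and that is before putting any uncertainty on the human benchmark itself.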

replies(13): >>43487464 #>>43487477 #>>43487508 #>>43487579 #>>43487600 #>>43487603 #>>43487655 #>>43487741 #>>43487758 #>>43487777 #>>43489023 #>>43491131 #>>43491352 #
1. shadowgovt No.43489023
I usually think about it in the other direction: every time an accident occurs, a human learns something novel (even if it be a newfound appreciation of their own mortality) that can't be directly transmitted to other humans. Our ability to take collective driving wisdom and dump it into the mind of every learner's-permit-holder is woefully inadequate.

In contrast, every time a flaw is discovered in a self-driving algorithm, the whole fleet of vehicles is one over-the-air update away from getting safer.

replies(1): >>43490588 #
2. codr7 No.43490588
And it goes the other way too: one crappy update means complete chaos.
replies(1): >>43494377 #
3. shadowgovt No.43494377
There are already industry best practices, inside and outside of self-driving cars, for avoiding exactly that.
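In practice that usually means a staged (canary) rollout gated on fleet health, with automatic rollback. A minimal sketch, with every name, stage size, and health check invented for illustration rather than taken from any real deployment system:

    from dataclasses import dataclass, field

    @dataclass
    class Vehicle:
        sw_version: str = "v1"
        history: list = field(default_factory=list)

        def install(self, version: str):
            self.history.append(self.sw_version)
            self.sw_version = version

        def rollback(self):
            self.sw_version = self.history.pop()

    def staged_rollout(fleet, version, healthy, stages=(0.01, 0.05, 0.25, 1.0)):
        """Push `version` to growing fractions of the fleet; revert every
        updated vehicle if the health check fails at any stage."""
        updated = []
        for fraction in stages:
            target = fleet[: max(1, int(len(fleet) * fraction))]
            for v in target:
                if v.sw_version != version:
                    v.install(version)
                    updated.append(v)
            if not healthy(target):      # e.g. incident rate vs. pre-update baseline
                for v in updated:
                    v.rollback()
                return False             # a bad update never reaches the whole fleet
        return True

    # Usage sketch: a 1,000-vehicle fleet with a stubbed-out health check.
    fleet = [Vehicle() for _ in range(1000)]
    ok = staged_rollout(fleet, "v2", healthy=lambda vs: True)

It doesn't make mistakes impossible; it bounds how much of the fleet a mistake can reach before someone notices, and the gate is only as good as the health signal it watches.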
replies(1): >>43495922 #
4. codr7 No.43495922{3}
Hey, I've written software for a living for 26 years.

No best practice in the world is going to stop people from making mistakes.