
Waymos crash less than human drivers

(www.understandingai.org)
345 points | rbanffy | 1 comment
mjburgess (No.43487426)
Waymos choose the routes, right?

The issue with self-driving is (1) how it generalises across novel environments without "highly-available route data" and provider-chosen routes; (2) how failures are correlated across machines.

In safe human driving, failures are uncorrelated and safety procedures generalise. We do not yet know whether using self-driving very widely will create conditions in which a few incidents kill more people than were ever hypothetically saved.
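The correlation worry can be sketched with a toy Monte Carlo. All numbers below (fleet size, failure probabilities, the fraction of the fleet a bad update hits) are invented for illustration; the point is only that correlated failures produce heavy-tailed yearly totals even when the baseline rate is the same.

```python
import random

# Toy simulation (all parameters invented) contrasting independent
# per-vehicle failures with failures correlated by a fleetwide update.
random.seed(0)

FLEET = 10_000        # vehicles in the fleet
P_INDEP = 1e-3        # per-vehicle, per-year independent failure probability
P_BAD_UPDATE = 0.01   # per-year chance a fleetwide update is dangerously broken
FRAC_HIT = 0.05       # fraction of the fleet a bad update affects at once

def bad_update_toll():
    # Failures from a single correlated event: many vehicles fail together.
    return int(FLEET * FRAC_HIT)

def year_independent():
    # Each vehicle fails on its own; yearly totals cluster near the mean.
    return sum(random.random() < P_INDEP for _ in range(FLEET))

def year_correlated():
    # Same expected baseline, plus a rare event hitting a big fleet slice.
    toll = year_independent()
    if random.random() < P_BAD_UPDATE:
        toll += bad_update_toll()
    return toll

indep = [year_independent() for _ in range(300)]
corr = [year_correlated() for _ in range(300)]
print("independent: mean %.1f, max %d" % (sum(indep) / len(indep), max(indep)))
print("correlated:  mean %.1f, max %d" % (sum(corr) / len(corr), max(corr)))
```

Most correlated years look identical to independent ones; the difference only shows up in the tail, which is exactly why per-mile averages can hide the risk.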

Here, without any confidence intervals, we're told ~70 airbag-deployment crashes were avoided over 20 million miles. A bad update to the fleet could easily eclipse that benefit.
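To see why the missing confidence interval matters, here is a rough sketch of one. The split of the "~70 avoided" into a human baseline and a Waymo count is invented (the comment gives only the difference), and the CI uses a simple normal approximation to the Poisson count, not the article's methodology.

```python
import math

# Hypothetical illustration only: suppose the "~70 avoided" came from an
# expected human baseline of ~90 airbag-deployment crashes per 20M miles
# versus ~20 observed for Waymo. These splits are assumptions.
HUMAN_EXPECTED = 90
WAYMO_OBSERVED = 20

def poisson_ci(k, z=1.96):
    """Approximate 95% CI for a Poisson count k (normal approximation)."""
    half = z * math.sqrt(k)
    return max(0.0, k - half), k + half

lo, hi = poisson_ci(WAYMO_OBSERVED)
print(f"Waymo count, 95% CI: {lo:.1f} .. {hi:.1f}")
# The implied "crashes avoided" then spans a wide range, not a point:
print(f"implied avoided: {HUMAN_EXPECTED - hi:.1f} .. {HUMAN_EXPECTED - lo:.1f}")
```

Even with these generous made-up numbers, the avoided-crash estimate spans tens of incidents, which is the scale a single bad fleet update could plausibly reach.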

seper8 (No.43487508)
Does waymo also choose the times of driving, and conditions? Or do they always drive, even at night and in heavy rain?
maxerickson (No.43487594)
Correctly estimating your own capability is itself safety-positive.

Meaning: humans who choose to drive in more difficult conditions probably sometimes drive in conditions they shouldn't.