
281 points nharada | 1 comment
greesil No.45902531
I have seen a Waymo do a very stupid thing: it darted across a busy street, leaving very little margin for error with the oncoming traffic, which happened to be a loaded dump truck that could not have stopped. The dump truck driver was clearly surprised. It was a move I never would have made as a driver. Did they dial the aggression up? I'm sure Waymos are safer than humans in aggregate, since there are some dumb humans out there, but they're not infallible.
replies(4): >>45902583 #>>45902670 #>>45903548 #>>45906523 #
1. toast0 No.45903548
That reminds me of the Feb 14, 2016 collision in Mountain View [1] (sorry for the PDF, but it had the best images of the articles I saw) between a Google self-driving car and a VTA articulated bus. TL;DR: the software and the safety driver both assumed the bus would move out of the way because it was a big vehicle driven by a professional driver. From the report:

> Google said it has tweaked its software to "more deeply understand that buses and other large vehicles are less likely to yield to us than other types of vehicles."

Maybe that got lost.

[1] https://phys.org/news/2016-03-apnewsbreak-video-google-self-...