> In the annotated video played for the jury, the vehicle detects a vehicle about 170 feet away. A subsequent frame shows it detecting a pedestrian about 116 feet away. As McGee hurtles closer and closer, the video shows the Tesla planning a path through Angulo’s truck, right where he and his girlfriend were standing behind signs and reflectors highlighting the end of the road.

So the Tesla detected the vehicle and the pedestrian, and then planned a path through them? Wow! How bad is this software?
This bad:
https://vimeo.com/1093113127/e1fb6c359c

Not just detecting a pedestrian and planning a path through them. Hitting a pedestrian and then planning a path through them to finish the job.
Woah! And just as bad, the Tesla didn't even detect that it had run a kid over. So you're also guilty of a hit-and-run. Hitting a kid running out from behind a car is something you could argue a human might have done as well, depending on the circumstances. But a human would not have continued driving on as if nothing had happened (well, not unless they're a monster).
Not just that: a human would have stopped for the school bus, as required by law. So the errors are (1) not stopping for the school bus, (2) running over the kid, and (3) not detecting that it had run over the kid.
The problem is that this software has to be incredibly fault-tolerant, and with so much complexity that is extremely difficult. A 99.9% accuracy rate isn't good enough, because that could mean killing a child for every 1,000 school buses passed. It's why we still have pilots in planes even though computers do a lot more of the work than before.
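To make the "99.9% isn't good enough" point concrete, here is a quick back-of-the-envelope sketch. All the numbers (encounters per car per year, fleet size) are illustrative assumptions, not real Tesla or fleet data:

```python
# Why 99.9% per-event reliability fails at fleet scale.
# All numbers below are illustrative assumptions.
p_fail = 0.001               # 99.9% accuracy -> 1 failure per 1,000 bus passes
passes_per_car_year = 100    # assumed school-bus encounters per car per year
fleet_size = 1_000_000       # assumed number of cars on this software

# Expected failures per year across the whole fleet
expected_failures = p_fail * passes_per_car_year * fleet_size
print(expected_failures)  # 100000.0

# Probability that any single car has at least one failure in a year
p_at_least_one = 1 - (1 - p_fail) ** passes_per_car_year
print(round(p_at_least_one, 3))  # 0.095 -- nearly 1 in 10 cars per year
```

Even a "three nines" system produces an enormous absolute number of failures once you multiply by how often the event occurs, which is exactly why aviation-grade software targets far more nines and still keeps a human in the loop.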
I have taken lots of rides in Waymos, and they have always been smooth. So it is possible. The problem with Tesla is that its CEO lays down dogmatic rules, such as: humans only need eyes to drive, so his cars will have only cameras, no LiDAR. He needs to accept that robot cars cannot drive the way humans do. Robot cars can drive better than humans, but they have to do it in their own way.