So the Tesla detected the vehicle and the pedestrian, and then plans a path through them? Wow! How bad is this software?
You can take a Waymo any time of day in SF and they provide 1000s of successful rides daily
Not just detect a pedestrian and plan a path through them. Hit a pedestrian and plan a path through them to finish the job. [1]
[1] As popularized in the movie The Mitchells vs. the Machines: https://m.youtube.com/watch?v=LaK_8-3pWKk
I'm curious: why does it matter to you how many man-hours Waymo spends on a functional service? Would it be disqualifying if it's "too much" in your estimation?
I might be willing to take a robotaxi, because if something does happen it's not my fault. Same with a bus or train. But I won't trust FSD in my own car (LIDAR or no LIDAR), except in certain circumstances like highway driving, because if something did happen, I'd be at fault even if it was the FSD that failed.
You don't need a high level of accuracy for that - the rule is simply not to overtake a stopped school bus, so that accuracy would equate to one illegal overtake for every 1,000 stopped buses encountered. Also, not every illegal bus overtake necessarily puts a child in immediate danger.
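A quick back-of-the-envelope sketch of that arithmetic (the 99.9% accuracy figure and the encounter counts are my own assumptions for illustration, not numbers from the thread):

    # Expected illegal overtakes, treating each stopped-bus encounter
    # as an independent pass/fail event. Accuracy and encounter counts
    # below are assumed, not measured.
    def illegal_overtakes(accuracy: float, stopped_buses_encountered: int) -> float:
        return (1.0 - accuracy) * stopped_buses_encountered

    print(illegal_overtakes(0.999, 1_000))    # -> 1.0 expected illegal overtake
    print(illegal_overtakes(0.999, 100_000))  # -> 100.0 expected across a fleet

The point being that a per-encounter accuracy that sounds high still scales into a meaningful number of violations once you multiply by fleet-wide exposure.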
Unless you're driving a Mercedes with Drive Pilot [1], in which case Mercedes accepts liability [2]. Drive Pilot is not FSD yet, but presumably, as it acquires more capabilities, Mercedes will continue its policy of accepting liability.
[1] https://www.mbusa.com/en/owners/manuals/drive-pilot
[2] https://www.roadandtrack.com/news/a39481699/what-happens-if-...
You certainly need that accuracy level for critical events.