Waymos crash less than human drivers

(www.understandingai.org)
345 points by rbanffy | 1 comment
labrador ◴[] No.43487628[source]
I was initially skeptical about self-driving cars but I've been won over by Waymo's careful and thoughtful approach using visual cues, lidar, safety drivers and geo-fencing. That said I will never trust my life to a Tesla robotaxi that uses visual cues only and will drive into a wall painted to look like the road ahead like Wile E. Coyote. Beep beep.

Man Tests If Tesla Autopilot Will Crash Into Wall Painted to Look Like Road https://futurism.com/tesla-wall-autopilot

1. labrador ◴[] No.43511057[source]
My conclusion: if Tesla drivers are comfortable with vision-only FSD, that's fine — it's their responsibility to supervise and intervene. But when Tesla wants to deploy a fully autonomous robotaxi with no human oversight, it should be held to higher safety requirements, including an independent, redundant sensing system such as LiDAR. Passengers shouldn't be responsible for supervising their own taxi ride.