Waymos crash less than human drivers

(www.understandingai.org)
345 points | by rbanffy | 7 comments
myflash13 ◴[] No.43492214[source]
I have always distrusted Waymo's and Tesla's claims of being safer. There are so many ways to fudge the numbers.

1. If the self-driving software detects an anomaly and chooses to disengage seconds before a crash, so the car is technically not in self-driving mode at impact, is that the fault of the software or of the human backup driver? This is a problem especially with Tesla, which will disengage and let the human take over.

2. When Waymo claims to have driven X million "rider only" miles, is that because the majority of those miles are on highways that are easy to drive with cruise control? If only 1 mile of a trip covers the end-to-end "hard parts" that require a human, like getting in and out of tight city streets and parking lots, while 10 miles are on the highway, it is easy to rack up "rider only" miles. But those trips are not representative of true self-driving trips (see the rough sketch at the end of this comment).

3. Selection bias. Waymo only operates in 3-4 cities, and only in chosen weather conditions. It's easy to rack up impressive safety stats when you avoid places with harsh weather, poor signage, or complicated street patterns. But that's not representative of the real-world driving conditions most people encounter daily.

The NTSB should force them to release all of the raw data so we can do our own analysis. I would compare only full self-driving trips, end to end, on days with good weather, in the 3-4 cities where Waymo operates, and then see how much better they fare.
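To make the mileage-dilution point in (2) concrete, here's a minimal back-of-the-envelope sketch in Python. The per-mile incident rates and the trip mix are invented for illustration; the real rates are exactly what the raw data would tell us:

    # Invented per-mile incident rates, purely for illustration.
    highway_rate = 0.1e-6  # incidents per easy highway mile (assumed)
    city_rate = 2.0e-6     # incidents per hard city mile (assumed)

    # The trip mix described above: 10 highway miles, 1 hard city mile.
    highway_miles, city_miles = 10, 1
    total_miles = highway_miles + city_miles

    blended = (highway_rate * highway_miles + city_rate * city_miles) / total_miles
    print(f"blended rate:   {blended:.2e} incidents/mile")    # ~2.73e-07
    print(f"hard-mile rate: {city_rate:.2e} incidents/mile")  # 2.00e-06

With these made-up numbers, the blended "rider only" rate looks roughly 7x better than the rate on the hard miles alone, which is exactly the kind of fudge I'm worried about.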

replies(2): >>43492365 #>>43492964 #
decimalenough ◴[] No.43492365[source]
Don't conflate Waymo and Tesla. Tesla FSD is by and large garbage, while Waymo is the real thing. Specifically:

1. Waymo is autonomous 100% of the time. It is not possible for a human to actually drive the car: even when remote support is dialed in, all they can do is pick from various routes suggested by the car.

2. No, I'd guesstimate 90%+ of Waymo's mileage is city driving. Waymo in SF operates exclusively on city streets; it doesn't use the highways at all. In Phoenix, they do operate on freeways, but that only started in 2024.

3. Phoenix is driving in easy mode, but San Francisco is emphatically not. Weatherwise there are worse places, but SF drivers need to contend with fog and rain, hilly streets, street parking, a messy grid with diagonal and one-way streets, lots of mentally ill and/or drugged up people doing completely unpredictable shit in the streets, etc.

replies(1): >>43492433 #
1. whamlastxmas ◴[] No.43492433[source]
Humans remotely operate Waymos all the time. And humans routinely have to physically drive to rescue Waymos that get stuck somewhere and start blocking traffic, and famously had like 12 of them blocking a single intersection for hours.

If you think FSD is garbage then you've clearly never used it recently. It routinely drives me absolutely everywhere, including parking, without me touching the wheel once. Tesla's approach to self-driving is significantly more scalable and practical than Waymo's, and the endlessly repeated, misleading, tired arguments saying otherwise really confuse me, since they're simply not founded in reality.

replies(2): >>43492546 #>>43499548 #
2. decimalenough ◴[] No.43492546[source]
Waymo does not have remote operation capability. Here's a blog post from them explaining how "fleet response" works:

https://waymo.com/blog/2024/05/fleet-response/

It's possible to put the car in manual mode, but that requires a human behind the wheel.

I have a Tesla myself, and while it's a great car, it's a long, long way from actual autonomous driving and their own stats bear this out: it can manage 12-13 miles without driver interruption, while Waymo is clocking ~17,000. Hell, where I live, Autopilot can barely stay in lane.

replies(2): >>43495278 #>>43495389 #
3. 93po ◴[] No.43495278[source]
https://teslafsdtracker.com/

Community tracking shows 2600+ miles between critical disengagements in California, where the mapping is probably the best (if we're going to make a fair comparison to Waymo). The most recent firmware shows 98% of trips have no disengagement at all in California, too. If you made the operating zones extremely tight like Waymo's, I'm sure it'd do even better.
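For scale, here's a quick conversion of the miles-between-disengagement figures floating around this thread into per-10k-mile rates; all three inputs are claims from the thread, not verified data:

    # Claimed miles per critical disengagement, as stated in this thread
    # (none of these are verified figures).
    claims = {
        "Tesla, per the parent comment": 12.5,  # "12-13 miles"
        "Tesla FSD tracker, California": 2600,
        "Waymo, with safety driver": 17000,
    }
    for name, miles in claims.items():
        print(f"{name}: {10000 / miles:.2f} disengagements per 10k miles")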

Your link states this:

> In the most ambiguous situations ...[it] requests [to humans] to optimize the driving path. [Humans] can influence the Waymo's path, whether indirectly through indicating lane closures, explicitly requesting the AV use a particular lane, or, in the most complex scenarios, explicitly proposing a path for the vehicle to consider.

It's literally a human drawing a line on the map telling the car where to go, in the most manual of ways. It's not an xbox steering wheel and driving remotely, but it's absolutely the same concept with a different interface, including a remote brake button.

replies(1): >>43497315 #
4. 93po ◴[] No.43495389[source]
Will also add - the 17k miles is a self-reported number with no way to verify it. It's also alleged miles between "critical interventions", and it doesn't account for remote operators intervening all the time.

Additionally, Waymo's most recent quarterly report for California lists over 1300 incidents that had mandatory reporting by law. This includes 47 collisions: 40 with another vehicle, 1 with a pedestrian, and 2 with bicycles:

https://www.cpuc.ca.gov/regulatory-services/licensing/transp...

That's for a single quarter, in very small sections of the state. If you made them operate globally like Tesla, there'd be thousands of these.

replies(1): >>43497251 #
5. ra7 ◴[] No.43497251{3}[source]
Do you have a source for remote operators intervening all the time? The 17k miles number is also from their testing with a safety driver, which happens in inherently tougher environments than the ones where they operate without a driver.

If you're including disengagements in the 1300 "incidents", then that's highly misleading. As you said, it's only 47 collisions over millions of miles, a figure that also includes collisions in manual mode during testing. If you look at the collision reports [1], most of them are Waymos getting rear-ended while stationary. Remember, they have to report every contact event, including minor contacts like debris hitting their cars [2].
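To put 47 collisions per quarter in per-mile terms, here's a rough sketch; the quarterly mileage is an assumption I'm plugging in because it isn't in the excerpt above, so treat the output as illustrative only:

    # 47 collisions comes from the CPUC report cited upthread; the
    # quarterly mileage is an ASSUMED placeholder, not a reported figure.
    collisions = 47
    assumed_quarter_miles = 5_000_000  # hypothetical
    print(f"{collisions / (assumed_quarter_miles / 1_000_000):.1f} "
          "collisions per million miles (under these assumptions)")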

Tesla likely has orders of magnitude more incidents. The thing with them is that they don't report any of these numbers. Tesla's (highly misleading) safety report doesn't even count crashes that don't deploy airbags.

[1] https://www.dmv.ca.gov/portal/vehicle-industry-services/auto...

[2] https://www.dmv.ca.gov/portal/file/waymo_021523-pdf/

6. ra7 ◴[] No.43497315{3}[source]
> Community tracking shows 2600+ miles between critical disengagements in California, where the mapping is probably the best (if we're going to make a fair comparison to Waymo).

This isn't a fair comparison either. FSD is used a lot on highways, where crash rates are lower (and hence disengagement rates will be too). Waymo doesn't go on highways yet, and in testing (with a safety driver) it can already go 17k miles without intervention in places harder than SF and LA, where it's already driverless.

> It's not an xbox steering wheel and driving remotely, but it's absolutely the same concept with a different interface, including a remote brake button.

How do you know they have a "remote brake button"? Waymo's blog makes no mention of any such thing. They categorically say remote operators have no control over the vehicle.

I think you're deliberately trying to mislead with your comments here by slipping in something false with known facts.

7. kmacleod ◴[] No.43499548[source]
How do you get yours to drive into a parking lot and park correctly? My HW4/v13.2.8 gets indecisive and ignores lines when getting to a destination. I always have to disengage before I can use parking mode.