
Waymos crash less than human drivers

(www.understandingai.org)
345 points by rbanffy | 1 comment
myflash13 ◴[] No.43492214[source]
I have always distrusted Waymo's and Tesla's claims of being safer. There are so many ways to fudge the numbers.

1. If the self-driving software detects an anomaly and chooses to disengage, and the car then crashes 60 seconds later while technically not in self-driving mode, is that a fault of the software or of the human backup driver? This is a problem especially with Tesla, which will disengage and let the human take over.

2. When Waymo claims to have driven X million "rider only" miles, is that because the majority of those miles are on highways, which are easy to drive with cruise control? If only 1 mile of a trip covers the end-to-end "hard parts" that would require a human (getting in and out of tight city streets and parking lots) while 10 miles are on the highway, it is easy to rack up "rider only" miles. But those trips are not representative of true self-driving trips.
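The mileage-mix worry above is just rate arithmetic, and a toy calculation makes it concrete (all numbers here are hypothetical for illustration, not Waymo's actual data):

```python
# Toy illustration of how easy highway miles can dilute a per-mile
# incident rate. All rates and mileages are made up for the example.

def blended_rate(miles_a, rate_a, miles_b, rate_b):
    """Incidents per mile across a mix of two road types."""
    incidents = miles_a * rate_a + miles_b * rate_b
    return incidents / (miles_a + miles_b)

# Suppose city driving is 10x riskier per mile than highway driving.
highway_rate = 1e-6   # incidents per highway mile (hypothetical)
city_rate    = 1e-5   # incidents per city mile (hypothetical)

# A fleet that logs 10 highway miles for every 1 city mile...
mixed = blended_rate(10_000_000, highway_rate, 1_000_000, city_rate)

# ...versus a fleet driving only city miles.
city_only = blended_rate(0, highway_rate, 1_000_000, city_rate)

print(f"mixed fleet:     {mixed:.2e} incidents/mile")    # 1.82e-06
print(f"city-only fleet: {city_only:.2e} incidents/mile")  # 1.00e-05
```

Under these assumed rates, the highway-heavy fleet looks roughly 5x safer per mile than the city-only fleet even though both face identical risk on each road type, which is the sense in which blended mileage figures can mislead.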

3. Selection bias. Waymo only operates in 3-4 cities, and only in chosen weather conditions. It’s easy to rack up impressive safety stats when you avoid places with harsh weather, poor signage, or complicated street patterns. But that’s not representative of the real-world driving conditions most people encounter daily.

The NTSB should force them to release all of the raw data so we can do our own analysis. I would compare only full self-driving trips, end to end, on days with good weather, in the 3-4 cities where Waymo operates, and then see how much better they fare.

replies(2): >>43492365 #>>43492964 #
decimalenough ◴[] No.43492365[source]
Don't conflate Waymo and Tesla. Tesla FSD is by and large garbage, while Waymo is the real thing. Specifically:

1. Waymo is autonomous 100% of the time. It is not possible for a human to actually drive the car: even when remote support dials in, all they can do is pick from various routes suggested by the car.

2. No, I'd guesstimate 90%+ of Waymo's mileage is city driving. Waymo in SF operates exclusively on city streets, it doesn't use the highways at all. In Phoenix, they do operate on freeways, but this only started in 2024.

3. Phoenix is driving in easy mode, but San Francisco is emphatically not. Weatherwise there are worse places, but SF drivers need to contend with fog and rain, hilly streets, street parking, a messy grid with diagonal and one-way streets, lots of mentally ill and/or drugged up people doing completely unpredictable shit in the streets, etc.

replies(1): >>43492433 #
whamlastxmas ◴[] No.43492433[source]
Humans remotely operate Waymos all the time. And humans routinely have to physically drive out to rescue Waymos that get stuck somewhere and start blocking traffic; famously, about a dozen of them blocked a single intersection for hours.

If you think FSD is garbage then you’ve clearly never used it recently. It routinely drives me absolutely everywhere, including parking, without me touching the wheel once. Tesla’s approach to self-driving is significantly more scalable and practical than Waymo’s, and the endlessly repeated, misleading arguments saying otherwise really confuse me, since they’re simply not founded in reality.

replies(2): >>43492546 #>>43499548 #
decimalenough ◴[] No.43492546[source]
Waymo does not have remote operation capability. Here's a blog post from them explaining how "fleet response" works:

https://waymo.com/blog/2024/05/fleet-response/

It's possible to put the car in manual mode, but that requires a human behind the wheel.

I have a Tesla myself, and while it's a great car, it's a long, long way from actual autonomous driving and their own stats bear this out: it can manage 12-13 miles without driver interruption, while Waymo is clocking ~17,000. Hell, where I live, Autopilot can barely stay in lane.

replies(2): >>43495278 #>>43495389 #
93po ◴[] No.43495278[source]
https://teslafsdtracker.com/

Community tracking shows 2600+ miles between critical disengagements in California, where the mapping is probably the best (if we're going to make a fair comparison to Waymo). Most recent firmware shows 98% of trips have no disengagement at all in California, too. If you made the operating zones extremely tight like Waymo, I'm sure it'd do even better.

Your link states this:

> In the most ambiguous situations ...[it] requests [to humans] to optimize the driving path. [Humans] can influence the Waymo's path, whether indirectly through indicating lane closures, explicitly requesting the AV use a particular lane, or, in the most complex scenarios, explicitly proposing a path for the vehicle to consider.

It's literally a human drawing a line on a map telling the car where to go, in the most manual of ways. It's not an Xbox steering wheel and driving remotely, but it's absolutely the same concept with a different interface, including a remote brake button.

replies(1): >>43497315 #
ra7 ◴[] No.43497315[source]
> Community tracking shows 2600+ miles between critical disengagements in California, where the mapping is probably the best (if we're going to make a fair comparison to Waymo).

This isn't a fair comparison either. FSD is used a lot on highways where crash rates are lower (and hence disengagements will be too). Waymo doesn't go on highways yet and can already go 17k miles without intervention (with a safety driver) in places harder than SF and LA where they're already driverless.
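For illustration, the road-mix effect on disengagement stats can be modeled with a simple two-rate calculation (the highway fraction and city/highway risk ratio below are made-up parameters, not figures published by the tracker):

```python
# Back out an implied city-only miles-between-disengagements (MBD)
# figure from a blended one, assuming a fixed per-mile disengagement
# rate on highways and a higher one in the city. All parameters are
# hypothetical.

def city_mbd(blended_mbd, highway_frac, risk_ratio):
    """
    blended_mbd : reported miles between disengagements over all miles
    highway_frac: fraction of miles driven on highways
    risk_ratio  : city disengagement rate / highway disengagement rate
    """
    city_frac = 1 - highway_frac
    # blended rate = highway_frac * r_hwy + city_frac * (risk_ratio * r_hwy)
    r_blended = 1 / blended_mbd
    r_hwy = r_blended / (highway_frac + city_frac * risk_ratio)
    r_city = risk_ratio * r_hwy
    return 1 / r_city

# e.g. a reported 2600 mi between disengagements, 70% highway miles,
# city driving 10x more disengagement-prone per mile:
print(round(city_mbd(2600, 0.7, 10)))  # 962
```

Under these assumed parameters, a blended 2,600 miles between disengagements corresponds to only about 960 miles between disengagements on city streets alone, which is why comparing a mixed-road figure against Waymo's city-only miles is apples to oranges.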

> It's not an xbox steering wheel and driving remotely, but it's absolutely the same concept with a different interface, including a remote brake button.

How do you know they have a "remote brake button"? Waymo's blog makes no mention of any such thing. They categorically say remote operators have no control over the vehicle.

I think you're deliberately trying to mislead with your comments here by slipping in something false with known facts.