
Waymos crash less than human drivers

(www.understandingai.org)
345 points by rbanffy | 56 comments
1. labrador ◴[] No.43487628[source]
I was initially skeptical about self-driving cars but I've been won over by Waymo's careful and thoughtful approach using visual cues, lidar, safety drivers and geo-fencing. That said I will never trust my life to a Tesla robotaxi that uses visual cues only and will drive into a wall painted to look like the road ahead like Wile E. Coyote. Beep beep.

Man Tests If Tesla Autopilot Will Crash Into Wall Painted to Look Like Road https://futurism.com/tesla-wall-autopilot

replies(7): >>43487811 #>>43488043 #>>43490629 #>>43490938 #>>43490978 #>>43491005 #>>43511057 #
2. KoolKat23 ◴[] No.43487811[source]
To be fair, I'm sure there are a few humans who would crash into a giant painted wall in the middle of a straight road in the middle of nowhere. Humans have crashed for less.
replies(3): >>43487868 #>>43491846 #>>43492159 #
3. labrador ◴[] No.43487868[source]
Well, I wouldn't, and neither would a Waymo, but a Tesla did, which means it's no better than a bad human driver.
replies(2): >>43490838 #>>43490953 #
4. ggreer ◴[] No.43488043[source]
Mark Rober's video is misleading. First, he used autopilot, not FSD. Second, he sped up to 42mph and turned on autopilot a few seconds before impact[1], but he edited the YouTube video to make it look like he started autopilot from a standstill far away from the barrier. Third, there is an alert message on his screen. It's too small to read in the video, but it could be the "autopilot will not brake" alert that happens when you put your foot on the gas.

In the water test, Rober has the Tesla driving down the center of the road, straddling the double yellow line. Autopilot will not do this, and the internal shots of the car crop out the screen. He almost certainly manually drove the car through the water and into the dummy.

One person tried to reproduce Rober's Wile E. Coyote test using FSD. FSD v12 failed to stop, but FSD v13 detected the barrier and stopped in time.[2]

Lidar would probably improve safety, but Rober's video doesn't prove anything. He decided on an outcome before he made the video.

1. https://x.com/MarkRober/status/1901449395327094898

2. https://x.com/alsetcenter/status/1902816452773810409

replies(1): >>43488210 #
5. ggreer ◴[] No.43488238{3}[source]
The first tweet I linked to is Mark Rober's unedited video of the crash. The second tweet I linked to is a video of someone trying to reproduce the Wile E. Coyote test. Unless you think the videos are faked (one of which was posted by Mark Rober), I'm not sure what objection you're making.
replies(1): >>43488410 #
6. labrador ◴[] No.43488410{4}[source]
My objection is that Elon Musk and Tesla superfans will go to great lengths to spin events in Musk and Tesla's favor, and X is their mouthpiece. I looked at the replies under Mark Rober's video and it's the typical flood of Musk and Tesla superfans raging at him. Someone needs to explain why Mark Rober would post a misleading test. He seems like a solid guy. People I respect follow him, such as Palmer Luckey, Leopold Aschenbrenner and Andrej Karpathy.

Let's get back to my main point, that Tesla's not having Lidar is stupid and I don't trust a self-driving car that can't adequately detect solid objects in its environment.

replies(1): >>43488987 #
7. ggreer ◴[] No.43488987{5}[source]
It's much harder to determine motive (which only exists in Mark Rober's mind) than to determine whether a video is misleading. You can look at Rober's Youtube video, compare it to the unedited video (which he only posted on Twitter), and see how he edited it so that people didn't realize he accelerated the car to 42mph and engaged autopilot a few seconds before impact. You can also watch the bit in his Youtube video where he explains that he tests autopilot, not FSD, despite the title of the video being Can You Fool A Self Driving Car?. And you can watch the video posted by the other guy who showed that the latest version of FSD passes the Wile E. Coyote test.

I'm not defending any of those replies to Rober. In fact I find it quite annoying when dogmatic, sneery people happen to share my views. But the content of those replies does not change the content of Rober's videos, nor does it change the content of the video showing FSD passing the test.

> Let's get back to my main point, that Tesla's not having Lidar is stupid and I don't trust a self-driving car that can't adequately detect solid objects in its environment

In the video I linked to, the self-driving car did adequately detect solid objects in its environment. My main point is that your main point is based on a video that used non-self-driving software engaged seconds before collision, edited and published to make people think it was FSD engaged from a standstill much farther back. And at least one other test (the water test) didn't even use autopilot, just manual driving. I don't know why Rober did that, but he did, and it tanks his credibility.

Again, I'm not arguing against lidar. I already said that lidar would probably improve safety. But Rober's video does not show that, as he didn't use Tesla's FSD software. The person who did showed that it stopped successfully.

In a world where lidar greatly improves safety, we would see the latest version of FSD go through the Wile E. Coyote barrier. That didn't happen, so we probably don't live in that world. In a world where lidar improves safety, though not as much, we'd see FSD stop successfully. And in a world where lidar doesn't improve safety (weird I know, but there could be issues with sensor fusion or lidar training data), we'd also see FSD stop successfully. Right now we don't know which of those worlds we live in. And we won't know until someone (probably Tesla) launches a vision-only robo taxi service. Then we can compare accident rates to get an idea of how much lidar improves safety. And if Tesla doesn't have a robo taxi service within the next year, that indicates that cameras alone aren't safe enough to run a robo taxi service.

replies(2): >>43489141 #>>43490981 #
8. labrador ◴[] No.43489141{6}[source]
Points well taken. My personal preference is to not ride in a self-driving car that relies on visual cues only. To each his or her own. I predict that some trusting individuals will have to die before Musk decides to add Lidar or similar.

I followed Mark Rober on X to learn more about him and possibly understand more about his Tesla tests. Maybe he's a Musk/Tesla hater like Thunderf00t, I don't know. (yes, I'm on X - for entertainment purposes only)

9. jksflkjl3jk3 ◴[] No.43490629[source]
> That said I will never trust my life to a Tesla robotaxi that uses visual cues only and will drive into a wall painted to look like the road ahead

If you can visually detect the painted wall, what makes you think that cameras on a Tesla can't be developed to do the same?

And are deliberately deceptive road features actually a common enough concern?

replies(1): >>43490748 #
10. renewiltord ◴[] No.43490639{3}[source]
Videos on YouTube are also not a reliable source. But your demand for rigor seems rather isolated.
replies(1): >>43491619 #
11. aoeusnth1 ◴[] No.43490748[source]
How about fog and rain?
replies(1): >>43522509 #
12. saurik ◴[] No.43490838{3}[source]
I think I might, and I'm surprised by how confident you are that you wouldn't.
replies(1): >>43491744 #
13. bob1029 ◴[] No.43490938[source]
I started digging into this rabbit hole and found it fairly telling how much energy is being expended on social media over LiDAR vs. no LiDAR. Much of it feels like sock puppetry led by Tesla investors and their counterparties.

I see this whole thing as a business-viability narrative: Tesla would be even further under water if it were forced to admit that LiDAR may possess some degree of technical superiority and could provide a reliability and safety uplift. It must have taken millions of dollars in marketing budget to erase the customer experiences with the prior models of their cars that did have this technology and performed accordingly.

replies(6): >>43491873 #>>43492003 #>>43492092 #>>43492471 #>>43492878 #>>43500829 #
14. zeroday28 ◴[] No.43490953{3}[source]
> I wouldn't

Of course, that's why traffic accidents are called 'accidents.' Drivers wouldn't crash their cars, but they do.

¯\_(ツ)_/¯

15. UltraSane ◴[] No.43490978[source]
It is truly astonishing how much Musk hypes up the robotaxi when no Tesla has ever driven a single mile autonomously while Tesla was liable for crashing.
replies(1): >>43522485 #
16. UltraSane ◴[] No.43490981{6}[source]
Visual only FSD is a dead end.
17. Ferret7446 ◴[] No.43491005[source]
A wall painted to look like a road would likely cause human accidents and the painter would be very much criminally liable for them.

That said, I do think using only visual cues is a stupid self-imposed restriction. We shouldn't be making self-driving cars like humans, because humans suck horse testicles at driving.

replies(3): >>43491335 #>>43491665 #>>43493832 #
18. timewizard ◴[] No.43491335[source]
> because humans suck horse testicles at driving.

Hardly. We drive hundreds of billions of miles every month and trillions every year in the US alone. You're more likely to die from the flu, diabetes or a stroke than from a car accident.

If those don't get you, you are either going to get heart disease or cancer, or, most likely, involve yourself in a fatal accident, which will most likely be a fall off a roof or a ladder.

replies(3): >>43491421 #>>43491732 #>>43492190 #
19. michaelt ◴[] No.43491421{3}[source]
Is 40,000 deaths every year a lot?

IMHO it kinda is. It's 13x as many people as died in 9/11.

replies(2): >>43491781 #>>43491955 #
20. labrador ◴[] No.43491619{4}[source]
Other entertainment sites don't claim to be the source of all truth, as Elon Musk does of X and Grok. Just 4 hours ago he posted what he said on Rogan:

"Grok is aspirationally a maximally truth-seeking ai, even if that truth is like politically incorrect”

Meanwhile, he deletes your account if you offend him

replies(1): >>43492026 #
21. audunw ◴[] No.43491665[source]
The painted wall was just a gimmick to make the video entertaining. What’s more concerning is the performance in fog, rain and other visually challenging conditions.
replies(2): >>43492034 #>>43492496 #
22. Mawr ◴[] No.43491732{3}[source]
"1000C is not that hot, the Sun is hotter!"

If you have to reach that hard to make your point, it's not a great point.

Adding to the sibling's statistic of 40k deaths a year:

> Motor vehicle crashes were the leading cause of death for children and adolescents, representing 20% of all deaths.

(https://pmc.ncbi.nlm.nih.gov/articles/PMC6637963/)

23. jjav ◴[] No.43491744{4}[source]
If you watch the video, it would be blatantly obvious to any human that it is just a big poster across the road, completely fake. No human would fall for that, but the Tesla does.
24. timewizard ◴[] No.43491781{4}[source]
> Is 40,000 deaths every year a lot?

No. It's 1.25 per 10,000 people per year. Most people understand the risk ahead of time and yet still choose to drive. They clearly don't think it is.

> It's 13x as many people as died in 9/11

And 50x as many people as died in 9/11 die of accidental self-inflicted injury. This is an absurd metric.
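The per-capita arithmetic above roughly checks out; a quick sketch (assuming ~40,000 annual deaths and a US population of ~330 million, both round numbers):

```python
# Rough per-capita check of the 40,000-deaths figure.
# Assumptions: ~40,000 annual US road deaths, ~330 million US population.
deaths_per_year = 40_000
us_population = 330_000_000

rate_per_10k = deaths_per_year / us_population * 10_000
print(f"{rate_per_10k:.2f} deaths per 10,000 people per year")  # ~1.21
```

With a slightly smaller population figure (~320 million, closer to mid-2010s numbers) the rate comes out at the 1.25 cited above.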

replies(3): >>43492116 #>>43492338 #>>43493855 #
25. ourmandave ◴[] No.43491846[source]
To be more fair, some humans will drive into rivers if the gps map tells them to. =\
26. labrador ◴[] No.43491873[source]
"It's a feature, not a bug!"

I suspect it would be a major undertaking to add LiDAR at this point because none of their software is written to use it

27. dagw ◴[] No.43491955{4}[source]
> Is 40,000 deaths every year a lot?

The only meaningful way to answer is to compare it to other countries. Per vehicle mile it is a lot more than many Western European countries and Canada, and a lot less than Mexico.
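As a rough sketch of that per-mile comparison (the rates below are ballpark figures supplied for illustration, not from the thread):

```python
# Rough comparison of road fatality rates per distance driven.
# The rates are illustrative ballpark values (deaths per 100 million
# vehicle miles), assumed for this sketch rather than official statistics.
rates_per_100m_miles = {"US": 1.3, "UK": 0.65}

ratio = rates_per_100m_miles["US"] / rates_per_100m_miles["UK"]
print(f"US rate is {ratio:.1f}x the UK rate per vehicle mile")
```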

28. x187463 ◴[] No.43492003[source]
I use FSD every day and it has driven easily 98% of the miles on my Model 3. I would never let it drive unsupervised. I honestly have no idea how they think they're ready for robotaxis. FSD is an incredible driver-assistance system. It's actually a joy to use, but it's simply not capable of reliable unsupervised performance. A big reason: it struggles exactly where you'd expect a vision-only system to struggle. It needs a more robust mechanism for building its world model.

A simple example. I was coming out of a business driveway, turning left onto a two lane road. It was dark out with no nearby street lights. There was a car approaching from the left. FSD could see that a car was coming. However, from the view of a camera, it was just a ball of light. There was no reasonable way the camera could discern the distance given the brightness of the headlights. I suspected this was the case and was prepared to intervene, but left FSD on to see how it would respond. Predictably, it attempted to pull out in front of the car and risked a collision.

That kind of thing simply cannot be allowed to happen with a truly autonomous vehicle, and would never happen with lidar.

Hell, just this morning on my way to work FSD was going to run a flashing red light. It's probably 95% accurate with flashing reds, but that needs to be 100%. That said, my understanding is the current model being trained has better temporal understanding, such that flashing lights will be more comprehensible to the system. We'll see.

replies(2): >>43495616 #>>43497280 #
29. concordDance ◴[] No.43492026{5}[source]
> Meanwhile, he deletes your account if you offend him

Willing to bet this is not true.

replies(1): >>43495099 #
30. x187463 ◴[] No.43492034{3}[source]
I think the correct response for FSD would have been to stop in the situations presented in the video. That wasn't anything like normal fog, rain, or light obstruction. The situations they created were so extreme you simply couldn't operate a vehicle safely. That being said, the effectiveness and precision of lidar should be a legal requirement for autonomous vehicles.
replies(1): >>43493306 #
31. lnsru ◴[] No.43492092[source]
Tesla sold a million Model Ys last year, so adding a safety-increasing part like lidar would reduce profit by hundreds of millions. Removing the ultrasonic sensors saved Tesla tens of millions. OK, the Model Y is a big car and I don't aim for the tightest parking spots anymore. But basically, removing anything is very profitable for Tesla, and vice versa, adding something useful is very expensive.
replies(1): >>43492455 #
32. Peanuts99 ◴[] No.43492116{5}[source]
The US car fatality rate per mile is double the UK's. It would at least be useful to ask why that might be. That's 40,000 people a year who have their lives cut short.
replies(2): >>43492674 #>>43497218 #
33. ndsipa_pomu ◴[] No.43492159[source]
There's a fun thread available here: https://road.cc/content/forum/car-crashes-building-please-po...

It's where a bunch of cycling nutters (I'm one of them) post local news stories where a driver has crashed into a building ("It wasn't wearing hi-viz!")

34. Ukv ◴[] No.43492190{3}[source]
Worldwide stats from https://www.who.int/news-room/fact-sheets/detail/road-traffi...:

> Approximately 1.19 million people die each year as a result of road traffic crashes.

> Road traffic injuries are the leading cause of death for children and young adults aged 5–29 years.

Falls from a ladder/roof do not come close to that as far as I've been able to find. They'd be a subset of falls from a height, which is a small subset of unintentional falls/slips, which is still globally under road accident deaths.

It's true that diabetes, strokes, heart disease, flu, etc. do cause more deaths, but we're really into the absolute biggest causes of death here. Killing fewer than strokes is the lowest of low bars.

I think there's also the argument to be made in terms of years of life lost/saved. If you prevent a road accident fatality, chances are that person will go on to live many more healthy years/decades. If you prevent a death by stroke, flu, or even an at-home fall, there is a greater chance that person is already in poor health (to have potentially died from that cause) and may only be gaining a few extra months.

replies(2): >>43493400 #>>43497143 #
35. ◴[] No.43492338{5}[source]
36. whamlastxmas ◴[] No.43492455{3}[source]
It’s saved hundreds of millions at minimum. LiDAR is incredibly expensive hardware which is why they’re making it work well without it - it would make the cost of the cars really uncompetitive while also looking incredibly silly like Waymos. No one would buy them
replies(1): >>43493410 #
37. whamlastxmas ◴[] No.43492471[source]
We all see our perspectives as getting quashed. I see the opposite of you - people pushing arguments that make no sense to me in terms of criticizing Tesla for not using lidar, which is an argument that seemingly deliberately glances over the very real and valid reasons for Tesla choosing not to use it
38. whamlastxmas ◴[] No.43492496{3}[source]
Reviews of the wall gimmick video also make it clear that the LiDAR car stopped because it detected the water, not the wall. And there are tons of videos of LiDAR cars coming to a complete stop in traffic because of steam from a manhole or light water spraying just off the side of the road. Also, don't get me started on the manufacturer of the lidar car being Mark's close friend, who had previously given Mark millions of dollars for another project he did.
39. Qwertious ◴[] No.43492674{6}[source]
It's street design. If you prioritize car throughput at any cost, even safety, then your streets will be less safe.
40. ◴[] No.43492878[source]
41. teeray ◴[] No.43493306{4}[source]
> That wasn't anything like normal fog, rain, or light obstruction

It does happen on occasion. Seasonally, sublimating snow banks can create fog that intense for hours if conditions are right. Also heavy smoke can create similar conditions.

42. Zigurd ◴[] No.43493400{4}[source]
Initially, I was enthusiastic about FSD because, if it worked, it really would have a positive social impact, like curing malaria.

But, like curing a dread disease, it's a long, difficult grind, not something that will "for sure work by the end of this year," as it supposedly has been for the last 10 years. No pharma company would get away with that hype.

43. IshKebab ◴[] No.43493410{4}[source]
Which is why it makes more sense for driverless cars to not be individually owned. At least for now.

It would be like owning your own bus.

44. consteval ◴[] No.43493832[source]
In addition, humans have a lot of senses. Not just 5 - but dozens. A lot of them working in the background, subconsciously. It’s why I can feel someone staring at me, even if I never explicitly saw them.
45. consteval ◴[] No.43493855{5}[source]
> yet still choose to drive

Obligatory “almost nobody in the US chooses to drive” comment.

Driving in the US is a lifeline. It’s closer to food and shelter than a product or action. Remaining economically afloat in the US without a car is extraordinarily difficult. Many people, especially poor people, would much rather lose their job or health insurance than their car.

46. labrador ◴[] No.43495099{6}[source]
I asked Grok to "please give me a list of X accounts Elon Musk has suspended because he did not like their content"

The result is too long to post here but here's a sample

"Chad Loder - Suspended November 2022. A left-wing activist identifying January 6 participants, Loder was banned after Musk reportedly pressured X’s trust and safety head, per Bloomberg. The content—exposing far-right figures Musk has since aligned with—may have clashed with his views, though no public Musk comment confirms this."

47. labrador ◴[] No.43495616{3}[source]
Your report matches many other real world reports I've read. I'm pretty good at day dreaming or thinking while driving, so having to keep my hands ready to take over while being completely alert that FSD might error would be a big downgrade in my driving experience. I'd rather drive myself where my subconscious muscle memory does the driving so my conscious mind can think about other things. Having to pay attention to what FSD was doing would be a drag and prevent me from relaxing.
48. timewizard ◴[] No.43497143{4}[source]
> Road traffic injuries are the leading cause of death for children and young adults aged 5–29 years.

That's not telling you what you think it is. A lot of those deaths are that person in a car on their own. Usually involving drugs or alcohol. It intentionally folds in "deaths caused by others" and "death caused by self" into the same category. It's not an appropriate statistic to base policy on.

> If you prevent a road accident fatality, chances are that person will go on to live many more healthy years/decades.

Chances are that person is going to kill themselves in a vehicle again later as you have failed to examine MODE of accident. Your analysis is entirely wrong.

replies(1): >>43499642 #
49. timewizard ◴[] No.43497218{6}[source]
The UK is far more serious about impaired and drunk driving than the US is.

The majority of those people who had their lives cut short cut it short themselves and didn't take anyone with them.

Likewise, that 40k includes 6k pedestrians and 6k motorcyclists.

You can't just take the 40,000 figure and do _anything_ with it because there are so many peculiar modes of accidents which /dominate/ that data set.

50. Nemi ◴[] No.43497280{3}[source]
And you trust that you will ALWAYS have the awareness of intervening if and when FSD does something life threatening? You are braver than I am.

I am willing to experiment in many ways with things in my life, but not WITH my life.

replies(1): >>43498510 #
51. nilkn ◴[] No.43498510{4}[source]
I've used FSD a lot. Supervising it is a skill that you actively develop and can get very good at. Some argue that if you have to supervise it, there's no point, but I disagree. I still use it for much of my daily commute even though I have to supervise it and occasionally intervene. It's still a significant net positive addition to the driving experience for me overall. I would legitimately consider using it a skill that improves with practice; there's a threshold of skill where it becomes a huge positive, but below that threshold it can be a negative.
52. Ukv ◴[] No.43499642{5}[source]
> That's not telling you what you think it is. A lot of those deaths are that person in a car on their own. [...]

Sure - going off of NHTSA figures it looks like around 35%. There's also a lot of car passenger deaths (~15%), pedestrian deaths (~20%), and deaths of car drivers with passengers (~15%).

Not entirely sure the point of breaking it out like this, though. These are all still deaths that self-driving cars could in theory prevent, and so all seem appropriate to consider and base policy on.

> Chances are that person is going to kill themselves in a vehicle again later [...]

Unsafe drivers (under the influence, distracted, etc.) are disproportionately represented in fatalities, but that neither means most road accident fatalities are unsafe drivers nor that most unsafe drivers will have a fatal car crash. As far as I can tell, even a driver using amphetamines (increasing risk of a fatal crash 5X) still isn't more likely than not to die in a car crash (a very high bar).

Further, if the way the initial fatal crash was prevented was by prevalence of safe autonomous vehicles, the future crashes would also be similarly mitigated.
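The "more likely than not" bar can be made concrete with rough numbers; a sketch assuming a ~1% baseline lifetime risk of dying in a crash (a commonly cited ballpark, not a figure from this thread):

```python
# Sanity check: even a 5x relative risk leaves the lifetime probability
# of dying in a crash far below 50%.
# Assumption: ~1% baseline lifetime risk (rough ballpark figure).
baseline_lifetime_risk = 0.01
relative_risk = 5  # e.g. the amphetamine figure cited above

elevated_risk = baseline_lifetime_risk * relative_risk
print(f"elevated lifetime risk: {elevated_risk:.0%}")  # 5%
print(elevated_risk > 0.5)  # False: still not "more likely than not"
```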

53. rangestransform ◴[] No.43500829[source]
Just because Tesla uses shitty 2MP sensors of 2013 vintage (at least for HW3) doesn’t mean that robotaxi levels of safety can’t be achieved with just modern cameras and radars (plural)

As someone in the industry, I find the LiDAR discussion distracting from meaningful discussions about redundancy and testing

54. labrador ◴[] No.43511057[source]
My conclusion: If Tesla drivers are comfortable with vision-only FSD, that’s fine — it’s their responsibility to supervise and intervene. But when Tesla wants to deploy a fully autonomous robotaxi with no human oversight, it should be subject to higher safety requirements, including an independent redundant sensing system like LiDAR. Passengers shouldn’t be responsible for supervising their own taxi ride.
55. mavhc ◴[] No.43522485[source]
Every new Tesla drives a single mile autonomously while Tesla is liable for crashing

https://www.youtube.com/watch?v=BO1XXRwp3mc

56. mavhc ◴[] No.43522509{3}[source]
Same as humans: drive slowly enough that you can stop when you see something ahead. As demonstrated, lidar doesn't work in rain either.

HW4 Tesla stopped before the painting of a road https://futurism.com/someone-else-tested-tesla-crash-wall-pa...