
Why Tesla’s cars keep crashing

(www.theguardian.com)
131 points | nickcotter | 1 comment | source
close04 ◴[] No.44470797[source]
> “Autopilot appears to automatically disengage a fraction of a second before the impact as the crash becomes inevitable,”

This is probably core to their legal strategy. No matter how much data the cars collect, they can always safely destroy most of it, because the disengagement lets them pretend the autonomous driving system wasn't involved in the crash.

At this point it’s beyond me why people still trust the brand and the system. Musk really only disrupted the “fake it” part of “fake it till you make it”.

replies(4): >>44470804 #>>44471279 #>>44474149 #>>44476153 #
Dylan16807 ◴[] No.44470804[source]
I'll worry about that possible subterfuge if it actually happens a single time ever.

It's something to keep in mind but it's not an issue itself.

replies(1): >>44470888 #
close04 ◴[] No.44470888[source]
Then make sure you don't read to the end of the article, where this behavior is documented. Maybe it's just a coincidence that Teslas always record data except when there's a suspicion they caused the crash, and then the data was lost, didn't upload, was irrelevant, or self-driving wasn't involved.

> The YouTuber Mark Rober, a former engineer at Nasa, replicated this behaviour in an experiment on 15 March 2025. He simulated a range of hazardous situations, in which the Model Y performed significantly worse than a competing vehicle. The Tesla repeatedly ran over a crash-test dummy without braking. The video went viral, amassing more than 14m views within a few days.

> The real surprise came after the experiment. Fred Lambert, who writes for the blog Electrek, pointed out the same autopilot disengagement that the NHTSA had documented. “Autopilot appears to automatically disengage a fraction of a second before the impact as the crash becomes inevitable,” Lambert noted.

In my previous comment I was wondering why anyone would still trust Tesla's claims rather than realistically assume the worst. It's because plenty of people will only worry about it when it happens to them. It's not an issue in itself until after you're burned to a crisp in your car.
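The check Lambert ran is simple enough to sketch in code: look at the gap between the last Autopilot disengagement and the moment of impact. Something like the sketch below, where the event format and timestamps are entirely made up for illustration (real Tesla telemetry isn't public):

```python
# Sketch of the check Lambert describes: did Autopilot disengage only a
# fraction of a second before impact? The event format here is made up
# purely for illustration.

from typing import Optional

def last_disengagement_gap(events: list[tuple[float, str]],
                           impact_t: float) -> Optional[float]:
    """Return seconds between the last 'autopilot_off' event before
    impact and the impact itself, or None if it never disengaged."""
    offs = [t for t, kind in events if kind == "autopilot_off" and t <= impact_t]
    return impact_t - max(offs) if offs else None

# Hypothetical log: Autopilot engaged, then disengages 0.5 s before impact.
log = [(0.0, "autopilot_on"), (9.5, "autopilot_off")]
gap = last_disengagement_gap(log, impact_t=10.0)
print(gap)  # 0.5 -- "a fraction of a second before the impact"
```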

replies(1): >>44471899 #
Dylan16807 ◴[] No.44471899[source]
No, turning off Autopilot during a crash isn't subterfuge. The subterfuge would be using that to lie about Autopilot's involvement. I'm pretty sure that has never happened, and their past crash statistics have included any crash where Autopilot was in use in the vicinity of the crash, a window much longer than one second.
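To be concrete about why a last-second disengagement doesn't change attribution: both Tesla's safety reports (roughly a 5-second window, as I understand it) and NHTSA's Standing General Order (30 seconds) count a crash as involving the driver-assistance system if it was active at any point in a window before impact. A rough sketch of that kind of rule, with made-up field names:

```python
# Sketch: how a window-based attribution rule treats last-moment
# disengagement. Field names and the event format are hypothetical;
# the 5 s window matches the rule Tesla's safety reports describe
# (NHTSA's Standing General Order uses 30 s).

from dataclasses import dataclass

@dataclass
class TelemetryEvent:
    t: float            # seconds, relative to impact (impact at t = 0)
    autopilot_on: bool  # whether Autopilot was engaged at this sample

def autopilot_involved(events: list[TelemetryEvent],
                       window_s: float = 5.0) -> bool:
    """A crash counts as Autopilot-involved if the system was engaged
    at any sample within `window_s` seconds before impact, regardless
    of whether it disengaged just before the collision."""
    return any(e.autopilot_on for e in events if -window_s <= e.t <= 0)

# Autopilot disengages 0.5 s before impact:
log = [
    TelemetryEvent(t=-3.0, autopilot_on=True),
    TelemetryEvent(t=-1.0, autopilot_on=True),
    TelemetryEvent(t=-0.5, autopilot_on=False),  # auto-disengage
    TelemetryEvent(t=0.0, autopilot_on=False),   # impact
]
assert autopilot_involved(log)  # still attributed to Autopilot
```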
replies(2): >>44472177 #>>44473189 #
atombender ◴[] No.44472177[source]
The article cites an example of a Tesla engineer dying in a crash where witnesses (including a survivor) say he had FSD turned on. Elon claimed the witnesses were wrong.
replies(1): >>44475009 #
Dylan16807 ◴[] No.44475009[source]
You mean this one? "The Tesla CEO claimed von Ohain had never downloaded the latest version of the software – so it couldn’t have caused the crash."

That quote isn't playing games about whether it was engaged or not. If that's a lie, it's one that's equally easy to tell whether the system disengages or stays engaged.

I'm taking issue with a very specific scenario, not claiming Tesla is honest in general.