
Why Tesla’s cars keep crashing

(www.theguardian.com)
131 points by nickcotter | 7 comments
close04 ◴[] No.44470797[source]
> “Autopilot appears to automatically disengage a fraction of a second before the impact as the crash becomes inevitable,”

This is probably core to their legal strategy. No matter how much data the cars collect, they can always safely destroy most of it, because the disengagement lets them pretend the autonomous driving system wasn't involved in the crash.
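
To sketch the loophole concretely (the log format below is made up for illustration, not Tesla's real telemetry): if the only question asked is "was the system engaged at the instant of impact?", a disengagement half a second earlier flips the answer.

    # Hypothetical log: (seconds before impact, autopilot engaged?)
    log = [(3.0, True), (2.0, True), (1.0, True), (0.5, False), (0.0, False)]

    def engaged_at_impact(samples):
        # Under a point-in-time reading, only the final sample matters.
        return min(samples, key=lambda s: s[0])[1]

    print(engaged_at_impact(log))  # False: "the system wasn't active in the crash"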

At this point it’s beyond me why people still trust the brand and the system. Musk really only disrupted the “fake it” part of “fake it till you make it”.

replies(4): >>44470804 #>>44471279 #>>44474149 #>>44476153 #
1. Dylan16807 ◴[] No.44470804[source]
I'll worry about that possible subterfuge if it actually happens a single time ever.

It's something to keep in mind but it's not an issue itself.

replies(1): >>44470888 #
2. close04 ◴[] No.44470888[source]
Then make sure you don't read to the end of the article, where this behavior is documented. Maybe it's just a coincidence that Teslas always record data, except when there's a suspicion they caused the crash; then the data was lost, or didn't upload, or was irrelevant, or self-driving wasn't involved.

> The YouTuber Mark Rober, a former engineer at Nasa, replicated this behaviour in an experiment on 15 March 2025. He simulated a range of hazardous situations, in which the Model Y performed significantly worse than a competing vehicle. The Tesla repeatedly ran over a crash-test dummy without braking. The video went viral, amassing more than 14m views within a few days.

> The real surprise came after the experiment. Fred Lambert, who writes for the blog Electrek, pointed out the same autopilot disengagement that the NHTSA had documented. “Autopilot appears to automatically disengage a fraction of a second before the impact as the crash becomes inevitable,” Lambert noted.

In my previous comment I was wondering why anyone would still trust Tesla's claims and not realistically assume the worst. It's because plenty of people will only worry about it when it happens to them. It's not an issue in itself until after you're burned to a crisp in your car.

replies(1): >>44471899 #
3. Dylan16807 ◴[] No.44471899[source]
No, turning off autopilot during a crash isn't subterfuge. The subterfuge would be using that to lie about autopilot's involvement. I'm pretty sure that has never happened, and their past data has included any crash where autopilot was in use in the vicinity of the crash, a window of much more than one second.
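
To put that distinction in the same terms as the sketch upthread (the five-second window below is my own assumption, not a confirmed reporting parameter): a windowed rule still counts the crash even after a last-moment disengagement.

    # Same made-up log format: (seconds before impact, engaged?)
    log = [(3.0, True), (2.0, True), (1.0, True), (0.5, False), (0.0, False)]

    def engaged_within(samples, window_s=5.0):
        # A windowed rule counts the system as involved if it was engaged
        # at any point in the window before impact, even if it had already
        # disengaged at the instant of the crash.
        return any(engaged for t, engaged in samples if t <= window_s)

    print(engaged_within(log))  # True: the 0.5 s disengagement doesn't hide it
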
replies(2): >>44472177 #>>44473189 #
4. atombender ◴[] No.44472177{3}[source]
The article cites an example of a Tesla engineer dying in a crash where witnesses (including a survivor) say he had FSD turned on. Elon claimed the witnesses were wrong.
replies(1): >>44475009 #
5. close04 ◴[] No.44473189{3}[source]
Turning off the system just before a crash, when it's already unavoidable, allows them to say “the system wasn't active when the crash occurred” and implicitly label a lot of data “irrelevant”. Which, according to the article, they do a lot, without providing any of that data. That's beyond subterfuge. They don't just kill people, they destroy evidence of their guilt and shift the blame to the victim. How much stock does one need to own to pretend not to understand this?

Tesla bragged about how much data its cars collect, and showed it off whenever that suited the company and was good for its image. But every time a case was controversial, like an unexplained accident potentially caused by the car itself, the data was somehow not transmitted, or lost, or irrelevant.

I'm not sure why you have such a hard time understanding the issue, or why you insist on what you're “pretty sure” about when the evidence points to the contrary: the article cites the NHTSA, independent experiments by a former NASA engineer, and a string of coincidental data unavailability in controversial accidents, and it provides evidence and discussion on all these points. Nonetheless you ignore all that and stick to your “I'm pretty sure” with fanboy abandon. Sets a really low bar for future conversations.

replies(1): >>44475037 #
6. Dylan16807 ◴[] No.44475009{4}[source]
You mean this one? "The Tesla CEO claimed von Ohain had never downloaded the latest version of the software – so it couldn’t have caused the crash."

That quote isn't playing games about whether it was engaged or not. If that's a lie, it's a lie that's equally easy to tell whether the system disengages or stays engaged.

I'm taking issue with a very specific scenario, not claiming tesla is honest in general.

7. Dylan16807 ◴[] No.44475037{4}[source]
> Turning off the system just before a crash when it’s unavoidable allows them to say “the system wasn’t active when the crash occurred”

In theory. Maybe.

Have they ever done that?

You're citing entirely different bad behavior. That's not evidence for my question. The article has claims of stonewalling, of claiming there's no data at all, and one case where they said the software wasn't even installed, but none of those is the scenario I asked about.

Calling me a tesla fanboy for wanting evidence for the correct claim instead of a completely different claim is pretty ridiculous. I'm not being pro tesla here.

And the reason I said "pretty sure" is that people bring up that scenario over and over and over, but nobody has ever shown an example of it being real, despite having tons of examples of other tesla problems.