Then make sure you don’t read to the end of the article, where this behavior is supported. Maybe it is just a coincidence that Teslas always record data except when there’s a suspicion they caused the crash, in which case the data was lost, didn’t upload, was irrelevant, or self-driving wasn’t involved.
> The YouTuber Mark Rober, a former engineer at Nasa, replicated this behaviour in an experiment on 15 March 2025. He simulated a range of hazardous situations, in which the Model Y performed significantly worse than a competing vehicle. The Tesla repeatedly ran over a crash-test dummy without braking. The video went viral, amassing more than 14m views within a few days.
> The real surprise came after the experiment. Fred Lambert, who writes for the blog Electrek, pointed out the same autopilot disengagement that the NHTSA had documented. “Autopilot appears to automatically disengage a fraction of a second before the impact as the crash becomes inevitable,” Lambert noted.
In my previous comment I was wondering why anyone would still trust Tesla’s claims rather than realistically assume the worst. It’s because plenty of people will only worry about it when it happens to them. It’s not an issue until after you’re burned to a crisp in your car.