650 points clcaev | 34 comments
metaphor ◴[] No.45063162[source]
> Immediately after the wreck at 9:14 p.m. on April 25, 2019, the crucial data detailing how it unfolded was automatically uploaded to the company’s servers and stored in a vast central database, according to court documents. Tesla’s headquarters soon sent an automated message back to the car confirming that it had received the collision snapshot.

> Moments later, court records show, the data was just as automatically “unlinked” from the 2019 Tesla Model S at the scene, meaning the local copy was marked for deletion, a standard practice for Teslas in such incidents, according to court testimony.

Wow...just wow.

replies(5): >>45063302 #>>45063632 #>>45063687 #>>45063980 #>>45064115 #
A4ET8a8uTh0_v2 ◴[] No.45063302[source]
I am trying to imagine a scenario under which this is defensible and does not raise all kinds of compliance, legal, and retention questions. Not to mention: who were the people who put that code into production, knowing what it would do?

edit: My point is that it would not have been one lone actor who made that change.

replies(3): >>45063366 #>>45063389 #>>45064252 #
1. colejohnson66 ◴[] No.45063366[source]
Assuming no malice, I'd guess it's to save space in the car's internal memory. Once the data has been uploaded off the car, there's no point keeping a local copy.
replies(5): >>45063520 #>>45063627 #>>45064037 #>>45064183 #>>45065363 #
2. wat10000 ◴[] No.45063520[source]
Sounds like a pretty standard telemetry upload. You transmit it, keep your copy until you get acknowledgement that it was received so you can retry if it went wrong, then delete it when it succeeds.

It’s just worded to make this sound sketchy. I bet ten bucks “unlinked” just refers to the standard POSIX call for deleting a file.
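
For illustration, a minimal sketch of that flow in C. upload_snapshot() is a made-up stand-in for whatever transport the car actually uses; none of this is Tesla's code, just the generic shape of upload-then-delete telemetry:

    #include <stdio.h>
    #include <unistd.h>

    /* Stub transport: returns 0 once the server acknowledges receipt.
       (Hypothetical; the real thing would do a network round trip.) */
    static int upload_snapshot(const char *path)
    {
        (void)path;
        return 0;
    }

    static void push_telemetry(const char *path)
    {
        for (int tries = 0; tries < 5; tries++) {
            if (upload_snapshot(path) == 0) {
                /* Acked: the local copy is now redundant, so unlink it. */
                if (unlink(path) != 0)
                    perror("unlink");
                return;
            }
            sleep(30); /* transient failure: keep the file and retry */
        }
        /* Never acked: leave the file on disk for a later attempt. */
    }

    int main(void)
    {
        push_telemetry("/tmp/snapshot-123.bin");
        return 0;
    }

The point is that "unlink on ack" is the boring, ordinary tail end of any reliable upload, not a cover-up step.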

replies(5): >>45063580 #>>45063611 #>>45063712 #>>45063718 #>>45063727 #
3. tobias3 ◴[] No.45063580[source]
The sketchy part is that someone then took “affirmative action to delete” the data on the server as well.

Also, this is not like a process crash dump, where the computer keeps running after one process crashes.

It would be like a plane's black box uploading its data to the manufacturer and then erasing itself after a crash.

replies(1): >>45063612 #
4. buran77 ◴[] No.45063611[source]
The process of collecting and uploading the data probably confuses a lot of non-technical readers, even if it worked according to standard industry practice.

The real issue is that Tesla claimed the company doesn't have the data after every copy was deleted. There's no technical reason to dispose of data related to a crash when you hold so much data on all of the cars in general.

Crash data in particular should be considered sacred, especially given the severity in this case. Ideally it should be kept both on the local black box and on the servers. But anything that leads to it being treated as instantly disposable everywhere, or even just claiming it was deleted, can only be malice.

replies(2): >>45063918 #>>45064063 #
5. wat10000 ◴[] No.45063612{3}[source]
I’ll bet another ten bucks that this is a generic implementation for all of their telemetry, not something special-cased for crashes.

Deleting the data on the server is totally sketchy, but that’s not what the quoted section is about.

replies(2): >>45064051 #>>45065073 #
6. OutOfHere ◴[] No.45063627[source]
That's 100% wrong. In standard practice, collision files are supposed to be "locked", i.e. protected from local deletion.
replies(2): >>45064021 #>>45064043 #
7. alistairSH ◴[] No.45063712[source]
That might be the case, but the article seems to indicate the system knew the data was generated from an accident. At that point, removing it to save space on the car should be a secondary concern.
8. joshcryer ◴[] No.45063718[source]
The problem with this is that it destroys any chain of custody. Tesla "lost" this data, in fact. You would never want the "black box" in your car to delete itself after uploading to some service, because the service could go down, be hacked, or the provider could decide to withhold the data, forcing you into a lengthy discovery/custody battle.

This data is yours. Say you were going the speed limit when the accident happened and everyone else claims you were speeding. If the data is lost, it could take forever to clear your name, or worse, you could be convicted.

This is more of "you will own nothing" crap. And mainly so Tesla can cover its ass.

9. aredox ◴[] No.45063727[source]
It is a car. A vehicle that can be involved in a fatal accident. It is not a website. There is no "oversight" here, nor is it "pretty standard" to do it like that: when you don't think about what your system is actually doing (and that is the most charitable explanation), YOU ARE STILL RESPONSIBLE AS IF YOU HAD DONE IT ON PURPOSE.
replies(1): >>45063780 #
10. wat10000 ◴[] No.45063780{3}[source]
One of Tesla’s things is that their software is built by software people rather than by car people. This has advantages and disadvantages.

Maybe this is not appropriate for a car, but that doesn't excuse the ridiculous breathless tone in the quoted text. It's the worst kind of purple prose, making a boring system sound exciting and nefarious. They could have made your point without trying to make the unlink() call sound suspicious.

replies(4): >>45063958 #>>45064225 #>>45064231 #>>45064627 #
11. wat10000 ◴[] No.45063918{3}[source]
> The real issue is that Tesla claimed the company doesn't have the data after every copy was deleted.

Exactly. The issue is deleting the data on the servers, not a completely mundane upload-then-delete procedure for phoning home. This should have been one sentence, but instead they make it read like a heist.

12. buran77 ◴[] No.45063958{4}[source]
> their software is built by software people rather than by car people

The rogue engineer defense worked so well for VW and Dieselgate.

The issue of missing crash data was raised repeatedly. Deleting the data, or even just claiming it was deleted, can only be a mistake the first time.

replies(1): >>45065786 #
13. phkahler ◴[] No.45064021[source]
>> That's 100% wrong. In standard practice, collision files are supposed to be "locked", i.e. protected from local deletion.

I worked a year in airbag control, and they recorded a bit of information if the bags were deployed - seatbelt buckle status was one thing. But I was under the impression there was no legal requirement for that. I'm sure it helps in court when someone tries to sue because they were injured by an airbag. The argument becomes not just "the bags comply with the law" but also "you weren't wearing your seatbelt". Regardless, I'm sure Tesla has a lot more data than that and there is likely no legal requirement to keep it - especially if it's been transferred reliably to the server.

14. giancarlostoro ◴[] No.45064037[source]
As a developer, your answer seems the most logical to me: we often miss simple things, the PM overlooks them, and it goes into production this way. I don't think it's malicious. Sometimes bugs just don't become obvious until things break. Sooner or later, we have all found an unintended consequence of code that had nothing technically wrong with it.
15. giancarlostoro ◴[] No.45064043[source]
I don't think it's wrong. Have you ever pushed code that was technically correct, only to find months later that you, your PM, their manager, their boss's boss, etc. all missed one edge case? You're telling me no software developer has ever done this?
replies(2): >>45064475 #>>45064726 #
16. dylan604 ◴[] No.45064051{4}[source]
Not handling an automobile crash as a special case is the weird part. Even the <$50 dashcams from Amazon have a feature to mark a recording as locked so the auto-delete logic does not touch the locked file. Some of them even have automatic collision detection, which locks the file for you.

That Tesla could detect a collision, not lock any of the data, and call that normal is just insane.
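
The dashcam behavior is trivial to sketch in C. The .lock-file convention and file names here are invented for illustration; real dashcams vary:

    #include <stdio.h>
    #include <unistd.h>

    /* A recording is "locked" if a companion .lock file exists. */
    static int is_locked(const char *recording)
    {
        char lockpath[256];
        snprintf(lockpath, sizeof lockpath, "%s.lock", recording);
        return access(lockpath, F_OK) == 0;
    }

    /* Collision detection would create the lock before cleanup runs. */
    static void lock_recording(const char *recording)
    {
        char lockpath[256];
        snprintf(lockpath, sizeof lockpath, "%s.lock", recording);
        FILE *f = fopen(lockpath, "w");
        if (f)
            fclose(f);
    }

    /* The auto-delete pass skips anything locked. */
    static void reclaim_space(const char *recordings[], int n)
    {
        for (int i = 0; i < n; i++) {
            if (is_locked(recordings[i]))
                continue; /* never touch locked (crash) footage */
            unlink(recordings[i]);
        }
    }

    int main(void)
    {
        const char *clips[] = { "clip-001.mp4", "clip-002.mp4" };
        lock_recording(clips[1]); /* e.g. collision detected */
        reclaim_space(clips, 2);  /* deletes clip-001 only */
        return 0;
    }

A few lines of bookkeeping is all it takes, which is why its absence is hard to explain.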

replies(1): >>45064777 #
17. giancarlostoro ◴[] No.45064063{3}[source]
> The real issue is that Tesla claimed the company doesn't have the data after every copy was deleted. There's no technical reason to dispose of data related to a crash when you hold so much data on all of the cars in general.

My money is on nobody having built a tool to look up the data: they have it, they just can't easily find it.

replies(1): >>45064276 #
18. const_cast ◴[] No.45064183[source]
Dude we're at the point where cars are practically gathering data on the size of your big toe.

The performance ship sailed, like, 15 years ago. We're already storing about 10,000,000 times more data than we need. And that's not even an exaggeration.

19. const_cast ◴[] No.45064225{4}[source]
There are software people who know what they're doing; some write flight software or medical equipment software. They know how to think critically, and in detail, about what their systems do.

So either the problem is that Tesla engineers are fucking stupid (doubtful) or this is poor business/product design.

My money is on the latter.

20. throwway120385 ◴[] No.45064231{4}[source]
I'm a software person, but I still take the car-person approach when I know I'm building a car. You have a responsibility to understand the gravity of the enterprise you undertake and to take appropriate steps given that gravity. Ignorance shouldn't be a defense, and if you don't know what you don't know, then god help you.
21. ◴[] No.45064276{4}[source]
22. buran77 ◴[] No.45064475{3}[source]
You discover it the day a person dies and the relevant data is not there. The next time, it's no longer a "missed edge case".
replies(1): >>45064730 #
23. kergonath ◴[] No.45064627{4}[source]
> One of Tesla’s things is that their software is built by software people rather than by car people. This has advantages and disadvantages.

So we just shrug because software boys gotta be software boys? That's completely insane, and a big reason why a lot of engineers roll their eyes at developers who want to be considered engineers.

Software engineers who work on projects that can kill people must act like the lives of other people depend on them doing their job seriously, because that is the case. Look at the aviation industry. Is it acceptable for a bug in the avionics suite to down planes at random and then delete the black boxes? It absolutely is not, and when anything like that happens, shit gets serious (think 737 MAX).

The developers who designed the systems are responsible, and so are their managers who approved the changes, all the way to the top. This would not happen in a company with appropriate processes in place.

replies(1): >>45066067 #
24. OutOfHere ◴[] No.45064726{3}[source]
It's not an edge case; it's wanton criminal sabotage, destruction of evidence, and it deserves a prison sentence for anyone facilitating it at any level.
replies(1): >>45064738 #
25. giancarlostoro ◴[] No.45064730{4}[source]
In a perfect world where developers are omnipresent and all-knowing, sure. This isn't a perfect world. Heck, how do you account for the developer who coded it leaving the company, with the code then sitting untouched for half a decade, if not more, because nothing seems wrong with it? Who realizes it needs to be changed? Nobody. The number of obscure bugs I find in legacy code that stump even the most experienced maintainers never ends.
replies(1): >>45065448 #
26. giancarlostoro ◴[] No.45064738{4}[source]
This is assuming malice out of the gate, without any evidence, which is not what we do here on HN. If this was in fact done maliciously, please provide evidence.
27. immibis ◴[] No.45064777{5}[source]
That one's easy: nobody at Tesla cares about having this feature.
28. pjob ◴[] No.45065073{4}[source]
That might not be a good bet. https://news.ycombinator.com/item?id=45063380
replies(1): >>45066085 #
29. ajross ◴[] No.45065363[source]
In point of fact, eMMC wear failure was an actual bug in early Tesla MCUs. They were logging too much, so when the car reached a certain fill level through routine use, the logging started running over the same storage again and again, and the chips started failing.

It's very easy to imagine the response to this being (beyond "don't log so much") an audit layer that starts automatically removing redundant data.

The externalities of the company are such that people want to ascribe malice, but this is a very routine kind of thing.
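
A sketch of what such an audit layer might look like in C. The 80% threshold and the ".uploaded" marker convention are invented for illustration, not Tesla's actual scheme:

    #include <stdio.h>
    #include <unistd.h>
    #include <sys/statvfs.h>

    #define FILL_THRESHOLD 0.80 /* start reclaiming at 80% full */

    /* Fraction of the log partition currently in use. */
    static double fill_level(const char *mount)
    {
        struct statvfs vfs;
        if (statvfs(mount, &vfs) != 0)
            return 0.0;
        return 1.0 - (double)vfs.f_bavail / (double)vfs.f_blocks;
    }

    /* Invented convention: a "<name>.uploaded" marker means the
       server already has this file. */
    static int already_uploaded(const char *path)
    {
        char marker[256];
        snprintf(marker, sizeof marker, "%s.uploaded", path);
        return access(marker, F_OK) == 0;
    }

    static void reclaim_logs(const char *mount, const char *logs[], int n)
    {
        if (fill_level(mount) < FILL_THRESHOLD)
            return; /* plenty of space: keep everything locally */
        for (int i = 0; i < n; i++)
            if (already_uploaded(logs[i]))
                unlink(logs[i]); /* server has it; drop the local copy */
    }

    int main(void)
    {
        const char *logs[] = { "/tmp/trip-0001.log" };
        reclaim_logs("/tmp", logs, 1);
        return 0;
    }

Under a policy like this, a crash snapshot gets deleted for the same dull reason as any other already-uploaded log, which is exactly the routine-not-malicious reading.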

replies(1): >>45066711 #
30. matthewdgreen ◴[] No.45065448{5}[source]
There have been dozens of government investigations and lawsuits around Tesla crashes over the past decade (more likely hundreds or thousands, I'm just thinking of the ones that received significant national press and that I happened to notice.) In each of these cases, Tesla's data retention was questioned, sometimes by regulators and sometimes as a major legal question in the case. There is no way in 2025 that the retention process around crash data is some niche area of Tesla's code that the business leaders haven't thought about extremely carefully.

This is like saying "maybe nobody has recently looked at the ad-selection mechanism at Google." That's just not plausible.

31. wat10000 ◴[] No.45065786{5}[source]
I really should know better than to think that I can criticize a small part of an article without a bunch of people thinking that I'm defending everything the article discusses.
32. wat10000 ◴[] No.45066067{5}[source]
I completely agree about responsibility for life-critical systems. I wouldn't put this in that category, though. Even on airliners, black boxes aren't treated quite as critically as the stuff that'll kill you then and there. Consider the recent crash in Korea where the black box shut off because it was designed without any backup power if the engines failed, or the Alaska Airlines flight where the voice recording was overwritten because it wasn't shut off after landing.

I'd argue that this data is far less important in cars. Airline safety has advanced to the point where crashes are extremely rare and usually have a novel cause. Data recorders are important to be able to learn that cause and figure out how to prevent it from happening again. Car safety, on the other hand, is shit. We don't require rigorous training for the operators. Regulations are lax, and enforcement even more lax. Infrastructure is poor. We're unwilling to fix these things. Almost all safety efforts focus on making the vehicles more robust when collisions occur, and we're just starting to see some effort put into making the vehicles automatically avoid some collisions. What are we going to learn from this data in cars? "Driver didn't stop for a red light, hit cross traffic." "Driver was drunk." "Driver failed to see pedestrian because of bad intersection design which has been known for fifty years and never been fixed." It's useful for assigning liability but not very useful for saving lives. There's a ton of lower hanging fruit to go after before you start combing through vehicle telemetry to find unknown problems.

Even if you do consider it to be life-critical, uploading the data and then deleting the local copy once receipt is acknowledged seems completely fine, if the server infrastructure is solid. Better than only keeping a local copy, even. The issue there is that they either have inadequate controls allowing data to be deleted, or inadequate ability to retrieve data.

33. wat10000 ◴[] No.45066085{5}[source]
I don't see anything in that comment that would apply to what I said.
34. A4ET8a8uTh0_v2 ◴[] No.45066711[source]
This, I think, is the argument that seems most plausible to me (without ascribing malice). It brings its own set of issues, but even those make it more believable, despite being problematic in their own right.