
650 points clcaev | 21 comments
metaphor ◴[] No.45063162[source]
> Immediately after the wreck at 9:14 p.m. on April 25, 2019, the crucial data detailing how it unfolded was automatically uploaded to the company’s servers and stored in a vast central database, according to court documents. Tesla’s headquarters soon sent an automated message back to the car confirming that it had received the collision snapshot.

> Moments later, court records show, the data was just as automatically “unlinked” from the 2019 Tesla Model S at the scene, meaning the local copy was marked for deletion, a standard practice for Teslas in such incidents, according to court testimony.

Wow...just wow.

replies(5): >>45063302 #>>45063632 #>>45063687 #>>45063980 #>>45064115 #
A4ET8a8uTh0_v2 ◴[] No.45063302[source]
I am trying to imagine a scenario under which that is defensible and does not raise compliance, legal, and retention questions. Not to mention: who were the people who put that code into production knowing it would do that?

edit: My point is that it was not one lone actor, who would have made that change.

replies(3): >>45063366 #>>45063389 #>>45064252 #
colejohnson66 ◴[] No.45063366[source]
Assuming no malice, I'd guess it's for space saving on the car's internal memory. If the data was uploaded off of the car, there’s no point keeping it in the car.
replies(5): >>45063520 #>>45063627 #>>45064037 #>>45064183 #>>45065363 #
1. wat10000 ◴[] No.45063520[source]
Sounds like a pretty standard telemetry upload. You transmit it, keep your copy until you get acknowledgement that it was received so you can retry if it went wrong, then delete it when it succeeds.

It’s just worded to make this sound sketchy. I bet ten bucks “unlinked” just refers to the standard POSIX call for deleting a file.
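The upload-then-acknowledge-then-unlink flow described above, as a rough sketch (all names here are hypothetical; this is not Tesla's actual code):

```python
import os

class FakeServer:
    """Stand-in for the telemetry backend (hypothetical)."""
    def __init__(self):
        self.received = []

    def send(self, payload: bytes) -> bool:
        self.received.append(payload)
        return True  # acknowledge receipt

def upload_snapshot(path: str, server: FakeServer) -> bool:
    # Keep the local copy until the server acknowledges receipt;
    # only then delete it with os.unlink() -- the POSIX call that
    # "unlinked" most plausibly refers to.
    with open(path, "rb") as f:
        payload = f.read()
    if server.send(payload):      # retry-on-failure logic omitted
        os.unlink(path)           # local copy removed only after ack
        return True
    return False                  # no ack: keep the file for retry
```

Nothing in that loop is specific to crashes; the same path would run for any telemetry file.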

replies(5): >>45063580 #>>45063611 #>>45063712 #>>45063718 #>>45063727 #
2. tobias3 ◴[] No.45063580[source]
Sketchy is that then someone takes “affirmative action to delete” the data on the server as well.

Also this is not like some process crash dump where the computer keeps running after one process crashed.

This would be like a plane's black box uploading its data to the manufacturer, then deleting itself after a crash.

replies(1): >>45063612 #
3. buran77 ◴[] No.45063611[source]
The process of collecting and uploading the data probably confuses a lot of non-technical readers even if it worked as per standard industry practices.

The real issue is that Tesla claimed the company doesn't have the data after every copy was deleted. There's no technical reason to dispose of data related to a crash when you hold so much data on all of the cars in general.

Crash data in particular should be considered sacred, especially given the severity in this case. Ideally it should be kept both on the local black box and on the servers. But anything that leads to it being treated as instantly disposable everywhere, or even just claiming it was deleted, can only be malice.

replies(2): >>45063918 #>>45064063 #
4. wat10000 ◴[] No.45063612[source]
I’ll bet another ten bucks that this is a generic implementation for all of their telemetry, not something special cased for crashes.

Deleting the data on the server is totally sketchy, but that’s not what the quoted section is about.

replies(2): >>45064051 #>>45065073 #
5. alistairSH ◴[] No.45063712[source]
That might be the case, but the article seems to indicate the system knew the data was generated from an accident. At that point, saving space on the car should be a secondary concern.
6. joshcryer ◴[] No.45063718[source]
The problem with this is that it destroys any chain of evidence. Tesla "lost" this data, in fact. You would never want the "black box" in your car to delete itself after uploading to some service, because the service could go down, be hacked, or the provider could decide to withhold it, forcing you into a lengthy discovery/custody battle.

This data is yours. You were going the speed limit when the accident happened and everyone else claims you were speeding. It would take forever to clear your name or worse you could be convicted if the data was lost.

This is more of "you will own nothing" crap. And mainly so Tesla can cover its ass.

7. aredox ◴[] No.45063727[source]
It is a car. A vehicle which can be involved in a fatal accident. It is not a website. There is no "oversight", nor is it "pretty standard" to do it like that: when you don't think about what your system is actually doing (and that is the most charitable explanation), YOU ARE STILL RESPONSIBLE AS IF YOU HAD DONE IT ON PURPOSE.
replies(1): >>45063780 #
8. wat10000 ◴[] No.45063780[source]
One of Tesla’s things is that their software is built by software people rather than by car people. This has advantages and disadvantages.

Maybe this is not appropriate for a car, but that doesn’t excuse the ridiculous breathless tone in the quoted text. It’s the worst purple prose making a boring system sound exciting and nefarious. They could have made your point without trying to make the unlink() call sound suspicious.

replies(4): >>45063958 #>>45064225 #>>45064231 #>>45064627 #
9. wat10000 ◴[] No.45063918[source]
> The real issue is that Tesla claimed the company doesn't have the data after every copy was deleted.

Exactly. The issue is deleting the data on the servers, not a completely mundane upload-then-delete procedure for phoning home. This should have been one sentence, but instead they make it read like a heist.

10. buran77 ◴[] No.45063958{3}[source]
> their software is built by software people rather than by car people

The rogue engineer defense worked so well for VW and Dieselgate.

The issue of missing crash data was raised repeatedly. Deleting or even just claiming it was deleted can only be a mistake the first time.

replies(1): >>45065786 #
11. dylan604 ◴[] No.45064051{3}[source]
Not handling an automobile crash as a special case is the weird part. Even the <$50 dashcams from Amazon have a feature to mark a recording as locked so the auto-delete logic does not touch the locked file. Some of them even have automatic collision detection which locks the file for you.

That Tesla could call detecting a collision and not locking any of the data "normal" is just insane.
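The event-lock behavior those cheap dashcams implement is simple. A rough sketch (hypothetical names, not any vendor's actual firmware):

```python
def files_to_delete(files, locked, keep):
    # Rotating cleanup: only the oldest *unlocked* recordings beyond
    # the retention budget may be removed; locked (event) files never are.
    deletable = [f for f in files if f not in locked]
    excess = len(deletable) - keep
    return deletable[:excess] if excess > 0 else []

def on_collision(locked, current_file):
    # A collision event locks the active recording, exempting it
    # from auto-deletion.
    locked.add(current_file)
```

The whole feature is one extra set-membership check in the cleanup path, which is what makes its absence hard to defend.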

replies(1): >>45064777 #
12. giancarlostoro ◴[] No.45064063[source]
> The real issue is that Tesla claimed the company doesn't have the data after every copy was deleted. There's no technical reason to dispose of data related to a crash when you hold so much data on all of the cars in general.

My money is on nobody having built a tool to look up the data, so they have it; they just can't easily find it.

replies(1): >>45064276 #
13. const_cast ◴[] No.45064225{3}[source]
There are software people who know what they're doing - some write flight software or medical equipment software. They know how to critically think about the processes of their systems in detail.

So either the problem is Tesla engineers are fucking stupid (doubtful) or this is a poor business/product design.

My money is on the latter.

14. throwway120385 ◴[] No.45064231{3}[source]
I'm a software person but I still take the car person approach when I know I'm building a car. You have a responsibility to understand the gravity of the enterprise you undertake and to take appropriate steps given that gravity. Ignorance shouldn't be a defense, and if you don't know what you don't know then god help you.
15. ◴[] No.45064276{3}[source]
16. kergonath ◴[] No.45064627{3}[source]
> One of Tesla’s things is that their software is built by software people rather than by car people. This has advantages and disadvantages.

So we just shrug because software boys gotta be software boys? That’s completely insane and a big reason why a lot of engineers roll their eyes about developers who want to be considered engineers.

Software engineers who work on projects that can kill people must act like the lives of other people depend on them doing their job seriously, because that is the case. Look at the aviation industry. Is it acceptable to have a bug in the avionics suite that downs planes at random and then deletes the black boxes? It absolutely is not, and when anything like that happens shit gets serious (think 737 MAX).

The developers who designed the systems are responsible, and so are their managers who approved the changes, all the way to the top. This would not happen in a company with appropriate processes in place.

replies(1): >>45066067 #
17. immibis ◴[] No.45064777{4}[source]
That one's easy: nobody at Tesla cares about having this feature
18. pjob ◴[] No.45065073{3}[source]
That might not be a good bet. https://news.ycombinator.com/item?id=45063380
replies(1): >>45066085 #
19. wat10000 ◴[] No.45065786{4}[source]
I really should know better than to think that I can criticize a small part of an article without a bunch of people thinking that I'm defending everything the article discusses.
20. wat10000 ◴[] No.45066067{4}[source]
I completely agree about responsibility for life-critical systems. I wouldn't put this in that category, though. Even on airliners, black boxes aren't treated quite as critically as the stuff that'll kill you then and there. Consider the recent crash in Korea where the black box shut off because it was designed without any backup power if the engines failed, or the Alaska Airlines flight where the voice recording was overwritten because it wasn't shut off after landing.

I'd argue that this data is far less important in cars. Airline safety has advanced to the point where crashes are extremely rare and usually have a novel cause. Data recorders are important to be able to learn that cause and figure out how to prevent it from happening again. Car safety, on the other hand, is shit. We don't require rigorous training for the operators. Regulations are lax, and enforcement even more lax. Infrastructure is poor. We're unwilling to fix these things. Almost all safety efforts focus on making the vehicles more robust when collisions occur, and we're just starting to see some effort put into making the vehicles automatically avoid some collisions.

What are we going to learn from this data in cars? "Driver didn't stop for a red light, hit cross traffic." "Driver was drunk." "Driver failed to see pedestrian because of bad intersection design which has been known for fifty years and never been fixed." It's useful for assigning liability but not very useful for saving lives. There's a ton of lower hanging fruit to go after before you start combing through vehicle telemetry to find unknown problems.

Even if you do consider it to be life-critical, uploading the data and then deleting the local copy once receipt is acknowledged seems completely fine, if the server infrastructure is solid. Better than only keeping a local copy, even. The issue there is that they either have inadequate controls allowing data to be deleted, or inadequate ability to retrieve data.

21. wat10000 ◴[] No.45066085{4}[source]
I don't see anything in that comment that would apply to what I said.