
650 points clcaev | 1 comment | source
metaphor ◴[] No.45063162[source]
> Immediately after the wreck at 9:14 p.m. on April 25, 2019, the crucial data detailing how it unfolded was automatically uploaded to the company’s servers and stored in a vast central database, according to court documents. Tesla’s headquarters soon sent an automated message back to the car confirming that it had received the collision snapshot.

> Moments later, court records show, the data was just as automatically “unlinked” from the 2019 Tesla Model S at the scene, meaning the local copy was marked for deletion, a standard practice for Teslas in such incidents, according to court testimony.

Wow...just wow.

replies(5): >>45063302 #>>45063632 #>>45063687 #>>45063980 #>>45064115 #
A4ET8a8uTh0_v2 ◴[] No.45063302[source]
I am trying to imagine a scenario under which that is defensible and does not raise questions about compliance, legal exposure, and retention. Not to mention: who were the people who put that code into production knowing what it would do?

edit: My point is that it would not have been one lone actor who made that change.

replies(3): >>45063366 #>>45063389 #>>45064252 #
colejohnson66 ◴[] No.45063366[source]
Assuming no malice, I'd guess it's to save space in the car's internal memory. If the data has been uploaded off the car, there's no point keeping a local copy.
replies(5): >>45063520 #>>45063627 #>>45064037 #>>45064183 #>>45065363 #
wat10000 ◴[] No.45063520[source]
Sounds like a pretty standard telemetry upload: you transmit the data, keep your copy until you get an acknowledgement that it was received (so you can retry if something went wrong), then delete it once the upload succeeds.

It’s just worded to make this sound sketchy. I bet ten bucks “unlinked” just refers to the standard POSIX call for deleting a file.
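
A minimal sketch of that flow in C, with stubbed-out upload_snapshot() and server_acked() helpers (those names are my invention, not anything from Tesla's code):

    #include <unistd.h>

    /* Stubbed transport: a real system would POST the file and parse
       the server's acknowledgement. These always "succeed" here. */
    static int upload_snapshot(const char *path) { (void)path; return 0; }
    static int server_acked(const char *path)    { (void)path; return 1; }

    /* Transmit, wait for the acknowledgement, and only then delete
       the local copy -- the retry-safe pattern described above. */
    static int flush_snapshot(const char *path) {
        if (upload_snapshot(path) != 0)
            return -1;          /* transmit failed: keep the copy, retry later */
        if (!server_acked(path))
            return -1;          /* no ack yet: keep the copy */
        return unlink(path);    /* the POSIX unlink() in question */
    }

    int main(void) {
        return flush_snapshot("/tmp/collision.snap") == 0 ? 0 : 1;
    }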

replies(5): >>45063580 #>>45063611 #>>45063712 #>>45063718 #>>45063727 #
aredox ◴[] No.45063727[source]
It is a car. A vehicle that can be involved in a fatal accident. It is not a website. There is no "oversight", nor is it "pretty standard" to do it like that: when you don't think about what your system is actually doing (and that is the most charitable explanation), YOU ARE STILL RESPONSIBLE AS IF YOU HAD DONE IT ON PURPOSE.
replies(1): >>45063780 #
wat10000 ◴[] No.45063780{3}[source]
One of Tesla’s things is that their software is built by software people rather than by car people. This has advantages and disadvantages.

Maybe this is not appropriate for a car, but that doesn’t excuse the ridiculous breathless tone of the quoted text. It’s the worst kind of purple prose, making a boring system sound exciting and nefarious. They could have made your point without trying to make an unlink() call sound suspicious.

replies(4): >>45063958 #>>45064225 #>>45064231 #>>45064627 #
kergonath ◴[] No.45064627{4}[source]
> One of Tesla’s things is that their software is built by software people rather than by car people. This has advantages and disadvantages.

So we just shrug because software boys gotta be software boys? That’s completely insane, and a big reason why a lot of engineers roll their eyes at developers who want to be considered engineers.

Software engineers who work on projects that can kill people must act as if the lives of other people depend on them doing their job seriously, because they do. Look at the aviation industry. Is it acceptable for a bug in the avionics suite to down planes at random, and then for the black boxes to be deleted? It absolutely is not, and when anything like that happens, shit gets serious (think 737 MAX).

The developers who designed these systems are responsible, and so are the managers who approved the changes, all the way to the top. This would not happen in a company with appropriate processes in place.

replies(1): >>45066067 #
wat10000 ◴[] No.45066067{5}[source]
I completely agree about responsibility for life-critical systems. I wouldn't put this in that category, though. Even on airliners, black boxes aren't treated quite as critically as the stuff that'll kill you then and there. Consider the recent crash in Korea, where the black boxes stopped recording because they had no backup power once the engines failed, or the Alaska Airlines flight where the cockpit voice recording was overwritten because the recorder wasn't stopped after landing.

I'd argue that this data is far less important in cars. Airline safety has advanced to the point where crashes are extremely rare and usually have a novel cause. Data recorders are important for learning that cause and figuring out how to prevent it from happening again.

Car safety, on the other hand, is shit. We don't require rigorous training for the operators. Regulations are lax, and enforcement is even laxer. Infrastructure is poor. We're unwilling to fix these things. Almost all safety efforts focus on making the vehicles more robust when collisions occur, and we're just starting to see some effort put into making vehicles automatically avoid some collisions.

What are we going to learn from this data in cars? "Driver didn't stop for a red light, hit cross traffic." "Driver was drunk." "Driver failed to see pedestrian because of bad intersection design which has been known about for fifty years and never fixed." It's useful for assigning liability but not very useful for saving lives. There's a ton of lower-hanging fruit to go after before you start combing through vehicle telemetry to find unknown problems.

Even if you do consider it life-critical, uploading the data and then deleting the local copy once receipt is acknowledged seems completely fine, if the server infrastructure is solid. Better than only keeping a local copy, even. The issue there is that they either have inadequate controls, allowing data to be deleted, or an inadequate ability to retrieve it.
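
To sketch the kind of control that was missing, here's a purge routine that refuses to delete anything under a hypothetical legal_hold flag (the record type and field names are assumptions of mine, not a real schema):

    #include <stdbool.h>
    #include <stdio.h>

    /* Hypothetical server-side record; illustrative only. */
    typedef struct {
        const char *id;
        bool legal_hold;   /* set while litigation or an investigation is pending */
    } snapshot_record;

    /* Refuse to purge any record flagged for retention, and log the
       refusal so it can be audited later. */
    static bool purge_snapshot(const snapshot_record *rec) {
        if (rec->legal_hold) {
            fprintf(stderr, "refusing to purge %s: legal hold\n", rec->id);
            return false;
        }
        /* ...delete the record from the database here... */
        return true;
    }

    int main(void) {
        snapshot_record rec = { "collision-2019-04-25", true };
        return purge_snapshot(&rec) ? 0 : 1;
    }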