
650 points by clcaev | 3 comments
fabian2k No.45063298
Do I understand this correctly? Crash data gets automatically transmitted to Tesla, and once it has been transmitted, it is immediately marked for deletion?

If it is actually designed like this, the only reason I can see for it is that Tesla gets sole access to the data and can decide whether or not to use it. That really shouldn't hold up in court, but it seems it has so far.

And of course I'd expect an audit trail for the deletion of crash data on Tesla's servers. But who knows whether one actually exists, or whether anyone has looked into it at all.
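To make that expectation concrete, here's a toy sketch (Python, everything invented, nothing based on Tesla's actual systems) of an append-only, hash-chained audit trail for deletion events. Rewriting or silently dropping an entry breaks the chain, so tampering is at least detectable:

    import hashlib
    import json
    import time

    # Hypothetical append-only audit log: each deletion event is chained to
    # the previous one by hash, so entries can't be altered or removed quietly.
    def append_deletion_event(log, record_id, actor, reason):
        prev_hash = log[-1]["entry_hash"] if log else "0" * 64
        entry = {
            "record_id": record_id,   # which crash-data record was deleted
            "actor": actor,           # who or what triggered the deletion
            "reason": reason,
            "timestamp": time.time(),
            "prev_hash": prev_hash,
        }
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["entry_hash"] = hashlib.sha256(payload).hexdigest()
        log.append(entry)
        return entry

    # Anyone holding the log can re-derive the chain and spot tampering.
    def verify_chain(log):
        prev_hash = "0" * 64
        for entry in log:
            body = {k: v for k, v in entry.items() if k != "entry_hash"}
            if body["prev_hash"] != prev_hash:
                return False
            payload = json.dumps(body, sort_keys=True).encode()
            if hashlib.sha256(payload).hexdigest() != entry["entry_hash"]:
                return False
            prev_hash = entry["entry_hash"]
        return True

    log = []
    append_deletion_event(log, "crash-2024-0001", "retention-policy", "post-upload cleanup")
    print(verify_chain(log))  # True; editing any entry flips this to False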

lgeorget No.45063617
I guess one charitable way to look at it is that after a crash, external people could get access to the car and its memory, which could expose private data about the owner/driver. And besides private data, if data about the car's condition were leaked to the public, it could be made to say anything depending on who presents it and how. So it's safer for the investigation if only appointed experts in the field have access to it.

This is not unlike what happens for flight data recorders after a crash. The raw data is not made public right away, if ever.

fabian2k No.45063651
If Tesla securely stored this data and reliably turned it over to the authorities, I wouldn't argue much with this.

But the data was mostly unprotected on the devices; otherwise it couldn't have been recovered. And Tesla isn't exactly known for respecting the privacy of its customers; it has announced details about accidents publicly before.
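By securely stored I mean something along these lines, purely as an illustration (this sketch assumes the PyNaCl library; none of it reflects how Tesla actually stores anything): the car encrypts each crash record against a public key, so only whoever holds the matching private key, say an appointed investigator, can read it back off the device.

    from nacl.public import PrivateKey, SealedBox

    # Hypothetical key pair held by whoever is authorized to read crash data.
    investigator_key = PrivateKey.generate()

    # On the car: records are sealed to the public key only, so the device
    # itself (and anyone who pulls its storage) can't decrypt what it wrote.
    def store_crash_record(plaintext: bytes) -> bytes:
        return SealedBox(investigator_key.public_key).encrypt(plaintext)

    # Off the car: only the private-key holder can recover the record.
    def read_crash_record(ciphertext: bytes) -> bytes:
        return SealedBox(investigator_key).decrypt(ciphertext)

    blob = store_crash_record(b'{"speed_kph": 87, "autopilot": true}')
    print(read_crash_record(blob))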

And there is a potential conflict of interest: Tesla has a strong incentive to "lose" data that implicates Autopilot or FSD.

sanex No.45063764
Personally, I would rather my car not automatically rat me out to the authorities.
souterrain No.45064171
Your property isn't ratting you out. The software you license from Tesla is ratting you out.
salawat No.45064380
Such a pity there is no way to get an electronics-minimal car control unit. Funny how conspicuously that functionality goes unimplemented.
MetaWhirledPeas No.45066277
When you move to an electric drivetrain you quickly realize you need computers for things like battery conditioning, efficiency, forward/reverse, charging, route planning, stop/start, and on and on. It's not as simple as engine on, engine off. Tesla (rightly, IMO) chose to lean into this. It will be interesting to see what a company like Slate chooses to do.
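To make "battery conditioning" concrete, even that one item is a software control loop; here's a toy sketch (Python, invented thresholds, not any real BMS):

    # Toy battery-conditioning loop: read cell temperature, decide on
    # heating/cooling, and limit charge power. Thresholds are made up.
    def condition_battery(cell_temp_c: float, requested_charge_kw: float):
        if cell_temp_c < 0:
            heater, cooler = True, False
            charge_kw = 0.0   # charging lithium cells below freezing damages them
        elif cell_temp_c < 15:
            heater, cooler = True, False
            charge_kw = min(requested_charge_kw, 30.0)   # taper while cold
        elif cell_temp_c > 45:
            heater, cooler = False, True
            charge_kw = min(requested_charge_kw, 20.0)   # protect the pack when hot
        else:
            heater, cooler = False, False
            charge_kw = requested_charge_kw
        return heater, cooler, charge_kw

    print(condition_battery(-5.0, 120.0))   # (True, False, 0.0)
    print(condition_battery(25.0, 120.0))   # (False, False, 120.0)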
salawat No.45067086
Note I said minimal. If manufacturers were content to restrict the integrated circuits to those purposes, without widespread telemetry, phoning home, or software lockouts, we'd meet my definition of minimal: just what it takes to make a functioning device. Instead, we see software used as a load-bearing support for predatory, exploitative, surveillance-oriented architectures. That is not minimal to me.
SR2Z No.45068363
IMO the rules should be simple: manufacturers of electronics should be required to provide the private keys for the hardware, plus source-available MVP firmware that gets the thing working.
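As a sketch of what that would enable (Python, using the PyNaCl library; purely illustrative, not any vendor's actual boot chain): with the key in hand, the owner signs their own build of the MVP firmware, and the bootloader accepts it exactly as it would the vendor's.

    from nacl.signing import SigningKey

    # Hypothetical: the signing key the manufacturer would be required to
    # hand over alongside the source-available MVP firmware.
    owner_key = SigningKey.generate()

    firmware_image = b"owner's build of the minimal firmware"
    signed_image = owner_key.sign(firmware_image)   # what the owner flashes

    # Bootloader side: verify against the published verify key before booting;
    # raises BadSignatureError if the image was tampered with.
    owner_key.verify_key.verify(signed_image)
    print("firmware signature OK")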

I don't care if GM or whoever wants to ship a buggy, ad-ridden, data-siphoning, subscription-filled nightmare with new cars. That's their decision. But they should be banned from trying to exercise any kind of control over a piece of hardware that I own outright.