
295 points djoldman | 3 comments
jagrsw ◴[] No.42062732[source]
If Apple controls the root of trust, like the private keys in the CPU or security processor used to check the enclave (similar to how AMD and Intel do it with SEV-SNP and TDX), then technically it's a "trust us" situation, since they likely use their own ARM silicon for that?

Harder to attack, sure, but no outside validation. Apple's not saying "we can't access your data," just "we're making it way harder for bad guys (and rogue employees) to get at it."

replies(5): >>42062974 #>>42063040 #>>42063051 #>>42064261 #>>42065655 #
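A minimal sketch of the "root of trust" point above, assuming a generic hardware-attestation scheme rather than Apple's actual protocol (all names are hypothetical; it uses the pyca/cryptography package): every verification step ultimately terminates at a key the silicon vendor controls.

```python
# Illustrative only: a generic hardware-attestation check, not Apple's protocol.
# The point: verification bottoms out at a root key the vendor controls.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Simulate the vendor's root key pair. In real silicon the private half stays
# with the vendor; the public half is baked into verifiers (phones, here).
vendor_root_private = Ed25519PrivateKey.generate()
VENDOR_ROOT_PUBLIC = vendor_root_private.public_key()

def verify_attestation(measurement: bytes, signature: bytes) -> bool:
    """Accept a server's software measurement only if the vendor root signed it."""
    try:
        VENDOR_ROOT_PUBLIC.verify(signature, measurement)
        return True
    except InvalidSignature:
        return False

# The math is sound, but only relative to a key the vendor alone holds:
# there is no authority above this root to validate it externally.
measurement = b"sha384-of-approved-server-image"
print(verify_attestation(measurement, vendor_root_private.sign(measurement)))        # True
print(verify_attestation(b"tampered-image", vendor_root_private.sign(measurement)))  # False
```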
skylerwiernik ◴[] No.42063040[source]
I don't think they do. Your phone cryptographically verifies that the software running on the servers is what it says it is, and you can't pull the keys out of the secure enclave. They also had independent auditors go over the whole thing and publish a report. If the chip is disconnected from the system, it destroys its keys, effectively erasing all data.
replies(3): >>42063402 #>>42063626 #>>42065085 #
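One plausible shape for the client-side check described above, sketched with assumed names and values (not Apple's code): accept a node only if its attested software measurement matches one of the measurements published for independent audit.

```python
# Sketch: the phone rejects any node whose attested build measurement is not
# in the published, auditable release set. Names and digests are hypothetical.
import hashlib

# Hypothetical digests of server images listed in a public release log.
PUBLISHED_MEASUREMENTS = {
    hashlib.sha384(b"pcc-release-1.0").hexdigest(),
    hashlib.sha384(b"pcc-release-1.1").hexdigest(),
}

def node_is_acceptable(attested_measurement_hex: str) -> bool:
    """Proceed only if the node attests to a published, auditable build."""
    return attested_measurement_hex in PUBLISHED_MEASUREMENTS

# A node attesting to an unpublished (possibly modified) build is refused.
print(node_is_acceptable(hashlib.sha384(b"pcc-release-1.1").hexdigest()))  # True
print(node_is_acceptable(hashlib.sha384(b"modified-build").hexdigest()))   # False
```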
1. plagiarist ◴[] No.42063402[source]
I don't understand how publishing cryptographic signatures of the software is a guarantee. How do they prove a server isn't just keeping a copy of the approved code to produce the signatures while actually running a malicious binary?
replies(1): >>42065128 #
2. dialup_sounds ◴[] No.42065128[source]
The client will only talk to servers that can prove they're running the same software as the published signatures.

https://security.apple.com/documentation/private-cloud-compu...

replies(1): >>42066073 #
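Putting the two pieces together, the gate the linked PCC documentation describes at a high level looks roughly like the sketch below (hypothetical names, not Apple's actual API or wire format): verify that a hardware-backed key signed the node's boot-time measurement, check that measurement against the published set, and only then release the request.

```python
# Composite sketch of the client-side gate: attestation-signature check plus
# published-measurement check before any user data is sent. The node key is
# assumed to already be certified by the vendor's root of trust.
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey, Ed25519PublicKey,
)
from cryptography.exceptions import InvalidSignature

PUBLISHED_MEASUREMENTS = {hashlib.sha384(b"pcc-release-1.1").digest()}

def client_will_talk_to(node_key: Ed25519PublicKey,
                        measurement: bytes, signature: bytes) -> bool:
    try:
        node_key.verify(signature, measurement)    # proof from secure hardware
    except InvalidSignature:
        return False
    return measurement in PUBLISHED_MEASUREMENTS   # matches a published build

# Demo: a node whose enclave key signs its boot-time measurement is accepted
# only when that measurement is also in the published set.
node_key = Ed25519PrivateKey.generate()
good = hashlib.sha384(b"pcc-release-1.1").digest()
bad = hashlib.sha384(b"secret-debug-build").digest()
print(client_will_talk_to(node_key.public_key(), good, node_key.sign(good)))  # True
print(client_will_talk_to(node_key.public_key(), bad, node_key.sign(bad)))    # False
```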
3. warkdarrior ◴[] No.42066073[source]
And the servers prove that by relying on a key stored in secure hardware. And that secure hardware is designed by Apple, who has a specific interest in convincing users of that attestation/proof. Do you see the conflict of interest now?