    398 points djoldman | 22 comments
    1. jagrsw ◴[] No.42062732[source]
    If Apple controls the root of trust, i.e. the private keys in the CPU or security processor used to attest the enclave (similar to how Intel and AMD do it with SEV-SNP and TDX), then technically it's a "trust us" situation, since they presumably use their own ARM silicon for that?

    Harder to attack, sure, but no outside validation. Apple's not saying "we can't access your data," just "we're making it way harder for bad guys (and rogue employees) to get at it."

    replies(6): >>42062974 #>>42063040 #>>42063051 #>>42064261 #>>42065655 #>>42078881 #
    2. ozgune ◴[] No.42062974[source]
    +1 on your comment.

    I think having a description of Apple's threat model would help.

    I was thinking that open source would help with their verifiable privacy promise. Then again, as you've said, if Apple controls the root of trust, they control everything.

    replies(2): >>42063861 #>>42063907 #
    3. skylerwiernik ◴[] No.42063040[source]
    I don't think they do. Your phone cryptographically verifies that the software running on the servers is what it says it is, and you can't pull the keys out of the secure enclave. They also had independent auditors go over the whole thing and publish a report. If the chip is disconnected from the system, it will dump its keys and essentially erase all data.
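
    A rough sketch of what that client-side check amounts to (toy Python with invented names, and an HMAC standing in for the hardware-rooted signature; not Apple's actual protocol): the phone only talks to a node whose attested software measurement is both properly signed and present in the published transparency log.

        import hashlib
        import hmac

        # Toy illustration only: an HMAC key stands in for the attestation key
        # rooted in the server's secure element, and the "transparency log" is
        # just a set of digests the client fetched out of band.
        TRUSTED_ATTESTATION_KEY = b"stand-in-for-hardware-rooted-key"
        TRANSPARENCY_LOG = {
            hashlib.sha256(b"pcc-node-release-1").hexdigest(),
        }

        def accept_node(measurement: bytes, tag: bytes) -> bool:
            # 1) the measurement must be signed by a key the client trusts
            expected = hmac.new(TRUSTED_ATTESTATION_KEY, measurement, hashlib.sha256).digest()
            signed_ok = hmac.compare_digest(expected, tag)
            # 2) the measurement must appear in the published log of known builds
            logged_ok = hashlib.sha256(measurement).hexdigest() in TRANSPARENCY_LOG
            return signed_ok and logged_ok

    The point is that the accept/reject decision happens on the phone rather than being taken on faith from the server.
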
    replies(4): >>42063402 #>>42063626 #>>42065085 #>>42073964 #
    4. ant_li0n ◴[] No.42063051[source]
    Hey, can you help me understand what you mean? There's an entry about "Hardware Root of Trust" in that document, but I don't see how it follows that Apple is avoiding saying "we can't access your data"; the doc says the key isn't exportable.

    "Explain it like I'm a lowly web dev"

    replies(1): >>42063396 #
    5. jolan ◴[] No.42063396[source]
    https://x.com/_saagarjha/status/1804130898482466923

    https://x.com/frogandtoadbook/status/1734575421792920018

    6. plagiarist ◴[] No.42063402[source]
    I don't understand how publishing cryptographic signatures of the software is a guarantee. How do they prove the server isn't just keeping a copy of the signed code around to produce signatures from while actually running a malicious binary?
    replies(1): >>42065128 #
    7. HeatrayEnjoyer ◴[] No.42063626[source]
    How do you know the root enclave key isn't retained somewhere before it is written? You're still trusting Apple.

    Key extraction is difficult but not impossible.

    replies(3): >>42063692 #>>42067336 #>>42078961 #
    8. jsheard ◴[] No.42063692{3}[source]
    > Key extraction is difficult but not impossible.

    Refer to the never-ending clown show that is Intel's SGX enclave for examples of this.

    https://en.wikipedia.org/wiki/Software_Guard_Extensions#List...

    9. bootsmann ◴[] No.42063861[source]
    They define their threat model in "Anticipating Attacks"
    10. dagmx ◴[] No.42063907[source]
    Their threat model is described in their white papers.

    But essentially it is trying to get to the end result of “if someone commandeers the building with the servers, they still can’t compromise the data chain even with physical access”

    11. wutwutwat ◴[] No.42064261[source]
    every entity you hand data to other than yourself is a "trust us" situation
    replies(1): >>42067240 #
    12. hnaccount_rng ◴[] No.42065085[source]
    But since they also control the phone's operating system, they can just make it lie to you!

    That doesn't make PCC useless, by the way. It clearly establishes that Apple misled customers if there was any intentionality in a breach, or that Apple was negligent if they do not immediately provide remedies on notification of a breach. But that's much more a "raising the cost" kind of thing than a technical exclusion. If Apple, as an organisation, decides it wants your data, and you use an iPhone, they absolutely can get it.

    13. dialup_sounds ◴[] No.42065128{3}[source]
    The client will only talk to servers that can prove they're running software matching the published signatures.

    https://security.apple.com/documentation/private-cloud-compu...

    replies(1): >>42066073 #
    14. SheinhardtWigCo ◴[] No.42065655[source]
    It was always "trust us". They make the silicon, and you have no hope of meaningfully reverse engineering it. Plus, iOS and macOS have silent software update mechanisms, and no update transparency.
    15. warkdarrior ◴[] No.42066073{4}[source]
    And the servers prove that by relying on a key stored in secure hardware. And that secure hardware is designed by Apple, who has a specific interest in convincing users of that attestation/proof. Do you see the conflict of interest now?
    16. fsflover ◴[] No.42067240[source]
    Unless it's encrypted.
    replies(1): >>42070049 #
    17. yalogin ◴[] No.42067336{3}[source]
    Can you clarify what you mean by retained and written?
    18. wutwutwat ◴[] No.42070049{3}[source]
    you trust more than I do
    replies(1): >>42077689 #
    19. lmm ◴[] No.42073964[source]
    > you can't pull the keys out of the secure enclave.

    You or I can't, but that doesn't mean Apple can't. They made that enclave, and put the keys in it in the first place.

    20. fsflover ◴[] No.42077689{4}[source]
    How so?
    21. abalone ◴[] No.42078881[source]
    > Harder to attack, sure, but no outside validation.

    There is actually a third party auditor involved in certifying hardware integrity prior to deployment.[1]

    But yes, the goal is to protect against rogue agents and hackers (and software bugs!), not to prove that Apple as an organization hasn't fundamentally designed backdoors into the secure element of its silicon.

    [1] https://security.apple.com/documentation/private-cloud-compu...

    22. abalone ◴[] No.42078961{3}[source]
    According to Apple,

    "A randomly generated UID is fused into the SoC at manufacturing time. Starting with A9 SoCs, the UID is generated by the Secure Enclave TRNG during manufacturing and written to the fuses using a software process that runs entirely in the Secure Enclave. This process protects the UID from being visible outside the device during manufacturing and therefore isn’t available for access or storage by Apple or any of its suppliers."[1]

    But yes of course, you have to trust the manufacturer is not lying to you. PCC is about building on top of that fundamental trust to guard against a whole variety of other attacks.

    [1] https://support.apple.com/guide/security/secure-enclave-sec5...
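
    For intuition, here's a toy model of that design (plain Python, nothing Apple-specific): the device-unique secret is generated inside the "enclave" and never handed out; callers only ever see keys derived from it.

        import hashlib
        import hmac
        import os

        # Toy model: the fused, TRNG-generated UID is represented by a private
        # field the class never exposes; only purpose-specific derived keys
        # leave the "enclave".
        class ToyEnclave:
            def __init__(self) -> None:
                self._uid = os.urandom(32)  # analogue of the UID burned into fuses

            def derive_key(self, purpose: bytes) -> bytes:
                # derived keys don't let the caller recover the UID itself
                return hmac.new(self._uid, purpose, hashlib.sha256).digest()

        enclave = ToyEnclave()
        file_key = enclave.derive_key(b"file-encryption")

    Whether the real silicon actually behaves like this is exactly the part you still have to take on trust.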