
398 points by djoldman | 1 comment
lukev No.42071345
There's something missing from this discussion.

What really matters isn't how secure this is on an absolute scale, or how much one can trust Apple.

Rather, we should weigh this against what other cloud providers offer.

The status quo for every other provider is: "this data is just lying around on our servers; the only thing preventing an employee from accessing it is that it would be a violation of policy (and might be caught in an internal audit)." Most providers also carve out several cases where they can look at your data for support, debugging, or analytics purposes.

So even though the punchline of "you still need to trust Apple" is technically true, this is qualitatively different, because what would need to happen for Apple to break its promises is far more drastic. For another service to leak your data, all it takes is one employee doing something they shouldn't. For Apple, it would require a deliberate compromise of the entire stack, down to the hardware.

That is much harder to pull off and much more difficult to hide, so Apple's security posture is qualitatively better than Google's, Meta's, or Microsoft's.

If you want to keep your data local and trust no one, sure, fine: then you don't need to trust anyone at all. But presuming you (a) are going to use cloud services and (b) care about privacy, Apple has a compelling value proposition.

replies(7): >>42072229 #>>42073673 #>>42073693 #>>42074841 #>>42075160 #>>42075432 #>>42078451 #
harry8 No.42072229
> Apple has a compelling value proposition.

No. Apple has a proposition that /may/ be better than the current alternatives.

replies(1): >>42073046 #
lukev No.42073046
If Apple is doing what they say they are, it is in fact better. No maybe about it.

If they’re not, that means they are actively and intentionally deceiving the public security community that they are inviting to audit it.

Is that something you actually think is happening? I think we need to be clear here.

Your threat model may or may not be covered by the guarantees they are able to document, but just saying “well maybe they’re still doing some unspecified nefarious thing” is not contributing to the discussion.

Especially when none of the alternatives are even trying.

replies(2): >>42073177 #>>42075506 #
talldayo No.42073177
Honestly, I think this is a disingenuous defense. It's not insane to look at a closed-source project being partially audited by cherry-picked organizations and say "that's not a very secure or trustworthy process." There is no real accountability being offered to the community. It's like Ford selecting private safety inspectors to tell customers how great its safety record is while conveniently leaving out the results of its federally mandated crash tests. Is this really helping customers, or is it just blatant, masturbatory marketing?

Apple has worked to deceive the public before, in ways both small and large. They lied about backdooring notifications for the US government when asked about it[0], so it's not hard to imagine the same thing happening elsewhere in their systems. They're not taking a traditional approach to software transparency, which is suspicious, and by their own account their "threat model" has not protected against motivated requests for identifying information[1].

When the Mechanical Turk attempted to fool the commoners watching it work, it was imperative to hide every trace of the human inside. The candle used to see inside the machine was masked by smoke from candles placed around the room, the cabinet was kept locked to prevent accidental opening, and onlookers were told not to touch it because it was supposedly 'expensive and fragile'. Looks like Apple is the ringleader this time around.

> but just saying “well maybe they’re still doing some unspecified nefarious thing” is not contributing to the discussion.

But Apple is saying the opposite: "well, maybe we are doing the secure thing; ask these people we hired," and you're praising them for it. If calling out obvious logical fallacies isn't contributing, then how are we supposed to argue inside the Reality Distortion Field? Do we make-believe that Apple's stated preconditions are true, or can we raise concerns about the outstanding issues? I don't understand why these real-world flaws are somehow off-limits in conversation. You're allowed to hold Apple to adversarial levels of scrutiny if you take security seriously.

> Especially when none of the alternatives are even trying.

Apple is the largest company in the world, and by many metrics (and points of comparison) it isn't even doing the bare minimum to maintain public trust. Whenever you are shown a whitepaper without the means to validate its contents yourself, you are being fed what tech circles call "marketing." You don't have to feel bad about being tricked, though; it's the same thing that fools investors and overly faithful engineers. Whitepapers are whitepapers, handpicked security audits are handpicked security audits, and code is code. There is no blurring of the lines.

[0] https://arstechnica.com/tech-policy/2023/12/apple-admits-to-...

[1] https://www.apple.com/legal/transparency/us.html

replies(3): >>42073255 #>>42076310 #>>42078391 #
abalone No.42078391
Your concern is that the hardware auditors are not trustworthy because Apple hired them?

I mean that’s fair, but I don’t think the goal here is to offer that level of guarantee. For example, their ceremony involves people from three other Apple organizational units, plus the auditor; it’s mostly Apple doing the certification. They’re not trying to guard too heavily against the “I don’t trust that Apple isn’t trying to fool me” concern.

What this does protect you from is things like a rogue internal actor, a software vulnerability, or a government subpoena. The PCC nodes are “airtight” and provably do not retain or share data. This is auditable by the whole security community, and clients can verify that they are communicating with an auditable binary. It’s not just a white paper.
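
To make the verification step concrete, here is a minimal sketch of what a client-side attestation check like that amounts to, in Swift. Every name below is an illustrative assumption, not Apple's actual PCC API, and the real flow is considerably more involved:

    // Hypothetical client-side attestation check. All types and names
    // are illustrative; this is not Apple's real API.
    import CryptoKit
    import Foundation

    struct NodeAttestation {
        let binaryMeasurement: Data  // hash of the software image the node booted
        let signature: Data          // that hash, signed by the node's hardware key
    }

    // Stand-in for a public, append-only log of measurements for the
    // published PCC images that researchers can audit. A real client
    // would fetch and verify a signed copy of such a log.
    func loadPublishedMeasurements() -> Set<Data> {
        return []  // stub
    }

    func shouldSendData(to att: NodeAttestation,
                        hardwareKey: P256.Signing.PublicKey) -> Bool {
        // 1. The node must be running a binary the security community can inspect.
        guard loadPublishedMeasurements().contains(att.binaryMeasurement),
              let sig = try? P256.Signing.ECDSASignature(rawRepresentation: att.signature)
        else { return false }
        // 2. The hardware root of trust must vouch that this binary actually booted.
        return hardwareKey.isValidSignature(sig, for: att.binaryMeasurement)
    }

A client that refuses to talk to any node failing either check never hands its data to a binary nobody can inspect, which is the substance of the "not just a white paper" point.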

That’s an enormous step up from the status quo.