333 points djoldman | 7 comments

lukev ◴[] No.42071345[source]
There's something missing from this discussion.

What really matters isn't how secure this is on an absolute scale, or how much one can trust Apple.

Rather, we should weigh this against what other cloud providers offer.

The status quo for every other provider is: "this data is just lying around on our servers. The only thing preventing an employee from accessing it is that doing so would violate policy (and might be caught in an internal audit)." Most providers also carve out several cases where they can look at your data for support, debugging, or analytics purposes.

So even though the punchline of "you still need to trust Apple" is technically true, this is qualitatively different, because what would need to occur for Apple to break their promises here is so much more drastic. For other services to leak your data, all it takes is one employee doing something they shouldn't. For Apple, it would require a deliberate compromise of the entire stack, down to the hardware level.

This is much harder to pull off and much more difficult to hide, so Apple's security posture is qualitatively better than Google's, Meta's, or Microsoft's.

If you want to keep your data local and trust no one, sure, fine; then you don't need to trust anyone at all. But presuming you (a) are going to use cloud services and (b) care about privacy, Apple has a compelling value proposition.
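To make the contrast concrete, here's a rough sketch (in Swift, with invented names; this isn't Apple's actual API, just the shape of the attestation scheme as they describe it):

    import Foundation

    // Status quo: nothing technical gates the upload; policy is the only control.
    func statusQuoUpload(_ data: Data) {
        // send(data) -- the provider can read it, and only policy says they won't
    }

    struct Attestation {
        let softwareMeasurement: String  // hash of the node's released software image
    }

    struct TransparencyLog {
        let publishedMeasurements: Set<String>  // measurements researchers can audit
    }

    enum UploadError: Error { case unverifiedNode }

    // PCC-style: the client checks the node's attested measurement against
    // the public log before any data leaves the device.
    func attestedUpload(_ data: Data, node: Attestation, log: TransparencyLog) throws {
        guard log.publishedMeasurements.contains(node.softwareMeasurement) else {
            throw UploadError.unverifiedNode  // unknown build: refuse to send
        }
        // send(data) -- only reachable for software the public can inspect
    }

The difference: in the first function, misuse is a policy matter; in the second, it requires shipping signed, published software that does the wrong thing, where auditors have a chance to catch it.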

replies(3): >>42072229 #>>42073673 #>>42073693 #
harry8 ◴[] No.42072229[source]
> Apple has a compelling value proposition.

No. Apple has a proposition that /may/ be better than the current alternatives.

replies(1): >>42073046 #
lukev ◴[] No.42073046[source]
If Apple is doing what they say they are, it is in fact better. No maybe about it.

If they’re not, that means they are acting in bad faith and intentionally deceiving the security community they are publicly inviting to audit it.

Is that something you actually think is happening? I think we need to be clear here.

Your threat model may or may not be covered by the guarantees they are able to document, but just saying “well maybe they’re still doing some unspecified nefarious thing” is not contributing to the discussion.

Especially when none of the alternatives are even trying.

replies(1): >>42073177 #
1. talldayo ◴[] No.42073177[source]
Honestly, I think this is a disingenuous defense. It's not insane to look at a closed-source project that is being partially audited by cherry-picked organizations and say "that's not a very secure or trustworthy process". There is no meaningful accountability being offered to the community. It's like Ford selecting private safety inspectors to tell customers how great their safety is while conveniently leaving out any of the results from their federally mandated crash tests. Is this really helping customers, or is it just blatant, masturbatory marketing?

Apple has worked to deceive the public before, in both small and large ways. They lied about backdooring notifications for the US government when asked to[0], so it's not hard to imagine it happening elsewhere in their systems. They're not taking a traditional approach to software transparency, which is suspicious, and by their own admission their "threat model" has not protected against motivated requests for identifying information[1].

When the Mechanical Turk attempted to fool commoners watching it work, it was imperative to hide every trace of the human inside. The candle used to see inside the machine was masked by smoke from candles placed around the room, the cabinet was locked to avoid accidental opening, and people were told not to touch it because it was apparently 'expensive and fragile'. Looks like Apple is the ringleader this time around.

> but just saying “well maybe they’re still doing some unspecified nefarious thing” is not contributing to the discussion.

But Apple is saying the opposite, "well, maybe we're doing the secure thing we described, ask these people we hired", and you're praising them for it. If calling out objective and obvious logical fallacies isn't contributing, then how are we supposed to argue inside the Reality Distortion Field? Do we make believe and assume that Apple's stated preconditions are true, or can we express concerns about the outstanding issues? I don't understand why raising these real-world flaws is somehow out of bounds in conversation. You're allowed to hold Apple to adversarial levels of scrutiny if you take security seriously.

> Especially when none of the alternatives are even trying.

Apple is the largest company in the world and by many metrics (and points of comparison) isn't even doing the bare minimum to manage public trust. Whenever you are shown a whitepaper without the means to validate its contents yourself, you are being fed what tech circles call "marketing". You don't have to feel bad about being tricked, though; it's the same thing that fools investors and overly faithful engineers. Whitepapers are whitepapers, handpicked security audits are handpicked security audits, and code is code. There is no blurring of the lines.

[0] https://arstechnica.com/tech-policy/2023/12/apple-admits-to-...

[1] https://www.apple.com/legal/transparency/us.html

replies(1): >>42073255 #
2. lukev ◴[] No.42073255[source]
I mean, they’re making the code public and inviting external auditors. There’s literally nothing else they can do as a private company. What evidence of their integrity could possibly satisfy you?

And again, the benchmark isn’t “theoretically perfect”, because I agree this isn’t perfect. The benchmark is “other cloud providers”, and Apple is either lying through their teeth or categorically better.

replies(1): >>42073292 #
3. talldayo ◴[] No.42073292[source]
They're making the promises public. The code is being shown to selected individuals deemed suitable, who then tell me by proxy. That's nonsense; show us the code if it's that easy to give to others. Anything else is suspiciously furtive.

> The benchmark is “other cloud providers” and they are either lying through their teeth or categorically better.

Other cloud providers aren't part of PRISM and generally don't receive the same level of attention from world governments. They can afford to resist legal demands from countries they don't respect because they have nothing to lose by denying them access. Apple has not demonstrated the willingness to resist this unlawful coercion, even recently. If they lied about the push notification security spec, what's to stop them from lying about this one too?

replies(3): >>42073356 #>>42073754 #>>42073916 #
4. rwiggins ◴[] No.42073356{3}[source]
> show us the code if it's that easy to give to others

See https://security.apple.com/documentation/private-cloud-compu...

disclosure: work at Apple, opinions absolutely my own.

replies(1): >>42073591 #
5. saagarjha ◴[] No.42073591{4}[source]
The fancy security properties they’re talking about rely on a whole lot of closed source code not included there. Though an Apple intern did “donate” some of it to the public years ago.
6. tharant ◴[] No.42073754{3}[source]
> Other cloud providers aren't part of PRISM and generally don't receive the same level of concern from the world governments.

Um, Microsoft, Google, Meta, Yahoo, YouTube, Skype, and AOL are/were PRISM Service Providers and I’d argue that they all receive(d) equal (+/- 5%) concern and scrutiny from those world governments.

> They can afford to resist legal demands from countries they don't respect because they have nothing to lose from denying them access.

Are you talking about the cloud providers I listed above? From my perspective, those guys all tend to honor the demands of any state that offers a statistically significant percentage of current/potential consumers, regardless of the demand. Perhaps they have some bright spots where they “did the right thing” (like refusing to unlock a device, or refusing to provide access to private data) but by and large they all—including Apple—are subject to the rules of the states within which they operate.

> Apple has not demonstrated that they have the willingness to resist this unlawful coercion, even recently.

Ten years ago, Apple refused FBI demands to unlock the iPhones of various suspects. Four years ago they did the same during the Pensacola Naval Base shooting investigation. I would guess there are plenty of other examples, but I’ve not been watching that stuff much over the past couple of years. Were those instances just cherry-picked for marketing purposes? Maybe, but until someone shows me compelling evidence that Apple is /not/ acting in good faith towards both their consumers and the governments under which they operate, I see no reason to believe that they’re “lying about this one too”.

I do keep a salt-encrusted spoon nearby when reading about these things, but that doesn’t mean I refuse to trust someone who has demonstrated what appears to me to be a good-faith effort to keep my privacy intact. Maybe what Apple is doing with PCC is just security theater; I doubt it, but I also recognize that marketing and technology are often in conflict, so we must always be cautious. But the important thing, both to me and to GP, is that none of the other cloud providers have offered any solution, whether sane and intelligent privacy controls or just snake-oil scams, beyond “encrypt your data before you upload it to the cloud”.
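For what it’s worth, that baseline is easy to sketch with CryptoKit (the function names here are placeholders, not a real upload API); the key never leaves the device, so the provider only ever holds ciphertext:

    import Foundation
    import CryptoKit

    // Encrypt locally before upload; AES-GCM provides confidentiality and
    // integrity. `combined` packs nonce + ciphertext + tag into one blob.
    func encryptForUpload(_ plaintext: Data, key: SymmetricKey) throws -> Data {
        let sealed = try AES.GCM.seal(plaintext, using: key)
        return sealed.combined!  // non-nil with the default 12-byte nonce
    }

    func decryptAfterDownload(_ blob: Data, key: SymmetricKey) throws -> Data {
        let box = try AES.GCM.SealedBox(combined: blob)
        return try AES.GCM.open(box, using: key)
    }

    // The key is generated and kept on-device; never upload it.
    let key = SymmetricKey(size: .bits256)

The catch, and exactly the gap PCC claims to fill, is that the provider can’t compute on ciphertext, so this baseline rules out server-side features like LLM inference entirely.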

7. johnklos ◴[] No.42073916{3}[source]
This has a ring of the same arguments made by flat Earthers. You could offer to take one of them to near space and show them, but then every other one would stop believing that one person; the expectation becomes that unless you can take all of them to near space, you can't "prove" what you're trying to prove.

Your argument isn't far off from saying that Apple will collude with lots of security researchers, and because you're not invited to the party, nobody can prove that you're wrong. Oversimplification, yes, but basically true.