
398 points djoldman | 4 comments
lukev ◴[] No.42071345[source]
There's something missing from this discussion.

What really matters isn't how secure this is on an absolute scale, or how much one can trust Apple.

Rather, we should weigh this against what other cloud providers offer.

The status quo for every other provider is: "this data is just lying around on our servers. The only thing preventing an employee from accessing it is that doing so would violate policy (and might be caught in an internal audit)." Most providers also carve out several cases where they can look at your data for support, debugging, or analytics purposes.

So even though the punchline of "you still need to trust Apple" is technically true, this is qualitatively different, because what would need to occur for Apple to break its promises here is so much more drastic. For another service to leak your data, all it takes is one employee doing something they shouldn't. For Apple, it would require a deliberate compromise of the entire stack, down to the hardware.

That is much harder to pull off and much harder to hide, so Apple's security posture is qualitatively better than Google's, Meta's, or Microsoft's.

If you want to keep your data local and trust no one, fine: then you don't need to trust anyone at all. But presuming you (a) are going to use cloud services and (b) care about privacy, Apple has a compelling value proposition.
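The trust-model difference described above can be sketched in code. This is a toy illustration, not Apple's actual PCC protocol; the log structure and all names here are invented. The idea is that the client refuses to send data unless the server's attested code measurement appears in a public, append-only transparency log, so an insider can't silently swap the server binary: an unlogged build is rejected, and a logged one is visible to auditors.

```python
import hashlib

# Hypothetical transparency log of audited server builds, keyed by the
# SHA-256 measurement of each released binary. (In a real system these
# entries would be signed and append-only.)
TRANSPARENCY_LOG = {
    hashlib.sha256(b"audited-server-build-v1").hexdigest(),
    hashlib.sha256(b"audited-server-build-v2").hexdigest(),
}

def measurement(server_binary: bytes) -> str:
    """Stand-in for a hardware-signed attestation of the running code."""
    return hashlib.sha256(server_binary).hexdigest()

def client_will_send_data(attested_measurement: str) -> bool:
    """Client-side check: only talk to publicly logged builds."""
    return attested_measurement in TRANSPARENCY_LOG

# A logged build is accepted; a silently modified one is not.
assert client_will_send_data(measurement(b"audited-server-build-v1"))
assert not client_will_send_data(measurement(b"backdoored-build"))
```

The point of the sketch: with a conventional provider, the check above doesn't exist, so one rogue employee suffices; here, defeating it requires either forging the hardware attestation or publishing the rogue build where auditors can see it.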

replies(7): >>42072229 #>>42073673 #>>42073693 #>>42074841 #>>42075160 #>>42075432 #>>42078451 #
harry8 ◴[] No.42072229[source]
> Apple has a compelling value proposition.

No. Apple has a proposition that /may/ be better than the current alternatives?

replies(1): >>42073046 #
lukev ◴[] No.42073046[source]
If Apple is doing what they say they are, it is in fact better. No maybe about it.

If they’re not, that means they are actively and intentionally deceiving the public security community they are inviting to audit it.

Is that something you actually think is happening? I think we need to be clear here.

Your threat model may or may not be covered by the guarantees they are able to document, but just saying “well maybe they’re still doing some unspecified nefarious thing” is not contributing to the discussion.

Especially when none of the alternatives are even trying.

replies(2): >>42073177 #>>42075506 #
talldayo ◴[] No.42073177[source]
Honestly, I think this is a disingenuous defense. It's not insane to look at a closed-source project that is being partially audited by cherry-picked organizations and say "that's not a very secure or trustworthy process." There is no real accountability being offered to the community. It's like Ford selecting private safety inspectors to tell customers how safe its cars are while conveniently leaving out the results of its federally mandated crash tests. Is this really helping customers, or is it just blatant, masturbatory marketing?

Apple has worked to deceive the public before, in both small and large ways. They lied about backdooring notifications for the US government when asked to[0], so it's not hard to imagine it happening elsewhere in their systems. They're not taking a traditional approach to software transparency, which is suspicious, and their "threat model" admittedly does not protect against motivated requests for identifying information[1].

When the Mechanical Turk attempted to fool commoners watching it work, it was imperative to hide every trace of the human inside. The candle used to see inside the machine was masked by smoke from candles placed around the room, the cabinet was locked to avoid accidental opening, and people were told not to touch it because it was apparently 'expensive and fragile'. Looks like Apple is the ringleader this time around.

> but just saying “well maybe they’re still doing some unspecified nefarious thing” is not contributing to the discussion.

But Apple is saying the opposite: "well, maybe we are doing the detailed secure thing; ask these people we hired," and you're praising them for it. If calling out obvious logical fallacies isn't a contribution, then how are we supposed to argue inside the Reality Distortion Field? Do we make-believe that Apple's stated preconditions are true, or can we raise concerns about the outstanding issues? I don't understand how these real-world flaws are somehow off-limits in conversation. You're allowed to hold Apple to adversarial levels of scrutiny if you take security seriously.

> Especially when none of the alternatives are even trying.

Apple is the largest company in the world, and by many metrics (and points of comparison) it isn't even doing the bare minimum to manage public trust. Whenever you are shown a whitepaper without the means to validate its contents yourself, you are being fed what tech circles call "marketing." You don't have to feel bad about being tricked, though; it's the same thing that fools investors and overly faithful engineers. Whitepapers are whitepapers, handpicked security audits are handpicked security audits, and code is code. There is no blurring of the lines.

[0] https://arstechnica.com/tech-policy/2023/12/apple-admits-to-...

[1] https://www.apple.com/legal/transparency/us.html

replies(3): >>42073255 #>>42076310 #>>42078391 #
1. avianlyric ◴[] No.42076310[source]
> They lied about backdooring notifications for the US government when they were asked to[0]

That’s a bit much. They were compelled by the U.S. government to deny handing over data. Sure, it’s technically a lie, in the same way that a general stating they “can neither confirm nor deny X” is also likely a lie.

But it’s entirely unreasonable to judge Apple for following legal, mandated instructions issued by the democratically elected government of the nation it operates in. Are you honestly suggesting that companies like Apple should be expected to simply ignore the law whenever you think it’s convenient?

replies(1): >>42078050 #
2. talldayo ◴[] No.42078050[source]
> Are you honestly suggesting that companies like Apple should be expected to simply ignore the law when you think it’s convenient?

No, I am suggesting that none of you know what you're talking about when defending Apple's brand of privacy. We know they can be compelled to lie to us about their compute architecture, so why accept half measures in your security? Because Apple is a good company and deserves the respect???

replies(1): >>42078447 #
3. alsetmusic ◴[] No.42078447[source]
> We know that they can be compelled to lie to us about their compute architecture, so why accept half-measures in your security?

There's a difference between a court order preventing disclosure and one compelling speech. The First Amendment bars compelled speech: they can be forced not to reveal something, but they can't be forced to make false claims.

replies(1): >>42079938 #
4. talldayo ◴[] No.42079938{3}[source]
A distinction without a difference, here. Apple's marketing already promised things they cannot guarantee, and instead of dropping the privacy shtick altogether they deliberately distorted their image to promote sales of Apple devices. The NSA didn't write the lines for them, but they knew Apple wouldn't stop marketing privacy even if the CCP owned the iCloud servers. Lying for marketing purposes is part of Apple's core identity.

Therein lies the problem. If you distort reality to cast a positive light on a service of dubious value, you will only drive out the knowledgeable users. This is how Apple killed FCP, Logic, Aperture, XServe, and Metal, and it's how they've driven out security experts too. Everyone serious about security got out of Dodge years ago; the only people left are sycophants arguing on the merits of whitepapers that cannot be validated. With Apple suing security researchers and neglecting its bug bounty program, it's no wonder we ended up in this situation. Companies like Cellebrite and Grayshift can stock up on exploits because Apple doesn't take security researchers seriously.