
398 points | djoldman | 1 comment
lukev No.42071345
There's something missing from this discussion.

What really matters isn't how secure this is on an absolute scale, or how much one can trust Apple.

Rather, we should weigh this against what other cloud providers offer.

The status quo for every other provider is: "this data is just lying around on our servers. The only thing preventing an employee from accessing it is that it would be a violation of policy (and might be caught in an internal audit)." Most providers also carve out several cases where they can look at your data for support, debugging, or analytics purposes.

So even though the punchline of "you still need to trust Apple" is technically true, this is qualitatively different, because what would need to happen for Apple to break its promises here is so much more drastic. For another service to leak your data, all it takes is one employee doing something they shouldn't. For Apple, it would require a deliberate compromise of the entire stack at the hardware level.

This is much harder to pull off and more difficult to hide, so Apple's security posture is qualitatively better than Google's, Meta's, or Microsoft's.

If you want to keep your data local and trust no one, sure, fine: then you don't need to trust anyone at all. But presuming you (a) are going to use cloud services and (b) care about privacy, Apple has a compelling value proposition.

mattlondon No.42075432
Citation needed.

I know that a lot of work is being done at the big cloud companies to meet various regulations and compliance requirements: data encrypted at rest and in transit with customer-supplied keys, no unilateral access, end-to-end audit logging, etc.
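The customer-supplied-key setup mentioned here is usually implemented as envelope encryption: a random per-object data key encrypts the data, and the customer's key wraps that data key, so the provider never holds a usable key. A minimal Python sketch of that pattern; the SHA-256 keystream is a toy stand-in for a real cipher such as AES-GCM, and all names are illustrative, not any provider's actual API:

```python
import hashlib
import os

def _keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: XOR data against a SHA-256-derived keystream.
    # Stand-in for a real AEAD cipher (use AES-GCM in practice).
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

def store(customer_kek: bytes, plaintext: bytes):
    """Encrypt with a fresh data key; wrap that key under the customer's key."""
    dek = os.urandom(32)                            # per-object data key
    ciphertext = _keystream_xor(dek, plaintext)
    wrapped_dek = _keystream_xor(customer_kek, dek)
    return wrapped_dek, ciphertext                  # all the provider stores

def retrieve(customer_kek: bytes, wrapped_dek: bytes, ciphertext: bytes) -> bytes:
    """Unwrap the data key with the customer's key, then decrypt."""
    dek = _keystream_xor(customer_kek, wrapped_dek)
    return _keystream_xor(dek, ciphertext)
```

Without the customer's key, unwrapping yields a garbage data key and decryption produces garbage, which is the property these compliance regimes are after.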

You don't win the big hundreds-of-millions government/military/finance/healthcare contracts without these sorts of things. The big companies are not going to ignore those opportunities, and they are obviously putting in the work, with hundreds or thousands of engineers building provable security into their products, from supply chain to hardware to software to customer-support access.

abalone No.42078140
PCC is fundamentally more secure than merely encrypting at rest and auditing access. That still leaves a variety of attack vectors, such as a software bug that leaks data.

Apple is unable to access the data even if subpoenaed, for example, and this is provable via binary audits and clients verifying that they are communicating with an auditable node.
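A rough sketch of what that client-side verification amounts to: the client only sends data to a node whose reported software measurement appears in a public log of auditable builds. The measurement values and log below are hypothetical stand-ins, not Apple's actual transparency log or attestation protocol:

```python
import hashlib

# Hypothetical transparency log: hashes of node software builds that
# third parties have been able to audit. Values are made up.
AUDITED_MEASUREMENTS = {
    hashlib.sha256(b"pcc-node-build-2024.1").hexdigest(),
}

def verify_node(attested_binary: bytes) -> bool:
    """Client-side check: accept a node only if its reported software
    measurement appears in the public log of auditable builds."""
    measurement = hashlib.sha256(attested_binary).hexdigest()
    return measurement in AUDITED_MEASUREMENTS

# A client would refuse to send any request to an unrecognized build,
# so quietly swapping in modified server software breaks the handshake.
```

The real protocol involves hardware-rooted attestation rather than the client hashing the binary itself, but the trust decision reduces to the same set-membership check.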

mattlondon No.42079103
How is that any different, in either direction? Bugs exist in any and all code. Encrypted data can't be decrypted if you don't have the keys.

I don't see that Apple software is any different in that regard (just try using macOS for any length of time, even on Apple silicon, and you run out of fingers counting obvious UI bugs pretty quickly in day-to-day usage). And obviously AWS won't be able to decrypt your data without your keys either.

The people running these huge multi-billion-dollar clouds are not idiots making fundamental errors in security. This is why they pay mega salaries for highly skilled people, offer five-figure bug bounties, etc.: they take this seriously. Would some random VPS be more likely to make errors like this? Sure, but they are not (and are not expected to be) in the same league.

mattlondon No.42088060
And just to confirm: less than 24 hours later, a post here on HN about a whole new batch of critical security bugs in macOS: https://jhftss.github.io/A-New-Era-of-macOS-Sandbox-Escapes/

I would not trust Apple any more or less than any other big-time cloud provider.