295 points djoldman | 77 comments
1. solarkraft ◴[] No.42063965[source]
Sibling comments point out (and I believe, corrections are welcome) that all that theater is still no protection against Apple themselves, should they want to subvert the system in an organized way. They’re still fully in control. There is, for example, as far as I understand it, still plenty of attack surface for them to run different software than they say they do.

What they are doing by this is of course to make any kind of subversion a hell of a lot harder and I welcome that. It serves as a strong signal that they want to protect my data and I welcome that. To me this definitely makes them the most trusted AI vendor at the moment by far.

replies(10): >>42064235 #>>42064286 #>>42064293 #>>42064535 #>>42064716 #>>42066343 #>>42066619 #>>42067410 #>>42068246 #>>42069486 #
2. patmorgan23 ◴[] No.42064235[source]
Yep. If you don't trust Apple with your data, don't buy a device that runs Apple's operating system.
replies(4): >>42064785 #>>42066409 #>>42066447 #>>42070247 #
3. tw04 ◴[] No.42064286[source]
As soon as you start going down the rabbit hole of state sponsored supply chain alteration, you might as well just stop the conversation. There's literally NOTHING you can do to stop that specific attack vector.

History has shown, at least to date, Apple has been a good steward. They're as good a vendor to trust as anyone. Given a huge portion of their brand has been built on "we don't spy on you" - the second they do they lose all credibility, so they have a financial incentive to keep protecting your data.

replies(8): >>42065378 #>>42065849 #>>42065988 #>>42066649 #>>42067097 #>>42067858 #>>42068698 #>>42069588 #
4. chadsix ◴[] No.42064293[source]
Exactly. You can only trust yourself [1] and should self host.

[1] https://www.youtube.com/watch?v=g_JyDvBbZ6Q

replies(2): >>42064538 #>>42065335 #
5. stavros ◴[] No.42064535[source]
> that all that theater is still no protection against Apple themselves

There is such a thing as threat modeling. The fact that your model only stops some threats, and not all threats, doesn't mean that it's theater.

replies(1): >>42067834 #
6. 9dev ◴[] No.42064538[source]
That is an answer for an incredibly tiny fraction of the population. I'm not so much concerned about myself as about society in general, and self-hosting just is not a viable solution to the problem at hand.
replies(2): >>42064839 #>>42065533 #
7. halJordan ◴[] No.42064716[source]
It's not that they couldn't, it's that they couldn't without a watcher knowing. And frankly this tradeoff is not new, nor is it unacceptable in anything other than "Muh Apple".
8. yndoendo ◴[] No.42064785[source]
That is good in theory. In reality, anyone you engage with who uses an Apple device has leaked your content / information to Apple. High confidence that Apple could easily build profiles of people who do not use its devices via this indirect channel of having to communicate with Apple device owners.

That statement above also applies to Google. There is no way to prevent indirect data sharing with Apple or Google.

replies(3): >>42065011 #>>42065466 #>>42065965 #
9. chadsix ◴[] No.42064839{3}[source]
To be fair, it's much easier than one might imagine (try ollama on macOS, for example). In the end, Apple wrote a lot of long-winded text, but the summary is "you have to trust us."

I don't trust Apple - in fact, even the people we trust the most have told us soft lies here and there. Trust is a concept like an integral - you can only get to "almost" and almost is 0.

So you can only trust yourself. Period.

replies(4): >>42065107 #>>42065178 #>>42065410 #>>42069651 #
10. hnaccount_rng ◴[] No.42065011{3}[source]
Yes, if your threat model includes the provider of your operating system, then you cannot win. It's really that simple. You fundamentally need to trust your operating system, because it can just lie to you.
replies(2): >>42067159 #>>42067843 #
11. dotancohen ◴[] No.42065107{4}[source]
I don't even trust myself; I know that I'm going to mess up at some point or another.
12. lukev ◴[] No.42065178{4}[source]
The odds that I make a mistake in my security configuration are much higher than the odds that Apple is maliciously backdooring themselves.

The PCC model doesn't guarantee they can't backdoor themselves, but it does make it more difficult for them.

replies(1): >>42070748 #
13. remram ◴[] No.42065335[source]
Can you trust the hardware?
replies(2): >>42065435 #>>42066950 #
14. talldayo ◴[] No.42065378[source]
...in certain places: https://support.apple.com/en-us/111754

Just make absolutely sure you trust your government when using an iDevice.

replies(2): >>42066085 #>>42069163 #
15. killjoywashere ◴[] No.42065410{4}[source]
There are multiple threat models where you can't trust yourself.

Your future self definitely can't trust your past self. And vice versa. If your future self has a stroke tomorrow, did your past self remember to write a living will? And renew it regularly? Will your future self remember that password? What if the kid pukes on the carpet before your past self writes it down?

Your current self is not statistically reliable. Andrej Karpathy administered an ImageNet challenge to himself, his brain as the machine: he got about 95%.

I'm sure there are other classes of self-failure.

replies(1): >>42067431 #
16. killjoywashere ◴[] No.42065435{3}[source]
There's a niche industry that works on that problem: looking for evidence of tampering down to the semiconductor level.
replies(1): >>42065691 #
17. dialup_sounds ◴[] No.42065466{3}[source]
Define "content / information".
18. talldayo ◴[] No.42065533{3}[source]
Nobody promised you that real solutions would work for everyone. Performing CPR to save a life is something "an incredibly tiny fraction of the population" is trained on, but it does work when circumstances call for it.

It sucks, but what are you going to do for society? Tell them all to sell their iPhones, punk out the NSA like you're Snowden incarnate? Sometimes saving yourself is the only option, unfortunately.

replies(1): >>42067377 #
19. sourcepluck ◴[] No.42065691{4}[source]
Notably https://www.bunniestudios.com/blog/2020/introducing-precurso...
20. afh1 ◴[] No.42065849[source]
> There's literally NOTHING you can do to stop that specific attack vector.

E2E. Might not be applicable for remote execution of AI payloads, but it is applicable for most everything else, from messaging to storage.

Even if the client hardware and/or software is also an actor in your threat model, that can be eliminated, or at least mitigated, with at least one verifiably trusted piece of equipment (e.g. an external network filter). Open hardware is an alternative, and some states build their entire hardware stack to eliminate such threats.
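For storage, the E2E idea reduces to this: the client encrypts and authenticates before upload, so the server only ever stores opaque blobs. A toy sketch of that shape (the SHA-256 counter-mode keystream here is for illustration only, NOT a vetted cipher; real systems use AES-GCM or XChaCha20-Poly1305):

```python
import hashlib, hmac, secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy SHA-256-in-counter-mode keystream -- illustration only.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()  # detect tampering
    return nonce + ct + tag

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    if not hmac.compare_digest(tag, hmac.new(key, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("tampered ciphertext")
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

key = secrets.token_bytes(32)            # never leaves the client
blob = encrypt(key, b"my private file")  # this is all the server ever stores
assert decrypt(key, blob) == b"my private file"
```

The provider in this model holds ciphertext and a MAC it cannot forge or open; only key management remains the client's problem.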

replies(1): >>42066004 #
21. afh1 ◴[] No.42065965{3}[source]
Depending on your social circle such exposure is not so hard to avoid. Maybe you cannot avoid it entirely but it may be low enough that it doesn't matter. I have older relatives with basically zero online presence.
22. ferbivore ◴[] No.42065988[source]
Apple have name/address/credit-card/IMEI/IMSI tuples stored for every single Apple device. iMessage and FaceTime leak numbers, so they know who you talk to. They have real-time location data. They get constant pings when you do anything on your device. Their applications bypass firewalls and VPNs. If you don't opt out, they have full unencrypted device backups, chat logs, photos and files. They made a big fuss about protecting you from Facebook and Google, then built their own targeted ad network. Opting out of all tracking doesn't really do that. And even if you trust them despite all of this, they've repeatedly failed to protect users even from external threats. The endless parade of iMessage zero-click exploits was ridiculous and preventable, CKV only shipped this year and isn't even on by default, and so on.

Apple have never been punished by the market for any of these things. The idea that they will "lose credibility" if they livestream your AI interactions to the NSA is ridiculous.

replies(4): >>42069201 #>>42069206 #>>42069568 #>>42070213 #
23. warkdarrior ◴[] No.42066004{3}[source]
E2E does not protect metadata, at least not without significant overheads and system redesigns. And metadata is as important as data in messaging and storage.
replies(1): >>42066083 #
24. afh1 ◴[] No.42066083{4}[source]
> And metadata is as important as data in messaging and storage.

Is it? I guess this really depends. For E2E storage (e.g. as offered by Proton with openpgpjs), what metadata would be of concern? File size? File type cannot be inferred, and file names could be encrypted if that's a threat in your model.

replies(2): >>42066793 #>>42067070 #
25. jayrot ◴[] No.42066085{3}[source]
>Just make absolutely sure you trust your government

This sentence stings right now. :-(

26. isodev ◴[] No.42066343[source]
Indeed, the attestation process, as described by the article, is geared more towards preventing unauthorized exfiltration of information or injection of malicious code. However, "authorized" activities are fully supported, where "authorized" means signed by Apple. So, ultimately, users need to trust that Apple is doing the right thing, just like with any other company. And yes, it means they can be forced (by law) not to do the right thing.
27. isodev ◴[] No.42066409[source]
That really is not a valid argument, since Apple has grown to be "the phone".

Also, many are unaware of, or unable to determine, who or what will own their data before purchasing a device. One only accepts the privacy policy after one taps sign in... and is it really practical to expect people to do this by themselves when buying a phone? That's why regulation needs to step in and enforce that the right decisions are present by default.

28. mossTechnician ◴[] No.42066447[source]
But if you don't trust Google with your data, you can buy a device that runs Google's operating system, from Google, and flash somebody else's operating system onto it.

Or, if you prefer, you can just look at Google's code and verify that the operating system you put on your phone is made with the code you looked at.

29. natch ◴[] No.42066619[source]
You're getting taken in by a misdirection.

>for them to run different software than they say they do.

They don't even need to do that. They don't need to do anything different than they say.

They already are saying only that the data is kept private from <insert very limited subset of relevant people here>.

That opens the door wide for them to share the data with anyone outside of that very limited subset. You just have to read what they say, and also read between the lines. They aren't going to say who they share with, apparently, but they are going to carefully craft what they say so that some people get misdirected.

replies(1): >>42070767 #
30. natch ◴[] No.42066649[source]
As to the trust loss, we seem to be already past that. It seems to me they are now in the stage of faking it.
31. mbauman ◴[] No.42066793{5}[source]
The most valuable "metadata" in this context is typically with whom you're communicating/collaborating and when and from where. It's so valuable it should just be called data.
replies(1): >>42066915 #
32. fsflover ◴[] No.42066915{6}[source]
How is this relevant to the private cloud storage?
replies(1): >>42067399 #
33. blitzar ◴[] No.42066950{3}[source]
If you make your own silicon, can you trust that the sand hasn't been tampered with to breach your security?
34. ◴[] No.42067070{5}[source]
35. hulitu ◴[] No.42067097[source]
> History has shown, at least to date, Apple has been a good steward.

cough* HW backdoor in iPhone cough*

replies(1): >>42069415 #
36. fsflover ◴[] No.42067159{4}[source]
This is false. With FLOSS and reproducible builds, you can rely on the community for verification.
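The reproducible-builds check is mechanically simple: independent parties build from the same source and compare artifact hashes; any divergence flags a tampered toolchain or binary. A toy sketch of the comparison step (the "build" here is simulated by a copy; real builds pin the toolchain and set SOURCE_DATE_EPOCH):

```shell
set -e
workdir=$(mktemp -d)
printf 'print("hello")\n' > "$workdir/app.py"

# Simulate two independent builds of the same source. A real pipeline
# would compile with a pinned toolchain so outputs are bit-identical.
cp "$workdir/app.py" "$workdir/build1.out"
cp "$workdir/app.py" "$workdir/build2.out"

h1=$(sha256sum "$workdir/build1.out" | cut -d' ' -f1)
h2=$(sha256sum "$workdir/build2.out" | cut -d' ' -f1)

if [ "$h1" = "$h2" ]; then
  echo "REPRODUCIBLE: $h1"
else
  echo "MISMATCH" >&2
  exit 1
fi
```

This is what lets a community vouch for a binary without every user auditing the source themselves: one matching hash from an independent builder is evidence the published binary came from the published code.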
replies(1): >>42070042 #
37. ◴[] No.42067377{4}[source]
38. Jerrrrrrry ◴[] No.42067399{7}[source]
No point in storing data if it is never shared with anyone else.

Whom it is shared with can infer the intent of the data.

replies(1): >>42069525 #
39. 1vuio0pswjnm7 ◴[] No.42067410[source]
"Sibling comments point out (and I believe, corrections are welcome) that all that theater is still no protection against Apple themselves, should they want to subvert the system in an organized way. They're still fully in control."

It stands to reason that that control is a prerequisite for "security".

Apple does not delegate its own "security" to someone else, a "steward". Hmmm.

Yet it expects computer users to delegate control to Apple.

Apple is not alone in this regard. It's common for "Big Tech", "security researchers" and HN commenters to advocate for the computer user to delegate control to someone else.

40. martinsnow ◴[] No.42067431{5}[source]
Given the code quality of projects like Nextcloud, suggestions like this make my head and the table transmogrify into magnets.
41. hulitu ◴[] No.42067834[source]
> The fact that your model only stops some threats, and not all threats, doesn't mean that it's theater.

Well, to be honest, theater is a pretentious word in this context. A better word would be shitshow.

(I never heard of a firewall that claims it filters _some_ packets, or an antivirus that claims it protects against _some_ viruses.)

replies(1): >>42068276 #
42. hulitu ◴[] No.42067843{4}[source]
> You fundamentally need to trust your operating system because it can just lie to you

Trust us, we are liars. /s

43. vlovich123 ◴[] No.42067858[source]
Strictly speaking there's homomorphic encryption. It's still horribly slow and expensive but it literally lets you run compute on untrusted hardware in a way that's mathematically provable.
replies(2): >>42069609 #>>42069845 #
44. derefr ◴[] No.42068246[source]
The "we've given this code to a third party to host and run" part can be a 100% effective stop to any Apple-internal shenanigans. It depends entirely on what the third party is legally obligated to do for them. (Or more specifically, what they're legally obligated to not do for them.)

A simple example of the sort of legal agreement I'm talking about, is a trust. A trust isn't just a legal entity that takes custody of some assets and doles them out to you on a set schedule; it's more specifically a legal entity established by legal contract, and executed by some particular law firm acting as its custodian, that obligates that law firm as executor to provide only a certain "API" for the contract's subjects/beneficiaries to interact with/manage those assets — a more restrictive one than they would have otherwise had a legal right to.

With trusts, this is done because that restrictive API (the "you can't withdraw the assets all at once" part especially) is what makes the trust a trust, legally; and therefore what makes the legal (mostly tax-related) benefits of trusts apply, instead of the trust just being a regular holding company.

But you don't need any particular legal impetus in order to create this kind of "hold onto it and don't listen to me if I ask for it back" contract. You can just... write a contract that has terms like that; and then ask a law firm to execute that contract for you.

Insofar as Apple have engaged with some law firm to in turn engage with a hosting company; where the hosting company has obligations to the law firm to provide a secure environment for the law firm to deploy software images, and to report accurate trusted-compute metrics to the law firm; and where the law firm is legally obligated to get any image-updates Apple hands over to them independently audited, and only accept "justifiable" changes (per some predefined contractual definition of "justifiable") — then I would say that this is a trustworthy arrangement. Just like a trust is a trust-worthy arrangement.

replies(1): >>42069647 #
45. stavros ◴[] No.42068276{3}[source]
Really? Please show me an antivirus that claims that it protects against all viruses. A firewall that filters all packets is a pair of scissors.
46. ◴[] No.42068698[source]
47. spondyl ◴[] No.42069163{3}[source]
When it comes to China, it's not entirely fair to single out Apple here given that non-Chinese companies are not allowed to run their own compute in China directly.

It always has to be operated by a sponsor in the state, who holds encryption keys and does the actual deployments, etc.

The same applies to Azure/AWS/Google Cloud's China regions and any other compute services you might think of.

replies(1): >>42069787 #
48. lurking_swe ◴[] No.42069201{3}[source]
> They made a big fuss about protecting you from Facebook and Google, then built their own targeted ad network.

What kind of targeted advertising am I getting from Apple as a user of their products? Genuinely curious. I'll wait.

The rest of your comment may be factually accurate, but it isn't relevant for "normal" users, only those hyper-aware of their privacy. Don't get me wrong, I appreciate knowing this detail, but you need to also realize that there are degrees to privacy.

replies(1): >>42070119 #
49. Tagbert ◴[] No.42069206{3}[source]
They have not been punished because they have not abused their access to that data.
replies(1): >>42069642 #
50. evgen ◴[] No.42069415{3}[source]
cough bullshit cough

Don't try to be subtle. If you are going to lie, go for a big lie.

51. commandersaki ◴[] No.42069486[source]
> They’re still fully in control. There is, for example, as far as I understand it, still plenty of attack surface for them to run different software than they say they do.

But any such software must be publicly verifiable, otherwise it cannot be deemed secure. That's why they publish each version in a transparency log, which is verified by the client and, more informally, by a public brains trust.
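The transparency-log mechanism is essentially the same Merkle-tree construction used by Certificate Transparency: a client checks that a published release is in the append-only log via a short inclusion proof, without downloading the whole log. A simplified sketch (the real scheme, per RFC 6962, handles odd-sized levels differently and adds consistency proofs):

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def _leaf_level(leaves):
    # Domain-separate leaves from interior nodes, as RFC 6962 does.
    return [h(b"\x00" + leaf) for leaf in leaves]

def merkle_root(leaves) -> bytes:
    level = _leaf_level(leaves)
    while len(level) > 1:
        if len(level) % 2:                  # simplified: duplicate last node
            level.append(level[-1])
        level = [h(b"\x01" + level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

def inclusion_proof(leaves, index):
    # Collect the sibling hash at each level, plus its side.
    level, proof = _leaf_level(leaves), []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index ^ 1
        proof.append((level[sibling], sibling < index))
        level = [h(b"\x01" + level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf, proof, root) -> bool:
    node = h(b"\x00" + leaf)
    for sibling, sibling_is_left in proof:
        node = h(b"\x01" + sibling + node) if sibling_is_left else h(b"\x01" + node + sibling)
    return node == root
```

A client that pins the signed root only needs log2(N) hashes to confirm its release is really in the log, which is what makes silently shipping a different build detectable.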

This is also just a tired take. The same thing could be said about passcodes on their mobile products or full disk encryption keys for the Mac line. There'd be massive loss of goodwill and legal liability if they subverted these technologies that they claim to make their devices secure.

52. fsflover ◴[] No.42069525{8}[source]
Backups?
replies(1): >>42070331 #
53. commandersaki ◴[] No.42069568{3}[source]
> If you don't opt out, they have full unencrypted device backups, chat logs, photos and files.

Also, full disk encryption is opt-in for macOS. But the answer isn't that Apple wants you to be insecure; they probably just want to make it easier for users to recover data if they forget a login password or backup password they set years ago.

> real-time location data

Locations are end to end encrypted.

replies(1): >>42070921 #
54. sunnybeetroot ◴[] No.42069588[source]
Didn’t Edward reveal Apple provides direct access to the NSA for mass surveillance?

> allows officials to collect material including search history, the content of emails, file transfers and live chats

> The program facilitates extensive, in-depth surveillance on live communications and stored information. The law allows for the targeting of any customers of participating firms who live outside the US, or those Americans whose communications include people outside the US.

> It was followed by Yahoo in 2008; Google, Facebook and PalTalk in 2009; YouTube in 2010; Skype and AOL in 2011; and finally Apple, which joined the program in 2012. The program is continuing to expand, with other providers due to come online.

https://www.theguardian.com/world/2013/jun/06/us-tech-giants...

replies(2): >>42069904 #>>42070700 #
55. commandersaki ◴[] No.42069609{3}[source]
Yeah the impetus for PCC was that homomorphic encryption wasn't feasible and this was the best realistic alternative.
56. sunnybeetroot ◴[] No.42069642{4}[source]
Some might call this abuse: https://news.ycombinator.com/item?id=42069588
57. neongreen ◴[] No.42069647[source]
This actually sounds like a very neat idea. Do you know any services / software companies that operate like that?
58. commandersaki ◴[] No.42069651{4}[source]
> "you have to trust us."

You have fundamentally misunderstood PCC.

59. talldayo ◴[] No.42069787{4}[source]
It's entirely fair. Apple had the choice to stop pursuing business in China if they felt it conflicted with values they prioritized. Evidently it doesn't, which should tell you a lot about how accepting Apple is of this behavior worldwide.
replies(2): >>42070703 #>>42070732 #
60. romac ◴[] No.42069845{3}[source]
And they are pushing in that direction: https://machinelearning.apple.com/research/homomorphic-encry...
61. theturtletalks ◴[] No.42069904{3}[source]
Didn't Apple famously refuse the FBI's request to unlock the San Bernardino attacker's iPhone? The FBI ended up hiring an Australian company, which used a Mozilla bug that allowed unlimited passcode guesses without the phone wiping.

If the NSA had that info, why go through the trouble?

replies(2): >>42069938 #>>42072847 #
62. talldayo ◴[] No.42069938{4}[source]
> If the NSA had that info, why go through the trouble?

To defend the optics of a backdoor that they actively rely on?

If Apple and the NSA are in cahoots, it's not hard to imagine them anticipating this kind of event and leveraging it for plausible deniability. I'm not saying this is necessarily what happened, but we'd need more evidence than just the first-party admission of two parties that stand to gain from privacy theater.

63. philjohn ◴[] No.42070042{5}[source]
Not unless your entire stack down to the bare silicon is also FLOSS, and the community is able to verify.

There is a lot of navel gazing in these comments about "the perfect solution", but we all know (or should know) that perfect is the enemy of good enough.

replies(1): >>42070272 #
64. talldayo ◴[] No.42070119{4}[source]
> What kind of targeting advertising am i getting from apple as a user of their products?

https://searchads.apple.com/

https://support.apple.com/guide/iphone/control-how-apple-del...

  In the App Store and Apple News, your search and download history may be used to serve you relevant search ads. In Apple News and Stocks, ads are served based partly on what you read or follow. This includes publishers you’ve enabled notifications for and the type of publishing subscription you have.
replies(1): >>42072261 #
65. threeseed ◴[] No.42070213{3}[source]
It's disingenuous to compare Apple's advertising to Facebook and Google.

Apple does first party advertising for two relatively minuscule apps.

Facebook and Google power the majority of the world's online advertising, have multiple data sharing agreements, widely deployed tracking pixels, allow for browser fingerprinting and are deeply integrated into almost all ecommerce platforms and sites.

66. threeseed ◴[] No.42070247[source]
And if you don't trust Apple with your data, you shouldn't use a phone or the internet at all.

Because as someone who has worked at a few telcos, I can assure you that your phone and triangulated location data are stored, analysed and provided to intelligence agencies. And the same applies to ISPs.

67. threeseed ◴[] No.42070272{6}[source]
We've seen countless examples of relatively minor libraries being exploited which then cause havoc because of a spider web of transitive dependencies.
68. Jerrrrrrry ◴[] No.42070331{9}[source]
Yes, you got me there.

But I feel that, in this context (communication / metadata inference), that is missing the trees for the forest.

69. astrange ◴[] No.42070700{3}[source]
That seemed to be puffery about a database used to store subpoena requests. You have "direct access" to a service if it has a webpage you can submit subpoenas to.
70. musictubes ◴[] No.42070703{5}[source]
You don't have to use iCloud. I believe, but please correct me if I'm wrong, that customers in China can still make encrypted backups on their own computers.

All the pearl clutching about Apple doing business in China is ridiculous. Who would be better off if Apple withdrew from China? Sure, talldayo would sleep better knowing that Apple had passed their purity test, I guess that’s worth a lot right? God knows consumers in China would be much better off without the option to use iPhones or any other Apple devices. Their privacy and security are better protected by domestic phones I’m sure.

Seriously, what exactly is the problem?

71. astrange ◴[] No.42070732{5}[source]
iCloud E2E encryption (advanced data protection) works in China.

There are other, less nefarious reasons for in-country storage laws like this. One is to stop other countries from subpoenaing it.

But it's also so China gets the technical skills from helping you run it.

72. astrange ◴[] No.42070748{5}[source]
You also don't have a security team and Apple does have one.
73. astrange ◴[] No.42070767[source]
They're not doing that because it's obviously illegal. GDPR forbids sharing data with unknown other people.
74. dwaite ◴[] No.42070921{4}[source]
> Also full disk encryption is opt-in for macOS. But the answer isn't that Apple wants you to be insecure, they just probably want to make it easier for their users to recover data if they forget a login password or backup password they set years ago.

"If you have a Mac with Apple silicon or an Apple T2 Security Chip, your data is encrypted automatically."

The non-removable storage is, I believe, encrypted using a key specific to the Secure Enclave, which is cleared on factory reset. APFS does allow for other levels of protection though (such as protecting a significant portion of the system with a key derived from the initial password/passcode, which is only enabled while the screen is unlocked).

replies(1): >>42072248 #
75. commandersaki ◴[] No.42072248{5}[source]
Yeah, it's a bit nuanced. You're correct that encryption is automatic, but the key is unprotected unless you enable FileVault, which is the opt-in bit I was talking about.

So by default it is easy to recover data on a Mac.

76. luqtas ◴[] No.42072261{5}[source]
So, just Stocks, apps and News? Their hardware quality ensnares me with its performance per watt every time I need to flee from civilization and program Java in Nepalese caves for _THREE_ days without plugging into a power outlet. I accept the compromise and their word on my data!

/s

77. tkz1312 ◴[] No.42072847{4}[source]
The FBI already had full access to the unencrypted iCloud backup from a few days prior.