398 points djoldman | 14 comments
solarkraft ◴[] No.42063965[source]
Sibling comments point out (and I believe them; corrections are welcome) that all that theater is still no protection against Apple themselves, should they want to subvert the system in an organized way. They're still fully in control. There is, for example, as far as I understand it, still plenty of attack surface for them to run different software than they claim to run.

What they are doing here, of course, makes any kind of subversion a hell of a lot harder, and I welcome that. It also serves as a strong signal that they want to protect my data. To me this definitely makes them the most trusted AI vendor at the moment, by far.

replies(13): >>42064235 #>>42064286 #>>42064293 #>>42064535 #>>42064716 #>>42066343 #>>42066619 #>>42067410 #>>42068246 #>>42069486 #>>42073933 #>>42078582 #>>42088020 #
patmorgan23 ◴[] No.42064235[source]
Yep. If you don't trust Apple with your data, don't buy a device that runs Apple's operating system.
replies(4): >>42064785 #>>42066409 #>>42066447 #>>42070247 #
1. yndoendo ◴[] No.42064785[source]
That is good in theory. In reality, anyone you engage with who uses an Apple device has leaked your content/information to Apple. With high confidence, Apple could easily build profiles of people who do not use its devices, simply because those people have to communicate with Apple device owners.

The statement above also applies to Google. There is no way to prevent indirect data sharing with Apple or Google.

replies(3): >>42065011 #>>42065466 #>>42065965 #
2. hnaccount_rng ◴[] No.42065011[source]
Yes, if your threat model includes the provider of your operating system, then you cannot win. It's really that simple. You fundamentally need to trust your operating system, because it can just lie to you.
replies(2): >>42067159 #>>42067843 #
3. dialup_sounds ◴[] No.42065466[source]
Define "content / information".
4. afh1 ◴[] No.42065965[source]
Depending on your social circle, such exposure is not so hard to avoid. Maybe you cannot avoid it entirely, but it may be low enough that it doesn't matter. I have older relatives with basically zero online presence.
5. fsflover ◴[] No.42067159[source]
This is false. With FLOSS and reproducible builds, you can rely on the community for verification.
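To make that concrete, here's a rough sketch of the check (the file name and reference hash are hypothetical placeholders; real projects publish the actual values):

    import hashlib

    def sha256(path, chunk_size=1 << 20):
        # stream the file so large artifacts don't have to fit in memory
        h = hashlib.sha256()
        with open(path, "rb") as f:
            while block := f.read(chunk_size):
                h.update(block)
        return h.hexdigest()

    # hash of the binary you rebuilt yourself from the published source
    local = sha256("my-rebuilt-artifact.bin")  # hypothetical file name
    # hash independently published by other community builders (placeholder value)
    published = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
    print("reproducible" if local == published else "MISMATCH - investigate")

If many independent builders arrive at the same bits, you don't have to trust any single one of them.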
replies(2): >>42070042 #>>42074371 #
6. hulitu ◴[] No.42067843[source]
> You fundamentally need to trust your operating system because it can just lie to you

Trust us, we are liars. /s

7. philjohn ◴[] No.42070042{3}[source]
Not unless your entire stack down to the bare silicon is also FLOSS, and the community is able to verify it.

There is a lot of navel gazing in these comments about "the perfect solution", but we all know (or should know) that perfect is the enemy of good enough.

replies(2): >>42070272 #>>42074972 #
8. threeseed ◴[] No.42070272{4}[source]
We've seen countless examples of relatively minor libraries being exploited which then cause havoc because of a spider web of transitive dependencies.
replies(1): >>42074988 #
9. hnaccount_rng ◴[] No.42074371{3}[source]
You really cannot, first from a practical point of view: does the thing really do what _you_ want it to do? A typical OS is much too complicated to verify this (and no, theorem provers just move the problem).

But also from a theoretical point of view: I'll grant you that the source does what you want it to do (again: unrealistic). Then you still need to verify that the deployed software is the software that builds reproducibly from the source. At the end of the day you do that by getting some string of bits from some safe place and comparing it to a string of bits that your software hands you. That "your software" thing can just lie!

And yes, you can make that more complicated (using crypto to sign things etc.), but that just increases the complexity of the believable lie. If your threat model is "I do not trust my phone manufacturer", then this is enough to defeat verification. In practice that's never the threat model, though.
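A toy illustration of that last point (function names are made up for the example): the comparison step is itself just software, and compromised software can simply report success.

    # an honest verifier compares the hash you computed to a trusted reference
    def honest_verify(local_hash: str, published_hash: str) -> bool:
        return local_hash == published_hash

    # a compromised verifier on the same untrusted stack can just lie
    def lying_verify(local_hash: str, published_hash: str) -> bool:
        return True  # always claims the bits match, whatever they are

From the user's side the two are indistinguishable, unless the check runs on hardware the adversary doesn't control.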

replies(1): >>42074938 #
10. fsflover ◴[] No.42074938{4}[source]
> does what you want it to do

What are you even talking about? We're talking about security, not 100% correctness, which is indeed not achievable. Security as in: the software doesn't contain backdoors. That is much easier to verify, and the very openness of the code deters many attempts at planting one.

Also, trust doesn't have to be 100%, contrary to what Apple trains its gullible users to believe. Openness is definitely not a silver bullet, but it makes backdoors less likely, thus increasing your security.

> you do [verification of reproducible builds] by getting some string of bits from some safe place and comparing it to a string of bits that your software hands you.

Exactly, and here's an example of how to do it reasonably (not perfectly!) well: https://www.qubes-os.org/security/verifying-signatures/
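In rough outline, that Qubes flow is a detached-signature check along these lines (file names here are hypothetical; the real names and key fingerprints are in the linked guide):

    import subprocess

    # import the signing key, which you must first authenticate out of band
    # (e.g., by checking its fingerprint against multiple independent sources)
    subprocess.run(["gpg", "--import", "qubes-master-signing-key.asc"], check=True)

    # verify the detached signature over the downloaded image
    result = subprocess.run(
        ["gpg", "--verify", "Qubes-R4.2.iso.asc", "Qubes-R4.2.iso"],
        capture_output=True, text=True,
    )
    print(result.stderr)  # gpg prints the "Good signature ..." details on stderr
    if result.returncode != 0:
        raise SystemExit("BAD signature - do not use this image")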

Also, please stop with the security nihilism: https://news.ycombinator.com/item?id=27897975

replies(1): >>42090602 #
11. fsflover ◴[] No.42074972{4}[source]
> Not unless your entire stack down to the bare silicon is also FLOSS,

https://news.ycombinator.com/item?id=27897975

12. fsflover ◴[] No.42074988{5}[source]
On Qubes OS (my daily driver), which runs everything in VMs with strong hardware virtualization, you can use minimal operating system templates with a very small number of installed libraries for security-critical actions: https://www.qubes-os.org/doc/templates/minimal/
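For instance, a security-critical qube can be created from a minimal template with something like this (the qube and template names are hypothetical examples):

    import subprocess

    # create an AppVM for sensitive work, backed by a minimal template
    subprocess.run(
        ["qvm-create", "--class", "AppVM",
         "--template", "fedora-41-minimal", "--label", "red", "vault-banking"],
        check=True,
    )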
13. hnaccount_rng ◴[] No.42090602{5}[source]
The XZ backdoor was completely in the open. It only got found because an engineer at Microsoft was far too good at controlling his environment and had too much free time, which he used to track down a 1% performance degradation. So... no, you really cannot verify that there is no backdoor. Not against a well-resourced, patient adversary.

I'm not sure what your links are supposed to prove. I'm neither of the opinion that PCC is useless, nor am I under the misconception that a signature provides a guarantee of non-maliciousness. All I'm saying is that if you include Apple as an adversary in your threat model, you should not trust PCC. Not because it's closed source (or whatever), but simply because you fundamentally cannot trust a hardware and software stack to which Apple controls all the interfaces.

Personally, I don't consider this a useful threat model. But people's situations do vary.

replies(1): >>42093445 #
14. fsflover ◴[] No.42093445{6}[source]
> signature would provide a guarantee of non-maliciousness

Nobody said that. A signature guarantees integrity and authorship.

> no, you really cannot verify that there is no backdoor

Again, nobody said that. I was talking about a lower probability of hiding a backdoor in FLOSS and a higher probability of finding it.

> simply because you fundamentally cannot trust the hardware and software stack

Trust doesn't have to be binary (1 or 0). You can trust but verify.