2603 points mattsolle | 2 comments
elmo2you ◴[] No.25076037[source]
Sincerely and without any intention to troll or be sarcastic: I'm puzzled that people are willing buy a computer/OS where (apparently) software can/will fail to launch if some central company server goes down. Maybe I'm just getting this wrong, because I can honestly not quite wrap my head around this. This is such a big no-go, from a systems design point of view.

Even beyond unintentional glitches at Apple, just imagine what this could mean when traffic to this infra is disrupted intentionally (e.g. to any "unfavorable" country). That sounds like a really serious cyber attack vector to me. Equally dangerous if infra inside the USA gets compromised, if that is going to make Apple computers effectively inoperable. Not sure how Apple will shield itself from legal liability in such an event, if things are intentionally designed this way. I seriously doubt that a cleverly crafted TOS/EULA will do it, since the damage might easily extend well beyond just the users in this case.

Again, maybe (and in fact: hopefully) I'm just getting this all wrong. If not, I might know a country or two where this could even warrant a full ban on the sale of Apple computers if no local/national instance of this (apparently crucial) infrastructure operates in that country itself, purely on the argument of national security (and, for a change, a very valid one in this case).

All in all, this appears to be a design fuck-up of monumental proportions. One that might very well deserve serious legal ramifications for Apple.

replies(35): >>25076070 #>>25076108 #>>25076117 #>>25076130 #>>25076131 #>>25076194 #>>25076232 #>>25076348 #>>25076377 #>>25076414 #>>25076421 #>>25076460 #>>25076514 #>>25076630 #>>25076635 #>>25076649 #>>25076707 #>>25076786 #>>25076858 #>>25076908 #>>25076965 #>>25077109 #>>25077171 #>>25077401 #>>25077488 #>>25077655 #>>25077729 #>>25077764 #>>25077960 #>>25078164 #>>25078511 #>>25078513 #>>25079215 #>>25080127 #>>25108729 #
AngusH ◴[] No.25076117[source]
Apple, for some reason, didn't advertise this change very widely, so buying in wasn't precisely an informed decision.

Like so much modern security activity, it doesn't seem to have been fully thought out, nor does the possibility of failure seem to have been considered.

Or maybe such failures were considered and then dismissed? I don't know.

replies(1): >>25077364 #
sroussey ◴[] No.25077364[source]
It times out and the app runs, so the failure mode was considered.

They may move to edge servers instead of centralized datacenters now though...
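
Roughly the fail-open shape I mean, sketched in Swift purely as an illustration (the endpoint, names and 5-second timeout are made up, not Apple's actual implementation):

    import Foundation

    // Hypothetical fail-open check: ask a responder whether the app's
    // ticket has been revoked, but never block the launch on a network
    // error or for longer than a short timeout.
    func shouldAllowLaunch(ticketURL: URL) -> Bool {
        var request = URLRequest(url: ticketURL)
        request.timeoutInterval = 5  // fail open after 5 seconds

        let semaphore = DispatchSemaphore(value: 0)
        var revoked = false

        URLSession.shared.dataTask(with: request) { data, _, error in
            defer { semaphore.signal() }
            // Unreachable responder or garbled reply: fail open.
            guard error == nil, let data = data else { return }
            revoked = (String(data: data, encoding: .utf8) == "revoked")
        }.resume()

        semaphore.wait()
        return !revoked  // only a definitive "revoked" answer blocks the launch
    }

The point being that only a positive "revoked" answer blocks anything; silence or an error lets the app run.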

replies(1): >>25077609 #
eric_h ◴[] No.25077609[source]
> the failure mode was considered

Considered but not tuned. I've never noticed any delay launching or using software that doesn't require an internet connection when I'm not connected to the internet. (I definitely did notice slowdowns today - Zoom in particular, which I tend to quit out of when I'm not using it because I don't trust it one bit, but am compelled to use it for work.)

Seems like Apple was accepting connections for the signature check but was unable to actually service them, leading to the timeout/failover.
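
That would explain the symptom: a refused connection fails almost instantly, while an accepted-but-unserviced one stalls for the full timeout. A toy Swift probe (placeholder URL and timeout, not Apple's actual responder) makes the difference obvious:

    import Foundation

    // Illustrative probe: a server that is fully down fails fast with
    // "cannot connect", whereas one that accepts the TCP connection but
    // never answers makes every check wait out the whole timeout.
    func probe(_ url: URL) {
        var request = URLRequest(url: url)
        request.timeoutInterval = 5

        URLSession.shared.dataTask(with: request) { _, _, error in
            switch (error as? URLError)?.code {
            case .cannotConnectToHost?:
                print("server down: fails fast, launches proceed immediately")
            case .timedOut?:
                print("server hung: every check stalls for the full 5 seconds")
            default:
                print("responder answered (or failed some other way)")
            }
        }.resume()
    }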

I honestly like the idea of signature checks on software: they give me some confidence that the code that is running is the code it claimed to be when it was published/installed, and that it hasn't been manipulated via some other vector.
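
The underlying idea is simple enough. Here's a minimal Swift/CryptoKit sketch of what "the code matches what its publisher signed" means, using a generic Ed25519 detached signature rather than Apple's actual Developer ID / notarization machinery:

    import Foundation
    import CryptoKit

    // Generic sketch: the publisher signs the binary's bytes with their
    // private key; the client re-checks that signature against the
    // publisher's known public key before trusting the code.
    func codeMatchesPublisher(binary: Data,
                              signature: Data,
                              publisherKey: Data) -> Bool {
        guard let publicKey = try? Curve25519.Signing.PublicKey(
                rawRepresentation: publisherKey) else {
            return false  // malformed key: refuse to trust
        }
        // True only if the bytes on disk are exactly what was signed.
        return publicKey.isValidSignature(signature, for: binary)
    }

The hard part isn't the crypto, it's distributing publisher keys and revoking bad ones, which is exactly where some central (or federated) infrastructure creeps back in.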

Whether Apple is the appropriate steward of that system is certainly up for debate, but other companies that run app stores have similar systems and carry similar risk. It doesn't seem obvious to me what a secure, anonymous, performant and federated system to solve such a problem would actually look like.

replies(1): >>25078027 #
1. sroussey ◴[] No.25078027[source]
Until software can be proven not to be malicious, we will be stuck in a trust hierarchy.
replies(1): >>25078050 #
2. eric_h ◴[] No.25078050[source]
Indeed. And that seems to be a bar we are very far from clearing.