
115 points by miles | 3 comments
nbzso ◴[] No.32463622[source]
After the CSAM announcement, I switched the main office machines to Manjaro. A lot of my colleagues sold their beloved iPhones and moved to de-Googled Android.

Luckily, the fiasco with Little Snitch was resolved, and I can enjoy a Mac for occasional design work. But Figma runs well in the browser, and if I see any attempt to push me into Ad Hell, it is totally over.

Vertical integration is important for Apple, but as a longtime Apple user, this would be a "no-fly zone" for me.

Actually, CSAM had a positive effect on my life. Since then, I use my smartphone less and only for basics: some banking, calls, and chatting.

replies(5): >>32463767 #>>32463842 #>>32464223 #>>32464271 #>>32464340 #
CharlesW ◴[] No.32463767[source]
What does CSAM have to do with this story on how Apple might grow their ad business?
replies(3): >>32463932 #>>32463953 #>>32464544 #
MBCook ◴[] No.32463953[source]
Nothing. This story is an opportunity to Apple bash, so they’re doing it.

Personally, I'm deep in the Apple ecosystem. I don't think I'd leave over this ads idea, but I sure would get pissed off a lot because of it.

replies(2): >>32464079 #>>32464205 #
nbzso ◴[] No.32464205[source]
I still use Apple computers for design work, properly monitored by Little Snitch and some modifications on my router.

We all have opinions. And I have invested a ton of money in Apple products in 20+ years of usage. Constructive criticism is not "Bashing".

CSAM scanning was a blatant attempt to breach user privacy and to classify content against a third-party agency's hidden criteria. We live in a surveillance economy, and ads are the highway for privacy abuse. Apple's argument that "we don't share the user data" has no ground for me.

In the increasingly connected world, with billions of data points, trusting a big tech behemoth is a suicide act. Allegedly. :)

replies(1): >>32464325 #
CharlesW ◴[] No.32464325[source]
> CSAM was a blatant attempt for breaching user privacy and classification with third party agency hidden criteria.

I feel like there's a good amount of FUD about this, so for anyone who might not know: All online file hosts do CSAM matching against known CSAM images, regardless of the client OS(s) you're using. In Apple's case specifically, matching only happens to images you've uploaded to iCloud Photos.¹

¹ https://www.apple.com/child-safety/pdf/Expanded_Protections_...
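For anyone curious how matching against a database of known images works in principle, here is a toy sketch. It uses a simple 64-bit average hash compared by Hamming distance; this is purely illustrative and is not Apple's actual system, which uses a learned perceptual hash (NeuralHash) combined with on-device private set intersection. The function and threshold names are made up for the example.

```python
def average_hash(pixels):
    """Toy 64-bit hash of an 8x8 grayscale image (a list of 64 ints, 0-255).

    Each bit records whether that pixel is at or above the image's mean
    brightness, so small edits to the image flip only a few bits.
    """
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= avg else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of differing bits between two hashes."""
    return bin(h1 ^ h2).count("1")

def matches_known_database(image_hash, known_hashes, threshold=4):
    """Flag an image whose hash is within `threshold` bits of any known hash.

    `threshold` trades off robustness to minor edits (crops, recompression)
    against false positives on unrelated images.
    """
    return any(hamming_distance(image_hash, k) <= threshold
               for k in known_hashes)

# A lightly edited copy of a known image still matches; an unrelated one doesn't.
known = [0] * 32 + [255] * 32          # "known" image
edited = list(known); edited[0] = 255  # one pixel changed
unrelated = [255] * 32 + [0] * 32      # inverted image

db = {average_hash(known)}
print(matches_known_database(average_hash(edited), db))     # True
print(matches_known_database(average_hash(unrelated), db))  # False
```

The key property is that matching is against *known* images only: nothing in this scheme classifies new content, it just measures closeness to hashes already in the database.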

replies(1): >>32464356 #
nbzso ◴[] No.32464356[source]
I trust the experts on this.

Apple's Plan to "Think Different" About Encryption Opens a Backdoor to Your Private Life.

"That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change."

https://www.eff.org/deeplinks/2021/08/apples-plan-think-diff...

replies(2): >>32464871 #>>32467124 #
lttlrck ◴[] No.32464871[source]
The point is that Apple was already reacting to external pressure. They don't make up the rules. They were attempting to be open about it, and also to give users a way to avoid it (don't use iCloud).

It's great that you've found an alternative that suits you but I think it's disingenuous to argue that Apple is the culprit.

Talk to your representative.

simonh ◴[] No.32467124[source]
My next door neighbour owning a sledgehammer is a "fully built system just waiting" to bash my door down, but I don't lose any sleep over it.