350 points | djoldman | 2 comments
jgalt212 [dead post] No.42064217
[flagged]
wutwutwat No.42064348
Comments like this are extremely common on any Apple post related to photos, and honestly it's pretty sus that you and many others will start complaining about a thing nobody even mentioned, just because it's the thing you're upset about. That's pretty telling, imo, and nobody ever calls it out. I'm going to start calling it out.
replies(2): >>42064503, >>42073623
MaKey No.42064503
What exactly are you calling out?
replies(2): >>42064718, >>42065342
jgalt212 No.42064718
Indeed. What exactly is being called out? That someone is expressing concern about the impossibility of a vendor's claims?
replies(1): >>42070338
wutwutwat No.42070338
Calling out your complaining about child porn scanning when nobody is talking about it and it isn't what the posted link is about. Why bring up and express your dislike of a thing that 1. was never implemented and 2. was conceived to prevent the abuse of children?

People who post things like you did, unprovoked, when nobody is talking about it and it has nothing to do with the post itself, are fucking weird, and I'm tired of seeing it happen with nobody calling out how fucking weird it is. It happens a lot on posts about iCloud, Apple Photos, or AI image generation. Why are you posting about child porn scanning, and expressing a negative view of it, for no reason? Why is that what you're trying to talk about? Why is it on your mind at all? Why do you feel it's ok to post shit like that as if you're not being a fucking creep by doing so? Why do you feel emboldened enough to think you can say or imply shit like that and not catch any shit for it?

replies(1): >>42073286
luuurker No.42073286
The feature was never shipped because of the public backlash. It had problems and could be tricked, which at the very least should make you stop accepting everything Apple says about security and privacy at face value. On top of this, while it was "conceived to prevent the abuse of children", it could easily be used to target someone such as yourself for sharing a meme making fun of the president of your country (or something like that[0]). There's also the fact that Apple has bent over backwards just to stay present in some markets (e.g. China, where Apple banned VPNs[1]). It doesn't take much to understand why these comments pop up on posts about images + Apple's security/privacy.

Since we're calling people out, allow me to call you out:

Wanting your devices to be private and secure, or asking questions about Apple after their f-up, doesn't make you a pedo or a pedo sympathiser. Comments that suggest otherwise can also be a bit "sus" (to use your expression), especially in a place like HN, where users are expected to know a thing or two about tech and should be aware that the "think of the children" excuse - while well-intentioned - is sometimes used to introduce technology that is then misused (e.g. the internet filter in the UK that was supposed to protect children and now blocks sex education material, torrents, etc.).

I'll assume your intentions are good, but it isn't right to assume or imply that people complaining about this stuff are pedos.

[0] https://www.eff.org/deeplinks/2021/08/if-you-build-it-they-w...

[1] https://www.reuters.com/article/technology/apple-says-it-is-...

replies(1): >>42073713
wutwutwat No.42073713
This person specifically mentioned CSAM. They brought it up to complain about it being intrusive. You're defending someone who brought up and complained about child porn detection when nobody was talking about it; you're defending a person who shouldn't be defended, because what they're upset with is companies trying to combat CSAM.

good luck with that

replies(1): >>42074853
luuurker No.42074853
Apple specifically mentioned CSAM when announcing the system. I don't understand why you find it weird that people refer to it as the system that detected CSAM when that's essentially what Apple called it.

The scanning Apple wanted to do was intrusive, had flaws, and could be abused. That's why security researchers, the EFF, etc. spoke out against it. Not long after the announcement, people were sharing "collisions" on GitHub ( https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX/issue... ) showing that false positives would be a problem (any alarm bells?), which forced Apple to say there would be a v2 to fix this (even though they had said v1 was safe and secure).
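To make the collision problem concrete, here's a rough sketch of how this kind of perceptual-hash matching works. It uses the open-source Python imagehash library as a stand-in, since NeuralHash itself isn't public, and the hash value is made up:

    from PIL import Image
    import imagehash

    # Hypothetical database of flagged hashes (in the real system, hashes
    # of known CSAM supplied to the matcher).
    flagged = {imagehash.hex_to_hash("fa5b3c0e0d1f2a4c")}

    def is_flagged(path: str) -> bool:
        # Perceptual hashes map visually similar images to the same value,
        # so matching happens on the hash, never on the image itself.
        return imagehash.phash(Image.open(path)) in flagged

    # An innocent image adversarially tweaked to collide with a database
    # entry returns True here: a false positive with serious consequences.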

On top of ignoring these issues, you seem to be under the impression that the system was only for CSAM detection. It wasn't. The system looked for content, and Apple was going to feed it a CSAM "database" to find that type of content. The problem is that Apple has to follow local laws, and many governments have their own databases of "bad" content to block (and report to the authorities)... and Apple usually complies instead of leaving the market. In China, for example, the state has access to encrypted data because Apple handed over the encryption keys per local law, and censorship-avoidance apps are banned there. Why would this be any different?
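And that's the crux of the database point: nothing in a matcher like the one sketched above knows what its hashes mean. Same stand-in library, hypothetical hash values:

    from PIL import Image
    import imagehash

    def scan_library(photo_paths, hash_database):
        # Content-agnostic: this flags whatever the supplied database says
        # to flag. The code can't tell a CSAM hash list from a censor's
        # list of banned political memes; only the database supplier knows.
        return [p for p in photo_paths
                if imagehash.phash(Image.open(p)) in hash_database]

    # Same code, two very different deployments (made-up hash values):
    csam_db   = {imagehash.hex_to_hash("fa5b3c0e0d1f2a4c")}
    censor_db = {imagehash.hex_to_hash("11aa22bb33cc44dd")}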

If you want to insist that it was just for CSAM and that people criticising Apple are pedos or are against companies combating CSAM, then do it, but do it with the knowledge that the system wasn't just for CSAM, that it could be tricked (and could possibly ruin people's lives), and that it would likely have been abused by governments.