358 points by ofalkaed | 23 comments

Just curious, and who knows, maybe someone will adopt it or develop something new based on its ideas.
1. JimDabell ◴[] No.45554957[source]
Apple’s scanning system for CSAM. The vast majority of the debate was dominated by how people imagined it worked, which was very different to how it actually worked.

It was an extremely interesting effort where you could tell a huge amount of thought and effort went into making it as privacy-preserving as possible. I’m not convinced it’s a great idea, but it was a substantial improvement over what is in widespread use today and I wanted there to be a reasonable debate on it instead of knee-jerk outrage. But congrats, I guess. All the cloud hosting systems scan what they want anyway, and the one that was actually designed with privacy in mind got screamed out of existence by people who didn’t care to learn the first thing about it.

replies(4): >>45554967 #>>45555004 #>>45556180 #>>45576268 #
2. JoshTriplett ◴[] No.45554967[source]
Good riddance to a system that would have provided precedent for client-side scanning for arbitrary other things, as well as likely false positives.

> I wanted there to be a reasonable debate on it

I'm reminded of a recent hit-piece about Chat Control, in which one of the proponent politicians was quoted as complaining about not having a debate. They didn't actually want a debate, they wanted to not get backlash. They would never have changed their minds, so there's no grounds for a debate.

We need to just keep making it clear the answer is "no", and hopefully strengthen that to "no, and perhaps the massive smoking crater that used to be your political career will serve as a warning to the next person who tries".

replies(2): >>45554977 #>>45555009 #
3. JimDabell ◴[] No.45554977[source]
I don’t think you can accurately describe it as client-side scanning and false positives were not likely. Depending upon how you view it, false positives were either extremely unlikely, or 100% guaranteed for practically everybody. And if you think the latter part is a problem, please read up on it!
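To put rough numbers on the “extremely unlikely” arm (the “100% guaranteed” arm being the decoy “synthetic” vouchers every account would have emitted by design, as I understand the published protocol), here is a back-of-the-envelope sketch. The ~30-match threshold is the figure Apple published; the one-in-a-million per-image false-match rate and the 100,000-photo library are my own ballpark assumptions, not Apple’s parameters:

    from math import lgamma, log, exp

    def binomial_tail(n: int, p: float, threshold: int) -> float:
        """P(at least `threshold` successes in n independent Bernoulli(p) trials).

        The first term is computed in log space so that very small
        probabilities stay representable instead of rounding to zero.
        """
        log_term = (lgamma(n + 1) - lgamma(threshold + 1) - lgamma(n - threshold + 1)
                    + threshold * log(p) + (n - threshold) * log(1.0 - p))
        term = exp(log_term)          # P(X == threshold)
        total = term
        for k in range(threshold, n):
            # P(X == k + 1) = P(X == k) * (n - k) / (k + 1) * p / (1 - p)
            term *= (n - k) / (k + 1) * p / (1.0 - p)
            total += term
            if term < total * 1e-18:  # remaining terms are negligible
                break
        return total

    if __name__ == "__main__":
        # Hypothetical inputs: a 100,000-photo library, a 1-in-a-million per-image
        # false-match rate, and ~30 matches needed before the server learns anything.
        print(f"{binomial_tail(n=100_000, p=1e-6, threshold=30):.3e}")

With those inputs the chance of an innocent account crossing the threshold comes out around 3e-63; Apple’s stated design target was on the order of one falsely flagged account per trillion per year.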

> I'm reminded of a recent hit-piece about Chat Control, in which one of the proponent politicians was quoted as complaining about not having a debate. They didn't actually want a debate, they wanted to not get backlash. They would never have changed their minds, so there's no grounds for a debate.

Right, well I wanted a debate. And Apple changed their minds. So how is it reminding you of that? Neither of those things apply here.

replies(1): >>45555770 #
4. drnick1 ◴[] No.45555004[source]
There is no place for spyware of any kind on my phone. Saying that it is to "protect the children" and "to catch terrorists" does not make it any more acceptable.
replies(1): >>45556793 #
5. btown ◴[] No.45555009[source]
This. No matter how cool the engineering might have been, from the perspective of what surveillance policies it would have (and very possibly did) inspire/set precedent for… Apple was very much creating the Torment Nexus from “Don’t Create the Torment Nexus.”
replies(1): >>45555030 #
6. JimDabell ◴[] No.45555030{3}[source]
> from the perspective of what surveillance policies it would have (and very possibly did) inspire/set precedent for…

I can’t think of a single thing that’s come along since that is even remotely similar. What are you thinking of?

I think it’s actually a horrible system to implement if you want to spy on people. That’s the point of it! If you wanted to spy on people, there are already loads of systems that exist which don’t intentionally make it difficult to do so. Why would you not use one of those models instead? Why would you take inspiration from this one in particular?

replies(2): >>45555213 #>>45555748 #
7. JoshTriplett ◴[] No.45555213{4}[source]
> I can’t think of a single thing that’s come along since that is even remotely similar. What are you thinking of?

Chat Control, and other proposals that advocate backdooring individual client systems.

Clients should serve the user.

replies(1): >>45555362 #
8. JimDabell ◴[] No.45555362{5}[source]
> Chat Control, and other proposals that advocate backdooring individual client systems.

Chat Control is older than Apple’s CSAM scanning and is very different from it.

> Clients should serve the user.

Apple’s system only scanned things that were uploaded to iCloud.

You missed the most important part of my comment:

> I think it’s actually a horrible system to implement if you want to spy on people. That’s the point of it! If you wanted to spy on people, there are already loads of systems that exist which don’t intentionally make it difficult to do so. Why would you not use one of those models instead? Why would you take inspiration from this one in particular?

9. btown ◴[] No.45555748{4}[source]
The problem isn’t the system as implemented; the problem is the very assertion “it is possible to preserve the privacy your constituents want, while running code at scale that can detect Bad Things in every message.”

Once that idea appears, it allows every lobbyist and insider to say “mandate this, we’ll do something like what Apple did but for other types of Bad People” and all of a sudden you have regulations that force messaging systems to make this possible in the name of Freedom.

Remember: if a model can detect CSAM at scale, it can also detect anyone who possesses any politically sensitive image. There are many in politics for whom that level of control is the actual goal.

replies(1): >>45556052 #
10. mixmastamyk ◴[] No.45555770{3}[source]
Forgot about the concept of bugs, have we? How about making Apple vulnerable to demands from every government where they do business?

No thanks. I'll take a hammer to any device in my vicinity that implements police scanning.

replies(1): >>45556021 #
11. JimDabell ◴[] No.45556021{4}[source]
> Forgot about the concept of bugs, have we?

No, but I have a hard time imagining a bug that would meaningfully compromise this kind of system. Can you give an example?

> How about making Apple vulnerable to demands from every government where they do business?

They already are. So are Google, Meta, Microsoft, and all the other giants we all use. And all those other companies are already scanning your stuff. Meta made two million reports in 2024Q4 alone.

replies(1): >>45559124 #
12. JimDabell ◴[] No.45556052{5}[source]
> The problem isn’t the system as implemented

Great!

> the problem is the very assertion “it is possible to preserve the privacy your constituents want, while running code at scale that can detect Bad Things in every message.”

Apple never made that assertion, and the system they designed is incapable of doing that.

> if a model can detect CSAM at scale, it can also detect anyone who possesses any politically sensitive image.

Apple’s system cannot do that. If you change parts of it, sure. But the system they proposed cannot.

To reiterate what I said earlier:

> The vast majority of the debate was dominated by how people imagined it worked, which was very different to how it actually worked.

So far, you are saying that you don’t have a problem with the system Apple designed, and you do have a problem with some other design that Apple didn’t propose, that is significantly different in multiple ways.

Also, what do you mean by “model”? When I used the word “model” it was in the context of using another system as a model. You seem to be using it in the AI sense. You know that’s not how it worked, right?
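To make the distinction concrete, here is a deliberately stripped-down sketch of the shape of a database-driven matcher. SHA-256 stands in for the perceptual hash, and the blinded-database / private-set-intersection layer of the actual proposal is omitted entirely, so treat this as an illustration of the architecture rather than Apple’s protocol:

    import hashlib

    # The only things a system shaped like this can ever flag are images whose
    # fingerprints are already in a curated database of known material.
    KNOWN_HASHES = set()  # would be populated from that database; empty in this sketch

    def image_fingerprint(image_bytes: bytes) -> str:
        # Stand-in for a perceptual hash such as NeuralHash; a real perceptual
        # hash tolerates resizing and re-encoding, which SHA-256 does not.
        return hashlib.sha256(image_bytes).hexdigest()

    def matches_database(image_bytes: bytes) -> bool:
        # Pure set membership: an image whose fingerprint is not already in the
        # database cannot match, no matter what it depicts.
        return image_fingerprint(image_bytes) in KNOWN_HASHES

There is no classifier anywhere in a design like this; flagging “politically sensitive” images would mean changing the system and its database, not reusing it as-is.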

13. eviks ◴[] No.45556180[source]
> The vast majority of the debate was dominated by how people imagined it worked, which was very different to how it actually worked.

But not very different to how it was actually going to work, as you say:

> If you change parts of it, sure.

Now try to reason your way out of the obvious "parts of it will definitely change" knee-jerk.

replies(1): >>45556219 #
14. JimDabell ◴[] No.45556219[source]
I’m not sure I’m understanding you.

Apple designed a system. People guessed at what it did. Their guesses were way off the mark. This poisoned all rational discussion on the topic. If you imagine a system that works differently to Apple’s system, you can complain about that imaginary system all you want, but it won’t be meaningful, it’s just noise.

replies(1): >>45556244 #
15. eviks ◴[] No.45556244{3}[source]
You understand it just fine; you're just trying to pass your fantasy of an immutable, safe future off as rational while painting the obvious objections based on the real world as meaningless noise.
replies(1): >>45557340 #
16. eimrine ◴[] No.45556793[source]
Do you have any phones without spyware?

I believe my retro Nokia S60/S90 phones do not have any spyware. I believe earlier Nokia models like the S40 or the monochrome ones do not even have the ability to spy on me (though RMS considers triangulation to be spyware). I don't believe any products from the duopoly, which don't even give you root access, are free from vendor rootkits of every kind.

replies(2): >>45561477 #>>45562204 #
17. JimDabell ◴[] No.45557340{4}[source]
Your point did not come across, and it still isn’t coming across. I don’t know what you mean by “pass your fantasy of an immutable, safe future off as rational”. You aren’t making sense to me. I absolutely do not “understand it just fine”.
replies(1): >>45557789 #
18. pessimizer ◴[] No.45557789{5}[source]
They would be running safe mandatory scans on your phones for this, yet you seem shocked and angry that anyone would imply that this would lead to safe mandatory scans on your phones for that and the other, and open the door to unsafe mandatory scans for whatever.

If you can't acknowledge this, it puts you in a position where you can't be convincing to people who need you to deflect obvious, well-known criticisms before beginning a discussion. It gives you crazy person or salesman vibes. These are arguments that someone with a serious interest in the technology would be aware of already and should be included as a prerequisite to being taken seriously. Doing this shows that you value other people's time and effort.

replies(1): >>45557913 #
19. JimDabell ◴[] No.45557913{6}[source]
> you seem shocked and angry that anyone would imply that this would lead to safe mandatory scans on your phones for that and the other

Where have I given you that impression? The thing that annoys me is the sensible discussion being drowned out by ignorance.

> If you can't acknowledge this, it puts you in a position where you can't be convincing to people who need you to deflect obvious, well-known criticisms before beginning a discussion.

I cannot parse this, it’s word salad. People who need me to deflect criticisms? What? I genuinely do not understand what you are trying to say here. Maybe just break the sentences up into smaller ones? It feels like you’re trying to say too many things in too few sentences. What people? Why do they need me to deflect criticisms?

20. mixmastamyk ◴[] No.45559124{5}[source]
Imagine harder. Apple has had several high-profile security bugs in the last few years, and their OS is decried here as a buggy mess every release. QA teams went out of fashion.

The onus is on you to prove perfection before ruining lives on hardware they paid for.

Client-side scanning is 100x worse on the vulnerability front, as the tech could be bent to any whim. Importantly, none of what you described is client-side scanning. Even I consider abiding by rules on others’ property fair.

21. drnick1 ◴[] No.45561477{3}[source]
GrapheneOS, LineageOS, and various Linux distributions for phones come to mind.
22. zweifuss ◴[] No.45562204{3}[source]
Silent SMS (Short Message Type 0) has been around since 1996.
23. morshu9001 ◴[] No.45576268[source]
Was the backlash actually what ended this project? As much as I'd like to pat myself on the back, there must've been another reason.