
276 points by leonry | 1 comment
Arubis ◴[] No.41889117[source]
Best of luck to the author! My understanding is that anything that makes large file sharing easy and anonymous rapidly gets flooded with CSAM and ends up shutting itself down for the good of all. Would love to see a non-invasive yet effective way to prevent such an incursion.
replies(10): >>41889269 #>>41889987 #>>41890019 #>>41890075 #>>41890376 #>>41890531 #>>41890775 #>>41892233 #>>41893466 #>>41896754 #
jart ◴[] No.41893466[source]
If governments and big tech want to help, they should upload one of their CSAM detection models to Hugging Face so that system administrators can block this material. Ideally I should be able to run a command like `iscsam 123.jpg` and have it print a number like 0.9, meaning 90% confidence that the image is CSAM. No one but them can do it, since there's obviously no legal way for anyone else to train such a model, even though we know that governments have already done it. If they won't give service operators the tools to keep abuse off their communications systems, then operators shouldn't be held accountable for what people do with them.
replies(4): >>41893921 #>>41894046 #>>41894311 #>>41898004 #
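A minimal sketch of what such a command could look like, assuming a hypothetical classifier had been published to Hugging Face; the model id `some-agency/csam-detector` and its `csam` output label are placeholders invented for illustration, not a real model:

```python
#!/usr/bin/env python3
"""iscsam: print a 0..1 confidence that an image is abusive material.

Sketch only -- "some-agency/csam-detector" is a hypothetical model id,
and the "csam" label is an assumed output class of that model.
"""
import sys

from transformers import pipeline


def main() -> None:
    image_path = sys.argv[1]
    # Load the (hypothetical) detection model as a standard image classifier.
    classifier = pipeline("image-classification", model="some-agency/csam-detector")
    results = classifier(image_path)  # list of {"label": ..., "score": ...}
    # Pull out the confidence assigned to the assumed "csam" class.
    score = next((r["score"] for r in results if r["label"] == "csam"), 0.0)
    print(f"{score:.2f}")


if __name__ == "__main__":
    main()
```

With that in place, `iscsam 123.jpg` would print something like `0.90`, which an operator could threshold in an upload pipeline.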
blackoil ◴[] No.41894046[source]
Perpetrators will keep tweaking the image until they get a score of 0.1.
replies(2): >>41894419 #>>41895566 #
amelius ◴[] No.41894419[source]
How about the government running a service where you can ask them to validate an image?

Trying to tweak an image will not work because you will find the police on your doorstep.

replies(2): >>41894533 #>>41895184 #
charrondev ◴[] No.41895184[source]
My understanding is that Microsoft runs such a tool (PhotoDNA) and you can request access to it. As I understand it, you hash an image, send the hash to them, and get back a response.
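Roughly, the flow described above might look like the sketch below. The endpoint URL, request payload, and response field are placeholders for illustration; real PhotoDNA access is gated behind an approval process and uses Microsoft's own perceptual hash, not the SHA-256 stand-in used here.

```python
"""Sketch of a hash-and-check flow against a (hypothetical) matching service."""
import hashlib
import sys

import requests

API_URL = "https://example.invalid/photodna/match"  # placeholder endpoint
API_KEY = "YOUR-API-KEY"                             # issued after approval


def check_image(path: str) -> bool:
    # Compute a hash of the image. A real service would use a perceptual
    # hash (robust to resizing/re-encoding); SHA-256 is only a stand-in.
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()

    # Submit the hash and read back whether it matched known material.
    resp = requests.post(
        API_URL,
        headers={"Ocp-Apim-Subscription-Key": API_KEY},
        json={"hash": digest},
        timeout=10,
    )
    resp.raise_for_status()
    return bool(resp.json().get("IsMatch", False))


if __name__ == "__main__":
    print(check_image(sys.argv[1]))
```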