Have you looked into kinds of mitigations that cryptography offers? I’m not an expert, but I would expect there are ways to balance some degree of anonymity with some degree of human identity verification.
Perhaps there are some experts out there who can comment?
I like the digital signature approach in general, and have argued for it before, but this is the weak link. For photos and video, this might be OK if there's a way to reliably distinguish "photos of real things" from "photos of AI images"; for plain text, you basically need a keystroke-authenticating keyboard on a computer with both internet access and copy and paste functionality securely disabled -- and then you still need an authenticating camera on the user the whole time to make sure they aren't just asking Gemini on their phone and typing its answer in.
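For concreteness, the signing scheme under discussion looks roughly like this. A minimal sketch using Python's stdlib `hmac` as a stand-in for a real asymmetric signature (a production system would use a public-key scheme such as Ed25519; all key names and functions here are illustrative, not any real provenance API):

```python
import hashlib
import hmac

# Stand-in for an asymmetric device key; with a real public-key
# scheme, verifiers would not need access to the secret.
SIGNING_KEY = b"hypothetical-device-secret"

def sign(content: bytes) -> str:
    """Attest that this device produced `content`."""
    return hmac.new(SIGNING_KEY, content, hashlib.sha256).hexdigest()

def verify(content: bytes, signature: str) -> bool:
    """Check the attestation. Note what this proves: which key signed
    the bytes -- not whether a human or an AI produced them."""
    expected = hmac.new(SIGNING_KEY, content, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

photo = b"raw sensor bytes"
sig = sign(photo)
assert verify(photo, sig)
assert not verify(b"tampered bytes", sig)
```

The gap the comment points at lives outside this code: nothing in the signature says how the bytes got in front of the key, which is why the keystroke/camera requirements follow.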
I understand the technical side of the suggestion. The social and practical side is inevitably flawed.
You need some sort of global registry of public keys. Not only does each registrar have to be trusted, but you also need to trust every single real person to both protect their keys and not misuse them.
Leaving aside the complete practical infeasibility: even if you accomplish it, you now have a unique identifier tied to every piece of text. There will inevitably be both legal processes to identify who produced a signed work and data-analysis approaches to deanonymize the public keys.
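The linkage problem is mechanical: every verified work carries the same key fingerprint, so grouping a corpus by fingerprint reconstructs each pseudonym's entire output. A toy sketch (the fingerprints and posts are invented for illustration):

```python
from collections import defaultdict

# Each verified work carries the signer's public-key fingerprint.
# (Hypothetical data.)
posts = [
    {"fingerprint": "ab12", "text": "comment on forum A"},
    {"fingerprint": "cd34", "text": "news article under a byline"},
    {"fingerprint": "ab12", "text": "sensitive post on forum B"},
]

# Step one of deanonymization: link everything signed by the same key.
by_key = defaultdict(list)
for post in posts:
    by_key[post["fingerprint"]].append(post["text"])

# A single real-world identification of key "ab12" (a subpoena, a
# slip-up, a payment record) now exposes every linked work at once.
assert len(by_key["ab12"]) == 2
```

The signature scheme can be cryptographically perfect and this linkage still holds; it is a property of the identifier, not a flaw in the math.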
The end result is pretty clearly that anyone wishing to present material that purports to be human made has to forgo anonymity/pseudonymity. Claiming otherwise is like claiming we can have a secure government backdoor for encryption.
I would also argue that those techniques do greatly reduce privacy and anonymity.
Which is why I say it would destroy privacy/pseudonymity.
> For photos and video, this might be OK if there's a way to reliably distinguish "photos of real things" from "photos of AI images";
I suspect if you think about it, many of the issues with text also apply to images and videos.
You'd need a secure enclave. You'd need a chain of signatures and images to allow human editing. You'd need a way of revoking the public keys of not just insecure software but bad actors. You'd need verified devices to prevent AI tooling from driving the editing software... etc.
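The "chain of signatures" requirement alone is nontrivial: every edit would have to reference the hash of the previous state and be signed by a non-revoked editor key. A sketch of the shape of that chain (again using stdlib `hmac` as a stand-in for real asymmetric signatures, with an invented key and revocation list):

```python
import hashlib
import hmac

EDITOR_KEY = b"approved-editor-key"   # stand-in for an asymmetric key
REVOKED = {"old-compromised-key-id"}  # hypothetical revocation list

def link_edit(prev_hash: str, edit: bytes, key_id: str) -> dict:
    """Append one signed edit to the provenance chain."""
    if key_id in REVOKED:
        raise ValueError(f"key {key_id} has been revoked")
    payload = prev_hash.encode() + edit
    return {
        "prev": prev_hash,  # links back to the prior state
        "hash": hashlib.sha256(payload).hexdigest(),
        "sig": hmac.new(EDITOR_KEY, payload, hashlib.sha256).hexdigest(),
        "key_id": key_id,
    }

original = hashlib.sha256(b"raw camera capture").hexdigest()
step1 = link_edit(original, b"crop", "editor-1")
step2 = link_edit(step1["hash"], b"color correction", "editor-1")
assert step2["prev"] == step1["hash"]  # chain is intact
```

Every box in this sketch hides a hard sub-problem: who issues editor keys, who maintains the revocation list, and what stops a verified device from signing AI output fed to it.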
These are only the flaws I can think of in like 5 minutes. You've created a huge incentive to break an incredibly complex system. I'm comfortable saying the end result is a complete lack of privacy for most people, while those with power/knowledge would still be able to circumvent it.