110 points jonbaer | 1 comment | source
utilize1808 ◴[] No.45073112[source]
I feel this is not the scalable/right way to approach this. The right way would be for human creators to apply their own digital signatures to the original pieces they created (specialised chips in cameras, or software that injects hidden, verifiable pixel patterns). If a piece of work lacks such a signature, it should be considered AI-generated by default.
replies(3): >>45073155 #>>45073302 #>>45073834 #
shkkmo ◴[] No.45073155[source]
That seems like a horrible blow to anonymity and pseudonymity, and it would also empower identity thieves.
replies(3): >>45073244 #>>45073831 #>>45074209 #
utilize1808 ◴[] No.45073244[source]
Not necessarily. It's basically document signing with key pairs, old tech that is known to work. Its purpose is not to identify the individual creators, but to verify that a piece of work was created by a process/device that is not touched by AI.
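For anyone unfamiliar with how key-pair signing works, here is a minimal sketch. The key sizes, primes, and message are toy values chosen so the whole thing runs in plain Python; real systems use vetted libraries (Ed25519, RSA-2048+) rather than anything like this:

```python
import hashlib

# Toy RSA-style key pair from small primes p=61, q=53 (illustrative
# only, trivially breakable): n = 3233, e = 17, d = 2753.
N, E, D = 3233, 17, 2753

def sign(message: bytes) -> int:
    """Sign the hash of a message with the private exponent d."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % N
    return pow(h, D, N)

def verify(message: bytes, signature: int) -> bool:
    """Re-hash and check the signature with the public exponent e."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % N
    return pow(signature, E, N) == h

work = b"original photograph, straight from the sensor"
sig = sign(work)
print(verify(work, sig))              # True: signature matches
print(verify(b"tampered work", sig))  # almost certainly False: the hash changes
```

Note that verification needs only the public half (n, e), which is exactly why the scheme can attest "this came from device X" without the verifier holding any secret.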
replies(2): >>45073863 #>>45076968 #
shkkmo ◴[] No.45076968{3}[source]
> It's basically document signing with key pairs, old tech that is known to work.

I understand the technical side of the suggestion. The social and practical side is inevitably flawed.

You need some sort of global registry of public keys. Not only does each registrar have to be trusted, but you also have to trust every single real person to protect their keys and not misuse them.

Leaving aside the complete practical infeasibility of that: even if you accomplish it, you now have a unique identifier tied to every piece of text. There will inevitably be both legal processes to identify who produced a signed work and data-analysis approaches to deanonymize the public keys.

The end result is pretty clearly that anyone wishing to present material that purports to be human-made has to forgo anonymity/pseudonymity. Claiming otherwise is like claiming we can have a secure government backdoor for encryption.