
110 points jonbaer | 2 comments
utilize1808 ◴[] No.45073112[source]
I feel this is not the scalable/right way to approach this. The right way would be for human creators to apply their own digital signatures to the original pieces they created (specialised chips on camera/in software to inject hidden pixel patterns that are verifiable). If a piece of work lacks such signature, it should be considered AI-generated by default.
replies(3): >>45073155 #>>45073302 #>>45073834 #
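The scheme utilize1808 proposes — sign at capture time, verify later, treat unsigned work as AI-generated by default — can be sketched in a few lines. A real device would use an asymmetric key held in a secure chip; since the Python standard library has no asymmetric signing, HMAC with a shared key stands in here, and every name (the key, the functions) is illustrative rather than any actual camera API.

```python
import hashlib
import hmac

# Hypothetical per-device key; real hardware would keep a private key
# in a secure element and publish only the public half.
DEVICE_KEY = b"secret-key-inside-camera-chip"

def sign_image(pixels: bytes) -> str:
    """Signature the camera would attach to the file's metadata at capture."""
    digest = hashlib.sha256(pixels).digest()
    return hmac.new(DEVICE_KEY, digest, hashlib.sha256).hexdigest()

def verify_image(pixels: bytes, signature: str) -> bool:
    """Unsigned or tampered content fails and would default to 'AI-generated'."""
    return hmac.compare_digest(sign_image(pixels), signature)

photo = b"\x10\x20\x30\x40"        # stand-in for raw pixel data
sig = sign_image(photo)
print(verify_image(photo, sig))           # original verifies
print(verify_image(photo + b"x", sig))    # any edit breaks the signature
```

Note that this only binds the signature to the exact bytes; the "hidden pixel patterns" in the comment would additionally need the mark to survive re-encoding and cropping, which a plain hash does not.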
shkkmo ◴[] No.45073155[source]
That seems like a horrible blow to anonymity and pseudonymity that would also empower identity thieves.
replies(3): >>45073244 #>>45073831 #>>45074209 #
1. taminka ◴[] No.45073831[source]
there's very likely already some sort of fingerprinting in camera chips, à la printer yellow dot watermarks that uniquely identify a printer and a print job...
replies(1): >>45077007 #
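The yellow-dot analogy is a form of steganographic fingerprinting: the device quietly embeds an identifier in the output itself. A toy least-significant-bit version makes the idea concrete — real printer and camera schemes are undocumented and far more robust, so this is purely illustrative.

```python
# Toy LSB fingerprint: hide a device ID in the low bits of pixel values,
# roughly analogous to printer yellow-dot watermarks. Illustrative only.

def embed_id(pixels: list[int], device_id: int, bits: int = 16) -> list[int]:
    """Overwrite the least-significant bit of the first `bits` pixels."""
    out = pixels[:]
    for i in range(bits):
        out[i] = (out[i] & ~1) | ((device_id >> i) & 1)
    return out

def extract_id(pixels: list[int], bits: int = 16) -> int:
    """Read the ID back from the LSBs of any bit-exact copy."""
    return sum((pixels[i] & 1) << i for i in range(bits))

img = [128, 64, 200, 31] * 8              # fake 32-pixel grayscale image
tagged = embed_id(img, device_id=0xBEEF)
print(hex(extract_id(tagged)))            # the output betrays its source device
```

This is also why shkkmo's reply below lands: a mark this simple survives only because nobody bothers to strip it, and it ties every output to one device.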
2. shkkmo ◴[] No.45077007[source]
The way those work is primarily through a combination of obscurity (most people don't know they exist) and a lack of real financial incentive to break them at scale.

I would also argue that those techniques do greatly reduce privacy and anonymity.