
110 points | jonbaer | 4 comments
utilize1808 No.45073112
I feel this is not a scalable or reliable way to approach this. The better way would be for human creators to apply their own digital signatures to the original pieces they create (specialised chips in the camera, or in software, that inject hidden, verifiable pixel patterns). If a piece of work lacks such a signature, it should be considered AI-generated by default.
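The sign-and-verify flow proposed above can be sketched roughly as follows. This is a hypothetical illustration: a real camera would use an asymmetric signature from a key held in a secure element, and `DEVICE_KEY`, `sign_capture`, and `verify_capture` are invented names. HMAC with a shared secret stands in here only so the sketch runs with the standard library alone.

```python
import hashlib
import hmac

# Hypothetical stand-in for a key burned into the camera's signing chip.
DEVICE_KEY = b"secret-key-in-secure-element"

def sign_capture(pixels: bytes) -> bytes:
    """Produce a tag binding the raw pixel data to the device key."""
    digest = hashlib.sha256(pixels).digest()
    return hmac.new(DEVICE_KEY, digest, hashlib.sha256).digest()

def verify_capture(pixels: bytes, tag: bytes) -> bool:
    """Check the tag; per the proposal, anything unsigned or altered
    would be treated as AI-generated by default."""
    return hmac.compare_digest(sign_capture(pixels), tag)

raw = b"\x00\x01\x02\x03"  # stand-in for raw sensor data
tag = sign_capture(raw)
assert verify_capture(raw, tag)
assert not verify_capture(raw + b"edited", tag)  # any alteration breaks the tag
```

Note the limitation the replies below raise: this binds the bits to the device, but says nothing about what was in front of the lens.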
1. HPsquared No.45073302
Then you just point the special camera at a screen showing the AI content.
2. utilize1808 No.45073754
Sure, but then it will invite more scrutiny, because you are presenting a "capture" of a screen rather than the raw content.
3. HPsquared No.45074633
Come to think of it, a "special camera" could also record focusing distance, zoom, and acceleration/rotation rates. These could be correlated with the image content to detect this kind of replay.
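One way the metadata-correlation idea might look in practice, as a purely hypothetical heuristic (`looks_like_screen_replay` and the threshold are invented for illustration, not taken from any real product): a camera pointed at a screen is focused on one flat plane, so the per-frame focus distance across a handheld burst barely varies, while refocusing across a real 3D scene shows much more spread.

```python
from statistics import pstdev

def looks_like_screen_replay(focus_distances_m: list[float],
                             max_spread_m: float = 0.05) -> bool:
    """Flag a burst whose recorded focus distances are suspiciously
    constant, consistent with filming a flat display at fixed range.
    Threshold is illustrative only."""
    return pstdev(focus_distances_m) < max_spread_m

real_scene = [0.8, 1.6, 3.2, 0.9, 2.4]   # refocusing across a 3D scene
screen     = [0.50, 0.51, 0.50, 0.52, 0.51]  # flat display ~0.5 m away

assert not looks_like_screen_replay(real_scene)
assert looks_like_screen_replay(screen)
```

A real check would combine several such signals (e.g. gyro rotation rates against optical flow in the frames), since any single one is easy to spoof.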
4. tough No.45080018
ROC Camera does exactly this

> Creates a Zero Knowledge (ZK) Proof of the camera sensor data and other metadata

https://roc.camera/