
110 points jonbaer | 1 comment
utilize1808 No.45073112
I feel this is not the scalable/right way to approach this. The right way would be for human creators to apply their own digital signatures to the pieces they create (specialised chips in the camera, or in software, that inject verifiable hidden pixel patterns). If a work lacks such a signature, it should be considered AI-generated by default.
replies(3): >>45073155 #>>45073302 #>>45073834 #
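A minimal sketch of what "sign at capture" could mean. Everything here is illustrative: real hardware (e.g. C2PA-style provenance) would use an asymmetric key held in a secure element so verifiers never see the signing key; HMAC-SHA256 is used only as a stdlib stand-in, and the key and pixel data are made up.

```python
import hashlib
import hmac

# Hypothetical device key; on a real camera this would live in a secure
# element and would be an asymmetric private key, not a shared secret.
DEVICE_KEY = b"key-burned-into-camera-hardware"

def sign_capture(pixels: bytes) -> bytes:
    # Sign a digest of the raw sensor data at capture time.
    digest = hashlib.sha256(pixels).digest()
    return hmac.new(DEVICE_KEY, digest, hashlib.sha256).digest()

def verify_capture(pixels: bytes, sig: bytes) -> bool:
    # Constant-time comparison; any edit to the pixels breaks the signature.
    return hmac.compare_digest(sign_capture(pixels), sig)

raw = b"\x10\x20\x30" * 1000          # stand-in for sensor pixel data
sig = sign_capture(raw)
print(verify_capture(raw, sig))            # True: untouched capture verifies
print(verify_capture(raw + b"\x00", sig))  # False: a single-byte edit fails
```

Note this only proves the bytes came from the signing device unmodified; as the reply below points out, it says nothing about *what* the camera was pointed at.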
HPsquared No.45073302
Then you just point the special camera at a screen showing the AI content.
replies(1): >>45073754 #
utilize1808 No.45073754
Sure. But then it will receive more scrutiny because you are showing a "capture" rather than the raw content.
replies(1): >>45074633 #
HPsquared No.45074633
Actually, come to think of it, I suppose a "special camera" could also record things like focusing distance, zoom, and acceleration/rotation rates. These could be correlated with the image itself to detect this kind of spoofing.
replies(1): >>45080018 #
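One toy version of that cross-check, under a pinhole-camera assumption: when the camera translates sideways, objects at different depths shift by different amounts (parallax), but a flat screen showing AI content shifts uniformly no matter what depth the displayed scene *appears* to have. All numbers and thresholds below are illustrative, not from any real camera pipeline.

```python
def parallax_difference(translation_m: float, depth_near_m: float,
                        depth_far_m: float, focal_px: float = 1000.0) -> float:
    # Pinhole model: a point at depth d shifts ~ focal_px * t / d pixels
    # under a sideways translation t. Return the differential shift
    # between a near and a far point.
    return focal_px * translation_m * (1 / depth_near_m - 1 / depth_far_m)

def looks_flat(observed_diff_px: float, expected_diff_px: float,
               tol_px: float = 2.0) -> bool:
    # A flat surface (e.g. a screen) produces ~zero differential parallax
    # even though the recorded IMU translation predicts a large one.
    return abs(observed_diff_px) < tol_px <= abs(expected_diff_px)

# Scene claims objects at 1 m and 10 m; IMU records a 5 cm sideways move.
expected = parallax_difference(0.05, 1.0, 10.0)
print(expected)                    # 45.0 px of differential shift expected
print(looks_flat(0.3, expected))   # True: flat response -> likely re-capture
print(looks_flat(40.0, expected))  # False: real depth, parallax as predicted
```

The same idea extends to the other signals the comment mentions: a fixed focus distance equal to the screen distance, or rotation rates that don't match optical flow, would each be a flag for a verifier.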
tough No.45080018
ROC Camera does exactly this

> Creates a Zero Knowledge (ZK) Proof of the camera sensor data and other metadata

https://roc.camera/