354 points qingcharles | source
goblin89 ◴[] No.43748669[source]
Wait until physical camera makers not only license you the unit but also make everything you shoot belong to them, as software camera apps (e.g., Filmic Pro) do now.

DJI can just add some mandatory firmware upgrade process that offloads your footage to the mothership, and 99.9999% of users will agree to everything without reading.

replies(1): >>43748758 #
mitthrowaway2 ◴[] No.43748758[source]
Might be a realistic way for manufacturers to implement a certified-taken-by-camera-not-AI photo feature.
replies(5): >>43748837 #>>43748899 #>>43748942 #>>43749487 #>>43749649 #
LeafItAlone ◴[] No.43748942[source]
>Might be a realistic way for manufacturers to implement a certified-taken-by-camera-not-AI photo feature.

How would that work? Wouldn't any system implementing this necessarily be something that AI tools could replicate?

replies(1): >>43749172 #
maronato ◴[] No.43749172[source]
Using digital signatures. When you take a picture, the device or app creates a signature over the photo data and metadata.

Then you can check the signature using the company’s public keys.

If you edit it, the editing app packages the new metadata, the edited photo data, and the original signature, and signs the result again.

Now you have a chain of “changes” and can inspect and validate its history. It works for video and audio too.

As long as the private keys aren’t leaked, there’ll be no way to fabricate the signatures.

https://c2pa.org/
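
To make the mechanism concrete, here is a minimal sketch of how such a signature chain could work, assuming a Python implementation with the cryptography library and Ed25519 keys. This is not the actual C2PA manifest format; the key names, metadata fields, and helper functions are illustrative.

    # Illustrative sketch, not the real C2PA format: a device key signs the
    # photo bytes plus metadata; an editor later signs the edited bytes
    # together with the original signature, forming a verifiable chain.
    import json
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    def sign_manifest(private_key, image_bytes, metadata, parent_sig=b""):
        # Sign the image data, its metadata, and the previous signature (if any).
        payload = image_bytes + json.dumps(metadata, sort_keys=True).encode() + parent_sig
        return private_key.sign(payload)

    def verify_manifest(public_key, signature, image_bytes, metadata, parent_sig=b""):
        # Recompute the payload and check the signature against the public key.
        payload = image_bytes + json.dumps(metadata, sort_keys=True).encode() + parent_sig
        try:
            public_key.verify(signature, payload)
            return True
        except InvalidSignature:
            return False

    # The camera signs the original capture with its manufacturer-provisioned key.
    camera_key = Ed25519PrivateKey.generate()
    photo = b"...raw image bytes..."
    capture_meta = {"device": "ExampleCam", "time": "2024-01-01T12:00:00Z"}
    capture_sig = sign_manifest(camera_key, photo, capture_meta)

    # An editing app crops the photo and signs the result, embedding the capture signature.
    editor_key = Ed25519PrivateKey.generate()
    edited = b"...cropped image bytes..."
    edit_meta = {"tool": "ExampleEditor", "operation": "crop"}
    edit_sig = sign_manifest(editor_key, edited, edit_meta, parent_sig=capture_sig)

    # A verifier with the public keys can walk the chain back to the original capture.
    assert verify_manifest(editor_key.public_key(), edit_sig, edited, edit_meta, capture_sig)
    assert verify_manifest(camera_key.public_key(), capture_sig, photo, capture_meta)

In practice a verifier would also check that each public key chains to a certificate from a trusted device maker or software vendor, which is what stops an AI tool from simply generating its own keys and signing fabricated images.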

replies(3): >>43749640 #>>43750187 #>>43751338 #
goblin89 ◴[] No.43749640[source]
This has existed for a while, and it does not require licensing your footage to the camera maker.