
52 points | anigbrowl | 1 comment
kylestanfield No.44407748
Realistic AI video generation will put us firmly in a post-truth era. If those videos were 25% more realistic, they would be indistinguishable from a real interview.

The speed with which you'll be able to create and disseminate propaganda is mind-blowing. Imagine what a three-letter agency could do with its level of resources.

patrakov No.44407935
No, it won't.

Expected reaction: every camera manufacturer will embed chips that hold a private key used to sign and/or watermark photos and videos, thus attesting that the raw footage came from a real camera.

Now it only remains to solve the analog hole problem.
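The attestation scheme described above can be illustrated with a toy signing routine. This is a hypothetical sketch, not any real camera's firmware: it uses a symmetric HMAC with a random in-process key as a stand-in for the asymmetric, hardware-held key (e.g. Ed25519) a real attestation chip would use, and the names `DEVICE_KEY`, `sign_footage`, and `verify_footage` are invented for illustration.

```python
import hashlib
import hmac
import os

# Stand-in for the non-extractable per-device private key; a real
# attestation chip would hold an asymmetric key (e.g. Ed25519) so
# that verifiers never need the secret itself.
DEVICE_KEY = os.urandom(32)

def sign_footage(raw_bytes: bytes) -> bytes:
    """Sign a digest of the raw footage with the device key."""
    digest = hashlib.sha256(raw_bytes).digest()
    return hmac.new(DEVICE_KEY, digest, hashlib.sha256).digest()

def verify_footage(raw_bytes: bytes, signature: bytes) -> bool:
    """Check that the signature matches this exact footage."""
    return hmac.compare_digest(sign_footage(raw_bytes), signature)

real_clip = b"raw sensor frames from the camera"
fake_clip = b"AI-generated frames"

# The signature verifies only against the bytes it was computed over,
# so it cannot be lifted onto a different video.
sig = sign_footage(real_clip)
```

Because the signature covers a hash of the exact bytes, any re-encoding or edit invalidates it, and the analog hole (pointing a genuine camera at a screen playing fake footage) bypasses the scheme entirely, since the re-filmed video is, as far as the sensor knows, real.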

ghushn3 No.44408000
I don't think that's as trivial or invulnerable as you assume. You're positing a key that exists on a device but cannot be extracted from it, and that can sign a high volume of data in a way that binds each signature to its footage, so the signature cannot be transferred to another video.

Now you have another problem: a signature is presumably unique to a device, so you've got the "I can authenticate that this camera took this photo" property. That's great if you're validating a press agency's photographs, but terrible if you're covering a protest or a human-rights violation, since the footage can be traced back to the individual camera.