
263 points | josephcsible | 1 comment
ChrisMarshallNY No.46178756
I think we’re just getting started with fake images and videos.

I suspect that people will be killed because of outrage over fake stuff. Before the Ukraine invasion, some of the folks in Donbas staged a fake bombing, complete with corpses taken from a morgue (still bearing autopsy scars)[0]. That didn’t require any AI at all.

We can expect videos of unpopular minorities doing horrible things, politicians saying things they never said, and trial evidence made up out of whole cloth.

It’s gonna suck.

[0] https://www.bellingcat.com/news/2022/02/28/exploiting-cadave...

uyzstvqs No.46180873
People were able to make very realistic fakes of just about anything 10-20 years ago using basic tools. Just ask the UFO nuts or the NSFW media enthusiasts. And as you mentioned, staged scenes have been fairly common too, including before the internet.

We can expect more of the same. Random unverified photos and videos should not be trusted: not in 2005, not in 2015, and not today.

I believe this "everything was fine, but it's going to get really bad" narrative is just yet another attempt at regulatory capture, aimed at outlawing open-source AI. This entire fake bridge collapse might very well be a false flag to scare senile regulators.
