
263 points | josephcsible | 1 comment
mr_windfrog No.46178827
What this incident really shows is the growing gap between how easy it is to create a convincing warning and how costly it is to verify what's actually happening. Hoaxes aren't new, but generative tools make fabrication almost free and massively increase the volume.

The rail operator didn't do anything wrong. After an earthquake and a realistic-looking image, the only responsible action is to treat it as potentially real and inspect the track.

This wasn't catastrophic, but it's a preview of a world where a single person can cheaply trigger high-cost responses. The systems we build will have to adapt, not by ignoring social media reports, but by developing faster, more resilient ways to distinguish signal from noise.

replies(9): >>46178848 >>46178936 >>46179699 >>46180135 >>46180154 >>46180642 >>46180686 >>46181129 >>46185243
1. intended No.46180135
Not a hope.

Most economic value arises from distinguishing signal from noise. All of science is distinguishing signal from noise.

It's valuable because it is hard. It is also slow - often the only way to verify something is to have reports from someone who IS there.

The conflict arises not from verifying the easy things - searching under the illumination of street lights. It's verifying whether you have a weird disease, or whether people are alive in a disaster, or what is actually going on in a distant zone.

Verification is laborious. In essence, the universe is not going to open up its secrets to us unless the effort is put in.

Content generation, on the other hand, is storytelling. It serves other utility functions for consumers - fulfilling emotional needs, for example.

As the ratio of content to information keeps growing - or the ratio of content to verification capacity - we will become increasingly overwhelmed.