everdrive ◴[] No.44544268[source]
Others have said this, I'm sure, but this will move past porn _quickly_. Once there is agreed-upon age verification for pornography, much of the professional internet will require identity verification to do _anything_. This is one of the bigger nails in the coffin for the free internet, and this is true whether or not you're happy with all the pornography out there.
replies(8): >>44544359 #>>44544369 #>>44544497 #>>44545175 #>>44545690 #>>44550491 #>>44550525 #>>44550534 #
nikanj ◴[] No.44544369[source]
And honestly, with the advent of AI spam everywhere, I'd be quite happy to visit a version of the internet where everyone is a certified real person.
replies(6): >>44544510 #>>44545802 #>>44545960 #>>44546641 #>>44547850 #>>44559850 #
1. positron26 ◴[] No.44546641[source]
GAN-style training is only going to get cheaper and easier. Detection will collapse to noise. Any ID runes will be mishandled and the abuse will fly under the radar. Only the space of problems where AI fundamentally can't be used, such as being at a live event, will be meaningfully resistant to AI.
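
To make the GAN-style point concrete, here is a minimal adversarial-training sketch (PyTorch; the model sizes, data, and names are assumptions for illustration, not anyone's actual system). The generator is optimized against the detector itself, so every improvement in detection immediately becomes training signal for better fakes, which is why the detector's verdict tends toward a coin flip:

    import torch
    import torch.nn as nn

    dim = 16
    # Toy "generator" (produces fakes) and "detector" (tries to spot them).
    gen = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, dim))
    det = nn.Sequential(nn.Linear(dim, 32), nn.ReLU(), nn.Linear(32, 1))

    opt_g = torch.optim.Adam(gen.parameters(), lr=1e-3)
    opt_d = torch.optim.Adam(det.parameters(), lr=1e-3)
    loss_fn = nn.BCEWithLogitsLoss()

    real = torch.randn(512, dim) + 2.0  # stand-in for "real human" data

    for step in range(2000):
        noise = torch.randn(512, 8)
        fake = gen(noise)

        # Detector step: push real toward label 1, fakes toward label 0.
        d_loss = loss_fn(det(real), torch.ones(512, 1)) + \
                 loss_fn(det(fake.detach()), torch.zeros(512, 1))
        opt_d.zero_grad(); d_loss.backward(); opt_d.step()

        # Generator step: update the generator to make the detector say 1 on fakes.
        g_loss = loss_fn(det(gen(noise)), torch.ones(512, 1))
        opt_g.zero_grad(); g_loss.backward(); opt_g.step()

    # As the generator improves, the detector's output on fakes drifts toward 0.5,
    # i.e. detection collapses to noise.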

Another way for it all to unfold is that maybe 98% of online discourse turns out to be useless in a few years. Maybe it's useless today, but we just haven't had the tools to make that obvious, by both generating it and detecting it. Instead of AI filtering to weed out AI, a more likely outcome is AI filtering to weed out bad humans and our own worst contributions. Filter out the incessant retorting from keyboard warriors. Analyze for obviously inconsistent deduction. Treat logical and factual mistakes like typos. Maybe AI takes us to a world where humans give up on 97% of it, and only 1% of what is useless today still gets through. The internet's top 2% is a different internet. It is the only internet that will be valuable as training data to identify and replace that remaining 1% and converge onto the spaces that AI can't touch.

People will have to search for interactions that can't be imitated and have enough value to make it through the filters. We will have to literally touch grass. All the time. Interactions that don't affect the grass we touch will vanish from whatever space of social media and web 2.0 services still has any reason to operate at all. The heat death of the internet has a blast radius, and much of what humans occupy themselves with will turn out to be within that blast radius.

A lot of people will, by definition, be disappointed that the middle standard deviation of thought on any topic no longer adds anything. At least at first. There used to be a time when anyone you heard on the radio had to be somewhat better than average just to get on the air. We will return to that kind of media, because the value of having no expertise or first-hand experience will drop to such an immeasurable low that those voices either stop participating or never appear to anyone using filters. Entire swaths of completely replaceable, completely redundant online "community" will just wither to dust, giving us time to touch the grass, hone our 2%, and make sense of others' 2%.

Callers on radio shows used to be interesting because they gave people a tiny window into how wildly incorrect and unintelligent some people are. Pre-internet media was dominated by people who were likely slightly above average. Radio callers were something like misery porn, or regular-people porn. You could sometimes hear someone with such an awful take that it made you realize you were not in the bottom 10%. The internet has given us radio callers, all the time, all of them. They flooded Twitter, Reddit, Facebook. They trend and upvote themselves. They make YouTube channels where they talk into cameras of higher quality than commercial rigs from 2005. There is a GDP of stupidity that never existed before, except as the novelty object of a more legitimate channel. When we "democratized" media, we weren't exclusively letting in thoughts and opinions of higher quality than the "mainstream".

The frightening conclusion is possibly that we are living in a kind of heat death now. It's not the AIs that are scary. It's the humans we have platformed. The bait posts on Instagram will be out-competed. Low-quality hot takes will be out-competed. Repetitive and useless comments on text forums will be out-competed. Advertising revenue, which depends on the idea that you are engaging with someone who will actually care about your product, will be completely disrupted. The entire machine that creates, monetizes, and foments utterly useless information flows in order to harness some of their energy will be wrecked, made redundant, shut down.

Right now, people are correct that today's AI is on an adoption curve that will bring more AI spam unless tomorrow's AI is poised to filter out not just spam but a great mass of low-value human-created content. However, when we move to suppress "low-quality slop," we will increasingly be filtering out low-quality humans. And when the slop is made higher quality so that it flies under the radar, it will increasingly be replacing and out-competing the low-quality content of low-quality humans. What remains will have a very high deductive consistency. Anything that can be polished to a point will be. Only new information outside the reach of the AI, and images of distant stars, will be beyond the grasp of this convergence.

All of this is to say that the version of the internet where AI is the primary nexus of interaction, via inbound and outbound filtering and generation, might be the good internet we think we can only get by enacting some totalitarian ID scheme to fight slop, slop that is currently just replacing what the bottom 10% of the internet readily consumes anyway.