Slight pushback on this. The web has been spammed with subpar tutorials for ages now. The kind of Medium "articles" that are nothing more than "getting started" steps + slop, which got popular circa 2017-2019, is imo worse than the listy-boldy-emoji-filled articles that the LLMs come up with. So nothing gained, nothing lost imo. You still have to learn how to skim and get signals quickly.
I'd actually argue that now it's easier to winnow the slop. I can point my cc running in a devcontainer to a "tutorial" or lib / git repo and say something like "implement this as an example covering x and y, success condition is this and that, I want it to work like this, etc.", and come back and see if it works. It's like a litmus test of a tutorial/approach/repo. Can my cc understand it? Then it'll be worth my time looking into it. If it can't, well, find a different one.
I think we're seeing the "low hanging fruit" of slop right now, and there's an overcorrection of attitude against "AI". But I also see that I get more and more workflows working for me, more or less tailored, more or less adapted for me and my uses. That's cool. And it's powered by the same underlying tech.
Now, we can argue that a typical SEO-optimized garbage article is not better, but I feel like a typical person's trust in those was lower on average.
"here's the type of message that the author of this page is trying to convey" is not what most people think is a simple question
It's also not the question I asked. I'm literally trying to parse out what question was asked. That's what makes AI slop so infuriating: it's entirely orthogonal to the information I'm after. Asking Google "what is the flag for preserving metadata using scp" used to get you an SEO article with a misleading title, going on about some third-party program you can download that supposedly does exactly that, without ever actually telling you the answer. That was ridiculous, and I am happy AI has helped reduce the clickbait; now I just get the flag name.
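For the record, the flag that question was after is `-p`, documented in scp's man page: it preserves the source file's modification times, access times, and modes. A quick local sketch (scp also accepts two local paths, which makes the effect easy to see):

```shell
# Give the source file an old, recognizable modification time.
touch -t 202001010000 original.txt

scp -q  original.txt plain.txt       # mtime of plain.txt is reset to "now"
scp -qp original.txt preserved.txt   # -p keeps the 2020 mtime and the modes

# The actual remote use case (user@host and paths are placeholders):
#   scp -p notes.txt user@host:/backup/
```

Combine it with `-r` (`scp -rp`) to copy a directory tree the same way.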
Except that the AI slop Google and Microsoft and DDG use for summaries masks whether or not a result is SEO nonsense. Instead of using excerpts of the page, the AI summary simply suggests that the SEO garbage is answering the question you asked. These bullshit AI summaries make it infinitely harder to parse out what's actually useful. I suppose that's the goal, though: hide that most of the results are low quality and force you to click through to more pages (ad views) to find something relevant. AI slop changes the summaries from "garbage in, garbage out" to simply "garbage out".

Sure, web search companies moved away from direct keyword matching to much more complex "semantics-adjacent" matching algorithms. But we don't have the counterfactual of the keyword-based Google search algorithm from 2000 running on data from 2025, so we can't say whether it's just search getting worse, or the problem simply getting much harder over time and Google failing to keep up with it.
In light of that, I'm much more inclined to believe that it's SEO spam becoming an industry that killed web search instead of companies "nerfing their own search engines".
"SEO" is not some magic, it is "compliance with ranking rules of the search engine". Google wanted to make their lives easier, implemented heuristics ranking slop higher, resulting in two things happening simultaneously: information to slop ratio decreasing AND information getting buried deeper and deeper within SRPs.
> do you have direct evidence that Google actively made search worse?
https://support.google.com/google-ads/answer/10286719?hl=en-... Google is literally rewriting the queries. Not only do results with better ad potential outrank more organic results; it is impossible to instruct the search engine not to show you storefronts even if you tried.
sure. https://www.wheresyoured.at/the-men-who-killed-google/
>These emails — which I encourage you to look up — tell a dramatic story about how Google’s finance and advertising teams, led by Raghavan with the blessing of CEO Sundar Pichai, actively worked to make Google worse to make the company more money. This is what I mean when I talk about the Rot Economy — the illogical, product-destroying mindset that turns the products you love into torturous, frustrating quasi-tools that require you to fight the company’s intentions to get the service you want.
Of course, it's hard to "objectively" prove that they literally made search worse, but it's clear they were fine with stagnating in order to maximize ad revenue.
I see it the same way Tinder works, if you want an analogy for the mentality. There's a point where being "optimal" hurts your bottom line, so you don't actually want a perfect algorithm. Meanwhile, Google can be so bad that even searching directly for a blog post's title sometimes fails to find it.
Yes, in the case of Google:
- They make more money from ads if the organic results are not as good (especially if it's not clear which results are ads)
- They get more impressions if you don't find the answer at the first search and have to try a different query
So the time you're talking about is a window when Google existed, but before they gave up on fighting spam.
If we somehow paid directly for search, then Google's incentives would be to make search good so that we'd be happy customers and come back again, rather than find devious ways to show us more ads.
Most people put up with the current search experience because they'd rather have "free" than "good" and we see this attitude in all sorts of other markets as well, where we pay for cheap products that fail over and over rather than paying once (but more) for something good, or we trade our personal information and privacy for a discount.