Correction -- sadly, we're already well within this era
Slight pushback on this. The web has been spammed with subpar tutorials for ages now. The kind of Medium "articles" that are nothing more than "getting started" steps + slop, which got popular circa 2017-2019, is imo worse than the listy-boldy-emoji-filled articles that the LLMs come up with. So nothing gained, nothing lost imo. You still have to learn how to skim and get signals quickly.
I'd actually argue that now it's easier to winnow the slop. I can point my cc running in a devcontainer to a "tutorial" or lib / git repo and say something like "implement this as an example covering x and y, success condition is this and that, I want it to work like this, etc.", and come back and see if it works. It's like a litmus test of a tutorial/approach/repo. Can my cc understand it? Then it'll be worth my time looking into it. If it can't, well, find a different one.
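(A minimal sketch of that litmus test, assuming the devcontainers CLI and Claude Code are installed; the repo URL, prompt wording, and the "make test" success check are illustrative stand-ins, not anything prescribed above.)

    # Clone the candidate tutorial/library and spin up its dev container
    git clone https://github.com/example/candidate-lib.git && cd candidate-lib
    devcontainer up --workspace-folder .
    # Ask Claude Code (non-interactive print mode) to build the example
    devcontainer exec --workspace-folder . \
      claude -p "Following the README, implement an example under examples/ covering X and Y. Success condition: 'make test' passes."
    # Litmus test: if the agent got it working, the repo is worth my time
    devcontainer exec --workspace-folder . make test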
I think we're seeing the "low hanging fruit" of slop right now, and there's an overcorrection of attitude against "AI". But I also see that I get more and more workflows working for me, more or less tailored, more or less adapted for me and my uses. That's cool. And it's powered by the same underlying tech.
The problem is that AI makes all of that far, far easier.
Even using tooling to filter articles doesn't scale as slop grows to be a larger and larger percentage of content, and it means I'm going to have to consider prompt injections and running arbitrary code. All of this is a race to the bottom of suck.
Now, we can argue that a typical SEO-optimized garbage article is no better, but I feel like the average person's trust in those was lower.
"here's the type of message that the author of this page is trying to convey" is not what most people think is a simple question
It's also not the question I asked. I'm literally trying to parse out what question was asked. That's what makes AI slop so infuriating: it's entirely orthogonal to the information I'm after.

Asking Google "what is the flag for preserving metadata using scp" and getting the flag name, instead of an SEO article with a misleading title that goes on about some third-party program you can download that does exactly that and never actually tells you the answer, is ridiculous, and I am happy AI has helped reduce the clickbait.
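(For reference, assuming the commenter means OpenSSH's scp, the flag in question is -p; the file and host names below are just illustrative.)

    # -p preserves modification times, access times, and mode bits of the source
    scp -p notes.txt user@example.com:/home/user/backup/
    # combine with -r to copy a directory tree while keeping each file's metadata
    scp -rp ./photos user@example.com:/home/user/backup/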
Except that the AI slop Google and Microsoft and DDG use for summaries masks whether or not a result is SEO nonsense. Instead of using excerpts of the page, the AI summary simply suggests that the SEO garbage is answering the question you asked. These bullshit AI summaries make it infinitely harder to parse out what's actually useful. I suppose that's the goal though: hide that most of the results are low quality and force you to click through to more pages (ad views) to find something relevant. AI slop changes the summaries from "garbage in, garbage out" to simply "garbage out".

Sure, web search companies moved away from direct keyword matching to much more complex "semantics-adjacent" matching algorithms. But we don't have the counterfactual of the keyword-based Google search algorithm from 2000 running on data from 2025, so we can't say whether it's search itself getting worse, or the problem simply getting much harder over time and Google failing to keep up with it.
In light of that, I'm much more inclined to believe that it's SEO spam becoming an industry that killed web search instead of companies "nerfing their own search engines".
"SEO" is not some magic, it is "compliance with ranking rules of the search engine". Google wanted to make their lives easier, implemented heuristics ranking slop higher, resulting in two things happening simultaneously: information to slop ratio decreasing AND information getting buried deeper and deeper within SRPs.
> do you have direct evidence that Google actively made search worse?
https://support.google.com/google-ads/answer/10286719?hl=en-... Google is literally rewriting the queries. Not only do results with better ad potential outrank more organic results, it is impossible to instruct the search engine not to show you storefronts even if you try.
Spam by its nature is low effort and low yield anyway. Spammers don't particularly mind earning scraps, since their pipeline is nearly automated.
sure. https://www.wheresyoured.at/the-men-who-killed-google/
>These emails — which I encourage you to look up — tell a dramatic story about how Google’s finance and advertising teams, led by Raghavan with the blessing of CEO Sundar Pichai, actively worked to make Google worse to make the company more money. This is what I mean when I talk about the Rot Economy — the illogical, product-destroying mindset that turns the products you love into torturous, frustrating quasi-tools that require you to fight the company’s intentions to get the service you want.
Of course, it's hard to "objectively" prove that they literally made search worse, but it's clear they were fine with stagnating in order to maximize ad revenue.
I see it as the same mentality behind how Tinder works. There's a point where being "optimal" hurts your bottom line, so you don't actually want to achieve a perfect algorithm. Meanwhile, it can get so bad that directly searching Google for a blog post's title sometimes leaves me unsuccessful.
>Less incentive to write small libraries. Less incentive to write small tutorials on your own website.
What weitendorf posted is definitely not a library, nor is there a small tutorial for the code.
>Unless you are a hacker or a spammer where your incentives have probably increased. We are entering the era of cheap spam of everything with little incentive for quality.
Considering the low effort to post and the high effort to understand what weitendorf wrote, he might be considered a spammer in this context. The code quality is also low, since his application could easily be replicated by a bunch of echo calls in a bash script, which makes me lean towards thinking he is a low-quality spammer.
>All this for the best case outcome of most people being made unemployed and rolling the dice on society reorganising to that reality.
I'm not sure you can argue that weitendorf sufficiently addressed this. He put too much emphasis on an obvious strawman ("real programmer"), which is completely out of context. Nobody here is questioning whether someone is a programmer or not. There is no gatekeeping whatsoever. You're free to use LLMs.
I'll also complain about your use of "salient" here, which generally has two meanings. The first is that something is "eye-catching" (which makes me think more of spam); the second is "relevance/importance" to a specific thing, and that's where weitendorf falls completely flat.
Now, you might counter by arguing that he packaged all of his salient points inside the statement "want to lay off bread-and-butter red-blooded American programmers". But then your position is incredibly weak, because you're either deflecting from one strawman to another strawman, or your counterargument relies heavily on reinterpretation, which again just means the point wasn't salient.
Closest thing in the YT space would be Nebula, but Nebula's scope is very narrow (by design).
Yes, in the case of Google:
- They make more money from ads if the organic results are not as good (especially if it's not clear which results are ads)
- They get more impressions if you don't find the answer on the first search and have to try a different query
So the time you're talking about is a window when Google existed, but before they gave up on fighting spam.
The term I like is that AI has _industrialised_ those behaviours. Native Americans hunted buffalo for ages, but it wasn't destructive until the hunt was industrialised [1]; that's when it became truly destructive.
If we somehow paid directly for search, then Google's incentives would be to make search good so that we'd be happy customers and come back again, rather than find devious ways to show us more ads.
Most people put up with the current search experience because they'd rather have "free" than "good" and we see this attitude in all sorts of other markets as well, where we pay for cheap products that fail over and over rather than paying once (but more) for something good, or we trade our personal information and privacy for a discount.
I assume GP's point was that assembly language literacy is a pointless skill nowadays. I found it quite useful, precisely because it's no longer a ubiquitous skill, so you can shine with your expertise in some situations.