One day Google may introduce multiple search rankings, one driven by SEO and another surfacing the genuinely useful stuff. But I'm not holding my breath.
Good thing /etc/hosts has no size limit.
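And since a hosts file just maps hostnames to addresses, pointing a spam domain at a non-routable address makes it effectively unreachable. A minimal sketch (the domain is made up):

    # /etc/hosts -- hypothetical spam domain, pointed at 0.0.0.0
    0.0.0.0  seo-content-farm.example.com
    0.0.0.0  www.seo-content-farm.example.com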
What about trust-based systems? You choose whom you trust and get results they have vetted as not being SEO garbage, like trust rings. Where the ranking algorithm can't do it alone, user-centric feedback might; that could give Google interesting signals beyond the ones it already gets from its standard metrics.
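A minimal sketch of how scoring against a trust ring could work, assuming an entirely invented data model (each user keeps a trust list and flags domains as garbage); none of this is a real search API:

    # Trust-ring scoring sketch: hypothetical data model, invented names.
    from collections import defaultdict

    # Who each user trusts directly.
    trust = {
        "me": {"alice", "bob"},
        "alice": {"carol"},
    }

    # Verdicts: user -> domain -> True if flagged as SEO garbage.
    flagged = defaultdict(dict)
    flagged["alice"]["spam-recipes.example"] = True
    flagged["bob"]["spam-recipes.example"] = True
    flagged["bob"]["useful-blog.example"] = False

    def spam_score(user, domain):
        """Fraction of the user's direct trust ring that flagged the domain."""
        verdicts = [flagged[peer][domain]
                    for peer in trust.get(user, set())
                    if domain in flagged[peer]]
        return sum(verdicts) / len(verdicts) if verdicts else 0.0

    print(spam_score("me", "spam-recipes.example"))  # 1.0 -> demote or hide

Extending it to transitive trust (friends of friends, with decaying weight) would be the obvious next step, at which point it starts to look like PageRank over people instead of links.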
I suspect this is actually one of those fundamentally hard problems: any accumulated reputation becomes an asset that can be bought and sold. For example:
1. Old domain names bought solely for their accumulated SEO rank.
2. Apps on mobile app stores are sold, and updates begin to include shady privacy-invading malware.
3. Old free software projects on various registries (npm etc.) are sold, with the same result as (2).
A recipe is ultimately a list of ingredients, concise instructions, and maybe a picture or two. It should be trivial to train a classifier to detect SEO spam in this context.
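A toy sketch of that classifier, using scikit-learn's TF-IDF features with logistic regression; the two training snippets and their labels below are invented placeholders, and a real model would need a proper labeled corpus:

    # Toy SEO-filler classifier for recipe pages: a sketch, not a tuned model.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    pages = [
        "2 cups flour, 1 egg, 100 g butter. Mix, then bake at 180 C for 25 min.",
        "My grandmother's journey through Tuscany taught me that summer is the "
        "season to reflect on pasta, which brings me to my childhood memories...",
    ]
    labels = [0, 1]  # 0 = concise recipe, 1 = SEO filler

    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                          LogisticRegression())
    model.fit(pages, labels)

    print(model.predict(["Chop onions, fry 5 minutes, add rice and stock."]))

Even a crude feature like the word count before the first ingredient list would probably separate the two classes on its own.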
I think Google doesn't really have an incentive to do this: SEO spam pages typically run ads, often Google's own, plus Google Analytics or Google Tag Manager, all of which benefit Google. Prioritizing better results would work against their bottom line.
Otherwise, it really does seem like a cat-and-mouse game. Another option may be to force SEO content to become indistinguishable from the best content. Is that the current goal?
Maybe it's just because I'm searching for technical stuff, but DDG and Google are both big sources of frustration for me.
DDG thinks I mistype most of my queries and desperately tries to correct my 'mistake', because surely nobody is really searching for documentation about ARM32 bootloaders; they must have mistyped while looking for a webshop that sells 32 different ARMchairs and ARMy boots.
Google understands my input at least half of the time, but uses that power to show me websites that scrape articles and keywords and run GPT over them, or some great new Medium blog post with two paragraphs: someone copying a Wikipedia summary of what ARM is and pasting build instructions from a GitHub README.
I've tried searching github.com itself, but that's just a nice way to find out that most of the data they host is apparently scraped websites, ML training inputs, or dictionaries, and they'll happily show me all 9K forks of the one repo with the highest density of these keywords.
/rant
So, if Google altered their algorithm such that "recipe" content had to be shorter-form in order to perform better in SERPs, how would this change anything? The sites that profit from search traffic would be the ones with their fingers on the pulse of the algorithm, and the resources to instantly alter their content in order to ensure that they continued to rank for the terms that were driving traffic.