
259 points by pseudolus | 1 comment | source
NelsonMinar ◴[] No.42199494[source]
I'm confused about how or why this is a new policy. My memory is that inside Google we were discussing this risk back in 2003, probably earlier. Search quality was on it. I just assumed they'd lost the arms race, or that the parasites' rankings were justified for other reasons that were hard to tease apart. What are they doing now that's new?

I think often about Mahalo, the sleazy shovel-content site that was spamming the web back in 2007. Google shut that down relatively quickly, although it still took several years. These days, with AI and more aggressive spammers, it's a losing battle. The real problem is the financial incentives that make this kind of spamming profitable in the first place.

My tiny little blog gets about three requests a week from people offering to "pay me to run a guest article". The going rate is $50-$200, and again, my blog is tiny.

replies(10): >>42199551 #>>42199854 #>>42200207 #>>42200304 #>>42200373 #>>42200611 #>>42200832 #>>42200911 #>>42201266 #>>42204122 #
nurumaik ◴[] No.42199854[source]
Just manually review top K websites and ban such garbage?

Sometimes a dumb, brute-force, biased solution can work far better than any automation you can come up with.
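
A minimal sketch of what that brute-force pass might look like (the domains, data shapes, and function names here are hypothetical illustrations, not anything from the thread):

    # A manually curated banlist, populated by human reviewers who
    # looked at the top-K sites. No algorithm involved.
    MANUAL_BANLIST = {
        "spam-content-farm.example",
        "parasite-seo.example",
    }

    def apply_manual_bans(ranked_results, banlist=MANUAL_BANLIST):
        """Drop any result whose domain a reviewer has banned."""
        return [r for r in ranked_results if r["domain"] not in banlist]

    results = [
        {"domain": "useful-site.example", "score": 0.92},
        {"domain": "spam-content-farm.example", "score": 0.95},
    ]
    print(apply_manual_bans(results))
    # -> [{'domain': 'useful-site.example', 'score': 0.92}]

The whole appeal is that the expensive part (human judgment) is spent only on the small, high-traffic head of the distribution, while serving-time enforcement is a trivial set lookup.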

replies(2): >>42201694 #>>42209566 #
NelsonMinar ◴[] No.42209566[source]
I think that'd be a good approach. There was an idea at the time that everything had to be algorithmic, that hand-ranking or singling out individual sites was a bad idea. I.e., tweak a generic algorithm and then test what it does to the top garbage sites; don't just penalize a site directly. I think that's not a bad principle, but in practice it didn't seem to work well.
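
A rough sketch of that principle, assuming a labeled set of known-garbage sites and a swappable ranking function (all names here are hypothetical; the comment doesn't describe Google's actual tooling):

    # Evaluate a generic ranking tweak by measuring how many
    # known-garbage domains it leaves in the top k, rather than
    # hand-penalizing any individual site.
    def garbage_in_top_k(rank_fn, queries, known_garbage, k=10):
        hits = 0
        for q in queries:
            top = rank_fn(q)[:k]
            hits += sum(1 for r in top if r["domain"] in known_garbage)
        return hits

    # Tiny runnable demo with a stand-in ranker:
    demo_garbage = {"parasite-seo.example"}
    demo_ranker = lambda q: [{"domain": "useful-site.example"},
                             {"domain": "parasite-seo.example"}]
    print(garbage_in_top_k(demo_ranker, ["any query"], demo_garbage, k=2))
    # -> 1

    # The principle: compare a candidate tweak against the baseline
    # on this metric, and ship it only if the count goes down.
    # baseline  = garbage_in_top_k(current_ranker, eval_queries, garbage_set)
    # candidate = garbage_in_top_k(tweaked_ranker, eval_queries, garbage_set)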