This seems like a hard problem to solve; the incentive to be top ranked is just so high. What could be the solution? Can AI even help? Do we need to go back to manual curation after all? I remember that in the 90s there were manually curated lists of websites, something like a website directory. At this point I'd rather get a list of websites recommended by a reddit user than rely on Google's ranking.
Kagi, at least for now, is making its USP the fact that it surfaces more professional, curated results. Its algorithm is susceptible to manipulation, for sure, but unlike Google, it actually has an incentive to keep SEO garbage off the first page of results.
Part of the situation is that a company that relies on ad revenue will gradually feel the pull of the advertisers more and more.
I'd be more worried about someone nefarious buying Kagi if it got big. Someone else would be willing to pay a whole lot of money for those eyeballs.
Indeed. And perhaps part of the issue is that there is not a single solution.
Even manual curation is ultimately based on trust. If someone's trusted list of recommendations gets popular enough, what's to stop them from "selling out," breaking that trust, to make money?
Also, the good curated stuff is typically correspondingly small and focused. But lots of people want broad results in their searches, and it's hard to imagine a person or handful of people being able to cater to all of those varying needs equally well.
Sometimes a person wants excellent narrow results (e.g., an academic looking for papers), other times they want broad shallow stuff, and at various other points they want various other things in between.
There's a whole field of expertise, sometimes called "Library and Information Science" about organizing and making information findable, since long before computers existed. Even for them it is not a solved problem.
But the cat-and-mouse arms race that the online version has turned into makes libraries and asking a librarian for recommendations feel a lot more appealing (:
Take the query "2NE1 ages": I can go through this list and see you are exactly the same, except you leave up more spam.
Just one example of thousands I've looked at.
Also if you believe this is wrong you should submit search quality feedback to kagifeedback.org
We get a lot of feedback, but it is mostly for technical queries, which we usually address:
https://kagifeedback.org/t/search-quality
If you provide details about what went wrong in Kagi results for this query (and what sites should rank #1, #2... in your opinion), we can take a look. Because search quality is such a broad space, what does not get reported does not get looked at and addressed.
I think you need to take a good hard look at what makes for shitty content and build some parameters off that. And if you had to go off PageRank (to begin with), I would try things like adding hidden penalties to popular CMSes, boosting reddit/HN content, and following SEO trends just to thwart them. I would categorize websites by expertise so queries related to Korea do not rank pages from the Hindustan Times. When you search for air filters, you should probably get the HouseFresh team and not Forbes etc. The upvote system you have is moving in the right direction, but even that will need to be fortified with anti-SEO measures.
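The adjustments described above could be sketched as a post-hoc re-ranking pass over base relevance scores. To be clear, every domain, weight, and topic tag here is hypothetical and invented for illustration; this is not anyone's actual ranking logic:

```python
# Hypothetical sketch of a re-ranking pass: hidden domain penalties/boosts
# plus an expertise-match bonus. All values below are made up.

# Flat adjustments applied on top of a base relevance score.
DOMAIN_BOOSTS = {"reddit.com": 0.15, "news.ycombinator.com": 0.15}
DOMAIN_PENALTIES = {"spammy-cms.example": -0.20}  # e.g. a spam-heavy CMS footprint

# Sites tagged with topical expertise, so general "authority" sites
# do not outrank focused ones on topics outside their beat.
SITE_EXPERTISE = {
    "housefresh.com": {"air-quality"},
    "hindustantimes.com": {"india-news"},
}

def adjust_score(base_score, domain, query_topics):
    """Apply hidden boosts/penalties and an expertise-match bonus."""
    score = base_score
    score += DOMAIN_BOOSTS.get(domain, 0.0)
    score += DOMAIN_PENALTIES.get(domain, 0.0)
    if SITE_EXPERTISE.get(domain, set()) & set(query_topics):
        score += 0.25  # reward topical expertise over general authority
    return score

# An "air filters" query: the focused site overtakes the generalist one.
results = [("forbes.com", 0.80), ("housefresh.com", 0.70)]
ranked = sorted(
    results,
    key=lambda r: adjust_score(r[1], r[0], ["air-quality"]),
    reverse=True,
)
```

The point of the sketch is that a small additive layer like this can be tuned independently of the base ranker, which is what makes it a plausible place to fight SEO without rebuilding everything.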
I would try to create real E-E-A-T standards that cannot be gamed without massive investment.
It's too late to undo the damage that the Danny Sullivans of the world have done, but maybe we can save something here.