Now that a website needs not only to contain content but also to be its own advertisement, writing it in a way that maximizes virality is the natural course of action to make sure the site actually gets seen.
This will likely remain true until there is a method of finding webpages that is not based on automated scraping or on the content of the page itself.
The "fiddle with H1" or "write X amount of words" or "buy Y number of links with a % of anchor text" is silly.
Semantic HTML was created to help screen readers and browsers understand content organization; that it has been hijacked for SEO is just a side effect.
E.g. Google always had problems indexing Flash websites, and it historically had issues with sites that rely heavily on JavaScript. Nowadays that's less of a problem, at least for Googlebot.
I mean, it's pretty reasonable: if a site has been around a long time, it's generally going to be "good".
It's the same thing as the tweaks you have to perform for SEO: some have questionable value to the end user, but you jump through the hoops anyway because it's what is done; by pleasing the robots you're rewarded with a higher search position.
The only way would be to keep following links, like in the Wiki Game, and hoping to get closer to the intended target. Luckily there are huge robots that have done this for you and can tell you which links lead to your destination.
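To make the "robots following links" point concrete, here is a minimal sketch of the breadth-first link-following a crawler does. This is only an illustration under my own assumptions (the seed URL, depth and page limits are arbitrary), not how any real search engine's crawler is actually built:

    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen


    class LinkExtractor(HTMLParser):
        """Collect href values from <a> tags."""

        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)


    def crawl(seed, max_depth=2, max_pages=50):
        """Breadth-first crawl: follow links outward from the seed page."""
        seen = {seed}
        queue = deque([(seed, 0)])
        while queue and len(seen) <= max_pages:
            url, depth = queue.popleft()
            if depth >= max_depth:
                continue
            try:
                with urlopen(url, timeout=5) as resp:
                    html = resp.read().decode("utf-8", errors="replace")
            except OSError:
                continue  # unreachable page; a real crawler would retry or log
            parser = LinkExtractor()
            parser.feed(html)
            for href in parser.links:
                absolute = urljoin(url, href)
                if urlparse(absolute).scheme in ("http", "https") and absolute not in seen:
                    seen.add(absolute)
                    queue.append((absolute, depth + 1))
        return seen


    if __name__ == "__main__":
        # Hypothetical seed; a real crawler starts from many seeds and
        # indexes page content, not just the link graph.
        for page in sorted(crawl("https://example.com")):
            print(page)

The real systems obviously add ranking, deduplication, and politeness (robots.txt, rate limits) on top of this, which is exactly the part that tells you which links lead somewhere useful.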
The result is that ACTUALLY USEFUL articles are buried on page 5. Any slightly helpful bit of content in the top articles is repeated (using different grammar, of course) in all the other "top" articles.