81 points by Bogdanas | 5 comments
1. fideloper No.9975059
Not sure who the audience is here. If the text is in the DOM, it's gonna get indexed. Computers don't care if text is white on a white background.

That being said, Google indexing dynamically generated content (AJAX) is newish (I don't think it's new enough to be "news").

More interesting is whether Google can "parse" things like <noscript>, detect content hidden by CSS or text that's too small to read, and decide to rank based on that (as contributing factors to a "quality" rating).

I'd guess "yeah, probably". After all, they can rank based on whether your site is mobile-friendly, and that must take some interesting metrics to decide.

Google specifically fighting "black hat" SEO techniques is really, extremely old news. Google being good at indexing "all the things" - also old news.

replies(3): >>9975263 #>>9975377 #>>9977372 #
2. jdiez17 No.9975263
Computers might not care if text is white on white, but humans do. So just because the text is in the DOM doesn't necessarily mean it'll get indexed. It's definitely in Google's interest to fight techniques like these; they already penalize things like keyword spam.
3. juliangregorian No.9975377
Google definitely penalizes keyword stuffing via white text/small text/etc. and has done so for years.
replies(1): >>9975432 #
4. No.9975432
5. WalterGR No.9977372

    Google indexing dynamically generated content (AJAX) is newish
    (I don't think it's new enough to be "news").
They've been doing it for years.

    More interesting is whether Google can "parse" things like <noscript>,
    detect content hidden by CSS or text that's too small to read, and
    decide to rank based on that (as contributing factors to a "quality" rating).
Signs point to Googlebot using a browser, so it doesn't need to "parse" things, per se. It just loads the page and checks what happens.
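
To illustrate the idea (this is a guess at the general approach, not a description of Googlebot's actual checks), a crawler that renders pages could run something like the following TypeScript in the page context to flag text that users can't see:

    // Rough sketch only: flag elements whose text is probably invisible to users.
    // The heuristics and thresholds here are arbitrary choices for the example.
    function findHiddenText(doc: Document): HTMLElement[] {
      const flagged: HTMLElement[] = [];
      for (const el of Array.from(doc.querySelectorAll<HTMLElement>('body *'))) {
        if (!el.textContent || !el.textContent.trim()) continue;
        const style = window.getComputedStyle(el);
        const hidden = style.display === 'none' || style.visibility === 'hidden';
        const tiny = parseFloat(style.fontSize) < 4; // e.g. 1px keyword text
        // Note: backgroundColor is often "rgba(0, 0, 0, 0)" (transparent), so this
        // only catches an explicitly set matching background, e.g. white on white.
        const sameColor = style.color === style.backgroundColor;
        if (hidden || tiny || sameColor) flagged.push(el);
      }
      return flagged;
    }

    console.log(findHiddenText(document).length, 'elements with likely hidden text');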

    Google specifically fighting "black hat" SEO techniques is really,
    extremely old news. 
I have a slang dictionary website. It showed thousands of citations of slang use from major publications, TV shows, movies, etc. Google penalized the site because of that.[1]

I wanted to show the citations because they're a major way that I differentiate my site from other slang dictionaries like Urban Dictionary. So I used AJAX to load the citations, hoping that Google wouldn't index them and penalize the site.

I was wrong - but that doesn't mean what I was doing was black hat. It was the opposite.
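
For the curious, the deferred loading was conceptually just something like the sketch below (TypeScript, with a made-up endpoint, element ID, and response shape - not my actual code):

    // Hypothetical sketch: fetch the citations after the page renders instead of
    // serving them in the initial HTML. Endpoint, IDs, and types are invented.
    interface Citation { source: string; quote: string; }

    async function loadCitations(entryId: string): Promise<void> {
      const list = document.getElementById('citations');
      if (!list) return;
      try {
        const resp = await fetch(`/api/citations?entry=${encodeURIComponent(entryId)}`);
        if (!resp.ok) return;
        const citations: Citation[] = await resp.json();
        for (const c of citations) {
          const li = document.createElement('li');
          li.textContent = `${c.quote} (${c.source})`;
          list.appendChild(li);
        }
      } catch {
        // If the request fails, the entry is still readable without citations.
      }
    }

    document.addEventListener('DOMContentLoaded', () => { loadCitations('example-entry'); });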

[1] I'm fairly confident that Google penalized the site in part because of the citations, and that the penalty was applied algorithmically. But it may also be that Matt Cutts, a former Google employee and head of the web spam team, played a primary role in manually - and permanently - penalizing the site.