Google indexing dynamically generated (AJAX) content is newish, but I don't think it's new enough to be "news" - they've been doing it for years.
More interesting is whether Google can "parse" things like <noscript> content, content hidden by CSS, or text that's too small to read, and rank based on that (as contributing factors to a "quality" rating).
Signs point to Googlebot using a browser, so it doesn't need to "parse" things, per se. It just loads the page and checks what happens.
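A rough way to see what that means in practice is to compare the raw HTML a server returns with the DOM after scripts have run. This is just a sketch using puppeteer and a placeholder URL as stand-ins; it's not Google's actual rendering pipeline.

```typescript
import puppeteer from "puppeteer";

async function compareRawVsRendered(url: string): Promise<void> {
  // Raw HTML, as a crawler that doesn't execute JavaScript would see it.
  const raw = await (await fetch(url)).text();

  // Rendered DOM, after scripts (including any AJAX calls) have run.
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: "networkidle0" });
  const rendered = await page.content();
  await browser.close();

  console.log(`raw HTML: ${raw.length} chars, rendered DOM: ${rendered.length} chars`);
}

compareRawVsRendered("https://example.com").catch(console.error);
```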
Google specifically fighting "black hat" SEO techniques is extremely old news.
I have a slang dictionary website. It showed thousands of citations of slang use from major publications, TV shows, movies, etc. Google penalized the site because of that.[1]
I wanted to show the citations because they're a major way I differentiate my site from other slang dictionaries like Urban Dictionary. So I used AJAX to load the citations, hoping that Google wouldn't index them and penalize the site.
I was wrong - but that doesn't mean what I was doing was black hat. It was the opposite.
[1] I'm fairly confident that Google penalized the site in part because of the citations, and that the penalty was applied algorithmically. But it may also be the case that Matt Cutts, the former Google employee who headed the web spam team, played a primary role in manually - and permanently - penalizing the site.
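For the curious, the AJAX approach was roughly this. It's a minimal sketch: the endpoint path, element ID, and data shape here are made up for illustration, not my site's actual markup or API.

```typescript
// Citations are fetched only after the initial HTML has loaded, so they
// never appear in the page source a non-JS crawler sees.
async function loadCitations(entryId: string): Promise<void> {
  const container = document.getElementById("citations");
  if (!container) return;

  try {
    const response = await fetch(`/api/citations/${encodeURIComponent(entryId)}`);
    if (!response.ok) throw new Error(`HTTP ${response.status}`);

    const citations: { source: string; quote: string }[] = await response.json();
    container.innerHTML = citations
      .map((c) => `<blockquote>${c.quote} <cite>${c.source}</cite></blockquote>`)
      .join("");
  } catch {
    container.textContent = "Citations unavailable.";
  }
}

// Kick off the request once the DOM is ready.
document.addEventListener("DOMContentLoaded", () => loadCitations("example-entry"));
```

Since Googlebot renders JavaScript, it ends up seeing the fetched citations anyway, which is presumably why deferring them didn't help.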