
211 points by CrankyBear | 1 comment
thaumaturgy No.45107225
People outside a really small sysadmin niche don't grasp the scale of this problem.

I run a small-but-growing boutique hosting infrastructure for agency clients. The AI bot crawler problem recently got severe enough that I couldn't just ignore it anymore.

I'm stuck between, on one end, crawlers from companies that absolutely have the engineering talent and resources to do things right but still aren't, and, on the other end, resource-heavy WordPress installations where the client was told it was a build-it-and-forget-it kind of thing. I can't police their robots.txt files. Meanwhile, each page load can take a full 1s round trip (most of that spent in MySQL), there are about six pretty aggressive AI bots in play, and occasionally one of them gets stuck on some site's product-variant or category pages and starts hitting it at 1 request/sec.

There's a transparent caching layer that does a pretty nice job with images and the like, so it's not really a bandwidth problem. The bots aren't even requesting images and other page resources very often; they're just making tons and tons of page requests, and each of those ties up a DB somewhere.
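For a sense of what that layer does (zone names, paths, and the upstream name below are illustrative, not my actual config): static assets get cached at the edge, HTML doesn't.

    # http context: a cache for static assets only
    proxy_cache_path /var/cache/nginx/assets keys_zone=assets:10m
                     max_size=2g inactive=7d;

    # server context: images/CSS/JS come from the cache;
    # everything else (i.e. HTML) still hits the WP container
    location ~* \.(?:jpe?g|png|gif|webp|svg|css|js|woff2?)$ {
        proxy_cache       assets;
        proxy_cache_valid 200 301 7d;
        proxy_pass        http://wp_backend;
    }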

Cumulatively, it is close to having a site get Slashdotted every single day.

I finally started filtering out most bot and crawler traffic in nginx, before it gets passed off to a WP container. I spent a fair bit of time sampling traffic from the logs, and at a rough guess, maybe 5% of current web traffic is coming from actual humans. It's insane.
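The nginx side of that is nothing exotic: a user-agent map in front of the WP containers. Roughly like this (the UA list here is just a few of the publicly documented crawlers, not the full set):

    # http context: flag known AI crawler user agents
    map $http_user_agent $ai_bot {
        default         0;
        ~*GPTBot        1;
        ~*ClaudeBot     1;
        ~*CCBot         1;
        ~*Bytespider    1;
        ~*PerplexityBot 1;
        ~*Amazonbot     1;
    }

    # server context: reject them before PHP/MySQL ever see the request
    if ($ai_bot) {
        return 403;
    }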

I've just wrapped up the first round of work on this problem, but that's just buying a little time. Next up is putting together an IP intelligence system, because clearly these companies aren't gonna take "403" for an answer.
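The rough shape I have in mind is a geo map fed by a small include file that gets regenerated from log analysis. Sketch only; the file path and CIDRs are placeholders:

    # http context: $bad_ip = 1 for ranges flagged by log analysis
    geo $bad_ip {
        default 0;
        include /etc/nginx/bot_cidrs.conf;   # lines like "203.0.113.0/24 1;"
    }

    # server context
    if ($bad_ip) {
        return 403;   # or 444 to drop the connection without a response
    }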

replies(5): >>45107483 #>>45107586 #>>45108498 #>>45109192 #>>45110318 #
AnthonyMouse No.45109192
> meanwhile, each page load can take a full 1s round trip (most of that spent in MySQL)

Can't these responses still be cached by a reverse proxy as long as the user isn't logged in, which the bots presumably aren't?
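Something like the usual WordPress cache-bypass pattern, where only requests carrying a login or comment cookie skip the cache (zone and upstream names illustrative):

    # server context: bypass the cache for logged-in users and commenters
    set $skip_cache 0;
    if ($http_cookie ~* "wordpress_logged_in_|comment_author_") {
        set $skip_cache 1;
    }

    location / {
        proxy_cache        wpcache;        # zone defined elsewhere
        proxy_cache_bypass $skip_cache;    # don't serve these from cache
        proxy_no_cache     $skip_cache;    # ...and don't store them either
        proxy_pass         http://wp_backend;
    }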

replies(2): >>45109799 #>>45109827 #
thaumaturgy No.45109799
That would be nice! It doesn't work reliably enough for WP sites, though. Whether it's devs making changes and testing them in prod, or dynamic content served at identical URLs, my past attempts to cache HTML have caused questions and complaints. The current caching strategy strikes a nice balance and hasn't bothered anyone, with the significant downside that it's vulnerable to bot traffic.

(If you choose to read this as, "WordPress is awful, don't use WordPress", I won't argue with you.)
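If I ever do cache HTML for anonymous traffic, it would probably be microcaching on top of a cookie-bypass setup like the one suggested upthread: a TTL of a few seconds, so a bot hammering one URL mostly hits nginx while devs' prod changes still show up almost immediately. Roughly, with an illustrative TTL:

    # server/location context, alongside the cookie-bypass directives
    proxy_cache_valid     200 10s;                 # HTML stale for at most ~10s
    proxy_cache_use_stale updating error timeout;  # serve stale while refreshing
    proxy_cache_lock      on;                      # collapse concurrent misses
                                                   # into one backend request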