
211 points CrankyBear | 3 comments
thaumaturgy ◴[] No.45107225[source]
People outside a small sysadmin niche really don't grasp the scale of this problem.

I run a small-but-growing boutique hosting infrastructure for agency clients. The AI bot crawler problem recently got severe enough that I couldn't just ignore it anymore.

I'm stuck between, on one end, crawlers from companies that absolutely have the engineering talent and resources to do things right but still aren't, and on the other end, resource-heavy WordPress installations where the client was told it was a build-it-and-forget-it kind of thing. I can't police every client's robots.txt; meanwhile, each page load can take a full 1s round trip (most of that spent in MySQL), there are about 6 different pretty aggressive AI bots, and occasionally one gets stuck on a site's product-variant or category pages and starts hitting it at a sustained 1 req/s.

There's a transparent caching layer that does a pretty nice job with images and the like, so it's not really a bandwidth problem. The bots aren't even requesting images and other page resources very often; they're just doing tons and tons of page requests, and each of those ties up a DB somewhere.

Cumulatively, it is close to having a site get Slashdotted every single day.

I finally started filtering out most bot and crawler traffic at nginx, before it gets passed off to a WP container. I spent a fair bit of time sampling traffic from logs, and at a rough guess, I'd say maybe 5% of web traffic is currently coming from actual humans. It's insane.
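
For anyone else in this spot, the core of that filter is small. A minimal sketch of an nginx map on User-Agent (the UA substrings and upstream name here are illustrative, not a complete or current list, and plenty of bots lie about their UA, so this only catches the honest ones):

    # Classify requests by User-Agent before they ever reach WordPress.
    map $http_user_agent $is_ai_bot {
        default          0;
        ~*GPTBot         1;
        ~*ClaudeBot      1;
        ~*CCBot          1;
        ~*Bytespider     1;
        ~*PerplexityBot  1;
    }

    server {
        listen 80;
        server_name example.com;              # placeholder

        location / {
            if ($is_ai_bot) {
                return 403;                   # refuse before touching PHP/MySQL
            }
            proxy_pass http://wp_container;   # hypothetical upstream name
        }
    }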

I've just wrapped up the first round of work for this problem, but that's just buying a little time. Now, I've gotta put together an IP intelligence system, because clearly these companies aren't gonna take "403" for an answer.
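
The sliding-window half of that is simple enough to sketch. A minimal, hypothetical version in Python (the thresholds are made up; a real one would persist state, pull ASN/datacenter feeds, and verify self-identified crawlers via reverse DNS before trusting any UA string):

    import time
    from collections import defaultdict, deque

    WINDOW_SECS = 60      # sliding window length (illustrative)
    MAX_PER_WINDOW = 30   # per-IP budget before flagging (illustrative)

    class IPIntel:
        """Track per-IP request rates in a sliding window; keep a block list."""

        def __init__(self):
            self.hits = defaultdict(deque)  # ip -> recent request timestamps
            self.blocked = set()

        def observe(self, ip: str) -> bool:
            """Record one request from ip; return True if it should be served."""
            if ip in self.blocked:
                return False
            now = time.monotonic()
            q = self.hits[ip]
            q.append(now)
            while q and now - q[0] > WINDOW_SECS:  # expire old timestamps
                q.popleft()
            if len(q) > MAX_PER_WINDOW:
                self.blocked.add(ip)  # real system: log rDNS/ASN here too
                return False
            return True

In practice you'd export the block list to an nginx deny file or an ipset rather than sit in the request path.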

replies(5): >>45107483 #>>45107586 #>>45108498 #>>45109192 #>>45110318 #
1. jay_kyburz ◴[] No.45108498[source]
This is probably a dumb question, but at what point do we put a simple CAPTCHA in front of every new user that arrives at a site, then give them a cookie and start tracking requests per second from that user?

I guess it's a kind of soft login required for every session?

update: you could bake it into the cookie approval dialog (joke!)
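
For concreteness, the cookie half might look something like this in stdlib Python (the key and token format are placeholders; the CAPTCHA itself would be whatever vendor you pick):

    import hmac, hashlib, secrets

    SECRET = b"replace-me-with-a-real-key"  # placeholder

    def issue_session_cookie() -> str:
        """Mint a signed token once the visitor passes the CAPTCHA."""
        sid = secrets.token_urlsafe(16)
        sig = hmac.new(SECRET, sid.encode(), hashlib.sha256).hexdigest()
        return f"{sid}.{sig}"

    def verify_session_cookie(cookie: str) -> str | None:
        """Return the session id if the signature checks out, else None."""
        try:
            sid, sig = cookie.rsplit(".", 1)
        except ValueError:
            return None
        expected = hmac.new(SECRET, sid.encode(), hashlib.sha256).hexdigest()
        return sid if hmac.compare_digest(sig, expected) else None

Requests-per-second tracking would then key off the verified session id instead of the IP; no valid cookie, or over budget, means back to the CAPTCHA.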

replies(1): >>45109189 #
2. thaumaturgy ◴[] No.45109189[source]
The post-AI web is already a huge mess. I'd prefer solutions that don't make it worse.

I myself browse with cookies off most of the time, and the number of times per day that I have to click a Cloudflare checkbox or help Google classify objects for its datasets is nuts.

replies(1): >>45109233 #
3. dragonwriter ◴[] No.45109233[source]
> The post-AI web is already a huge mess.

You mean the peri-AI web? Or is AI already done and over and no longer exerting an influence?