
597 points classichasclass | 3 comments
1. herbst No.45011011
More than half of my traffic is Bing, Claude, and, for whatever reason, the Facebook bots.

None of these are my main traffic drivers, just the main resource hogs. They're also the main reason my site turns slow (usually an AI crawler, Microsoft, or Facebook ignoring any common sense).

China and co. are only a very small portion of my malicious traffic, thankfully. It's usually US companies disrespecting my robots.txt and DNS rate limits that cause me the most problems.
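
For reference, this is roughly what a crawler that does respect robots.txt is supposed to do before fetching anything: read the file, check each URL against its user agent, and honor any declared Crawl-delay. A minimal stdlib-Python sketch; the domain and the "ExampleBot" user agent are hypothetical placeholders:

    import time
    import urllib.robotparser

    # Polite-crawler behavior: consult robots.txt first. The domain and
    # user agent are placeholders, not any real crawler's values.
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()

    UA = "ExampleBot"
    delay = rp.crawl_delay(UA) or 1.0  # honor Crawl-delay; default to 1s

    for url in ("https://example.com/", "https://example.com/page/2"):
        if rp.can_fetch(UA, url):   # skip anything robots.txt disallows
            # ... fetch url here ...
            time.sleep(delay)       # space out requests instead of hammering

The bots I'm complaining about skip exactly these checks.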

replies(1): >>45011070
2. devoutsalsa No.45011070
There are a lot of dumb questions, and I pose all of them to Claude. There's no infrastructure in place for this, but I would support a business model where my LLM of choice compensates website operators for the resources consumed by my super dumb questions, like how content creators get paid when I watch through a YouTube Premium subscription. I doubt it's practical, though.
replies(1): >>45012139
3. herbst No.45012139
For me it looks more like out-of-control bots than average requests. For example, a few days ago I blocked a few bots: Google was about 600 requests in 24 hours, Bing about 1,500; Facebook is mostly blocked right now; Claude, with 3 different bot types, was about 100k requests in the same period.

There is no reason to query all my sub-sites; the site is like a search engine with way too many theoretical pages.

Facebook also did aggressive daily indexing of way too many pages, using large IP ranges, until I blocked it. I get maybe one user per week from them; no idea what they want.
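
Blocking a crawler that spreads across large IP ranges means matching whole CIDR blocks rather than single addresses. A stdlib-Python sketch; the ranges below are RFC 5737 documentation placeholders, not Facebook's actual published ranges:

    import ipaddress

    # Illustrative ranges only -- substitute the crawler's published
    # CIDR blocks (e.g. looked up from its ASN) in a real deployment.
    BLOCKED_NETS = [
        ipaddress.ip_network("203.0.113.0/24"),   # TEST-NET-3 placeholder
        ipaddress.ip_network("198.51.100.0/24"),  # TEST-NET-2 placeholder
    ]

    def is_blocked(remote_addr: str) -> bool:
        """True if the client address falls inside any blocked range."""
        ip = ipaddress.ip_address(remote_addr)
        return any(ip in net for net in BLOCKED_NETS)

    print(is_blocked("203.0.113.7"))  # True
    print(is_blocked("192.0.2.1"))    # False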

And Bing, I learned, "simply" needs hard-enforced rate limits, which it kinda learns to respect.
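
In case it's useful: "hard enforced" boils down to a per-client token bucket that answers 429 Too Many Requests with a Retry-After header once the budget is spent; a crawler that wants to keep crawling generally slows down when it keeps seeing those. A stdlib-Python sketch; the rate, burst size, and keying on user agent are illustrative assumptions, not my actual config:

    import time
    from collections import defaultdict
    from wsgiref.simple_server import make_server

    RATE = 1.0    # tokens refilled per second (assumed budget)
    BURST = 10.0  # maximum bucket size

    buckets = defaultdict(lambda: {"tokens": BURST, "ts": time.monotonic()})

    def allow(key: str) -> bool:
        """Refill this client's bucket, then try to spend one token."""
        b = buckets[key]
        now = time.monotonic()
        b["tokens"] = min(BURST, b["tokens"] + (now - b["ts"]) * RATE)
        b["ts"] = now
        if b["tokens"] >= 1.0:
            b["tokens"] -= 1.0
            return True
        return False

    def app(environ, start_response):
        # Keyed on user agent here; keying on REMOTE_ADDR works the same.
        key = environ.get("HTTP_USER_AGENT", "unknown")
        if not allow(key):
            start_response("429 Too Many Requests", [("Retry-After", "60")])
            return [b"Slow down.\n"]
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"Hello.\n"]

    if __name__ == "__main__":
        make_server("", 8000, app).serve_forever()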