
Web Bot Auth

(developers.cloudflare.com)
81 points ananddtyagi | 7 comments
realityfactchex ◴[] No.45056907[source]
No offense, but screw CloudFlare, screw their captchas for humans, and screw their wedging themselves between web operators and web users.

They can offer what they want for bots. But stop ruining the experience for humans first.

replies(1): >>45057258 #
tick_tock_tick ◴[] No.45057258[source]
> screw their wedging themselves between web operators and web users

Web operators choose to use them; hell, they even pay Cloudflare to sit between them and their users. Seriously, I just think you don't understand how bad it is to run a site without someone in front of it.

replies(3): >>45057963 #>>45058439 #>>45059305 #
1. immibis ◴[] No.45058439[source]
They don't have to, but they're tricked into doing so via marketing.
replies(1): >>45058866 #
2. acdha ◴[] No.45058866[source]
I miss the 90s, too, but these days anyone who wants to deal with current levels of bot traffic is probably going to look at a service like Cloudflare as much cheaper than the amount of ops time they’d otherwise spend keeping things up and secure.
replies(1): >>45064696 #
3. immibis ◴[] No.45064696[source]
You could just, like, not make a website that takes several seconds to handle each request.

I let bots hit Gitea 2-3 times per second on a $10/month VPS, and the only actual problem was that it doesn't seem to ever delete zip snapshots, filling up the disk when enough snapshot links are clicked. So I disabled that feature by setting the snapshots folder read-only. There were no other problems. I mention Gitea because people complain about having to protect Gitea a lot, for some reason.
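The mitigation described above can be sketched in a couple of shell commands. This is only an illustration of the read-only trick, not Gitea's documented configuration; a temp directory stands in for the real archive path (commonly something like a `repo-archive` directory under Gitea's data dir, which is an assumption here).

```shell
# Sketch of the mitigation from the comment above: strip write permission
# from the snapshot/archive directory so the service can no longer fill
# the disk with zip snapshots. A temp dir stands in for the hypothetical
# real path (e.g. /var/lib/gitea/data/repo-archive).
ARCHIVE_DIR=$(mktemp -d)
chmod a-w "$ARCHIVE_DIR"
stat -c '%a' "$ARCHIVE_DIR"   # write bits are gone (0700 -> 0500)
```

Note that this blocks writes by the service's unprivileged user; a process running as root would bypass the permission bits.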

replies(1): >>45068023 #
4. acdha ◴[] No.45068023{3}[source]
Sure, I’ve been doing that since the 90s. I still pay for hardware and egress, and it turns out that everything has limits for the amount of traffic it can handle which bots can easily saturate. I’ve had sites which were mostly Varnish serving cached content at wire speed go down because they saturated the upstream.
replies(1): >>45069540 #
5. immibis ◴[] No.45069540{4}[source]
I hope 2-3 requests per second is not that limit, or you're fucked.
replies(2): >>45069837 #>>45070349 #
6. acdha ◴[] No.45069837{5}[source]
It’s not, but you’re off by 3+ orders of magnitude on the traffic volume and ignoring the cost of serving non-trivial responses.
7. vntok ◴[] No.45070349{5}[source]
It is on a simple WordPress install with the four most-used plugins, when you don't have a caching reverse proxy like Cloudflare to filter bad traffic and serve fully cached pages from POPs near the visitors.

The alternative, of course, is to set up a server-side caching layer (like Redis), which most people who set up a WordPress blog don't have the first idea how to do securely.
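For comparison, a self-hosted caching reverse proxy doing roughly what the comment attributes to Cloudflare can be sketched as an nginx fragment. This is a hedged, minimal example, not a production config: the backend address, zone name, and TTLs are all assumptions.

```nginx
# Hypothetical nginx fragment: a self-hosted caching reverse proxy in
# front of WordPress, standing in for what a CDN does at its POPs.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=wp:10m
                 max_size=1g inactive=10m;

server {
    listen 80;
    location / {
        proxy_pass http://127.0.0.1:8080;   # assumed WordPress backend
        proxy_cache wp;
        proxy_cache_valid 200 5m;           # serve cached pages for 5 min
        proxy_cache_use_stale error timeout updating;
    }
}
```

Even this simple setup means most bot traffic hits the cache instead of PHP, which is the difference the parent comments are arguing about.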