556 points | campuscodi | 1 comment

amatecha ◴[] No.41867018[source]
I get blocked from websites with some regularity, running Firefox with strict privacy settings, "resist fingerprinting" etc. on OpenBSD. They just give a 403 Forbidden with no explanation, but it's only ever on sites fronted by CloudFlare. Good times. Seems legit.
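(Aside: it's usually easy to confirm a mystery 403 is coming from Cloudflare's edge rather than the origin, since Cloudflare-served responses normally carry a "server: cloudflare" and a "cf-ray" response header. A quick check, with a placeholder URL:)

    # Print the status code, then look for Cloudflare's telltale headers.
    # https://example.com/ stands in for the blocking site.
    curl -sI -o /dev/null -w '%{http_code}\n' https://example.com/
    curl -sI https://example.com/ | grep -i -e '^server:' -e '^cf-ray:'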
replies(13): >>41867245 #>>41867420 #>>41867658 #>>41868030 #>>41868383 #>>41868594 #>>41869190 #>>41869439 #>>41869685 #>>41869823 #>>41871086 #>>41873407 #>>41873926 #
viraptor ◴[] No.41867420[source]
I know it's not a solution for you specifically here, but if anyone has access to the CF enterprise plan, they can report specific traffic as non-bot and hopefully improve the situation. They need to have access to the "Bot Management" feature though. It's a shitty situation, but some of us here can push back a little bit - so do it if you can.

And yes, it's sad that the "make internet work again" option is behind an expensive paywall.

replies(1): >>41868257 #
meeby ◴[] No.41868257[source]
The issue here is that RSS readers are bots. Obviously perfectly sensible and useful bots, but they’re not “real people using a browser”. I doubt you could get RSS readers onto Cloudflare’s “good bots” list either, which would let them through the default bot protection, given they’ll all run off random residential IPs.
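For a rough illustration of how that distinction shows up on the wire, compare the same fetch under two User-Agent strings (the URLs and the reader UA below are made up, and in practice Cloudflare weighs far more than the UA, e.g. TLS fingerprints and JS challenges):

    # Same feed, two User-Agent strings (both placeholders).
    curl -s -o /dev/null -w 'reader UA:  %{http_code}\n' \
      -A 'SomeFeedReader/1.0 (+https://example.com/bot)' \
      https://example.com/feed.xml
    curl -s -o /dev/null -w 'browser UA: %{http_code}\n' \
      -A 'Mozilla/5.0 (X11; Linux x86_64; rv:131.0) Gecko/20100101 Firefox/131.0' \
      https://example.com/feed.xml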
replies(3): >>41868668 #>>41868842 #>>41872245 #
sam345 ◴[] No.41868842[source]
Not sure if I get this. It seems to me an RSS reader is as much of a bot as a browser is for HTML. It just reads RSS rather than HTML.
replies(1): >>41869616 #
kccqzy ◴[] No.41869616[source]
The difference is that RSS readers usually do background fetches on their own rather than waiting for a human to navigate to a page. So in theory, you could set up a crontab (or systemd timer) that simply runs xdg-open on various pages on a schedule and not be treated as a bot, because the fetches would go through a real browser; see the sketch below.
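A minimal sketch of that crontab idea (the URL and schedule are placeholders; xdg-open run from cron also needs the desktop session's environment to reach a browser):

    # Hypothetical entry for "crontab -e": open a page in the logged-in
    # user's default browser every 30 minutes. DISPLAY (and on some
    # setups DBUS_SESSION_BUS_ADDRESS) must point at a live session.
    */30 * * * * DISPLAY=:0 xdg-open 'https://example.com/blog' >/dev/null 2>&1

Every fetch would then carry a real browser's fingerprint (TLS stack, headers, JavaScript execution), so it passes checks that 403 a headless RSS reader making the same request.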