
1343 points | Hold-And-Modify

Hello.

Cloudflare's Browser Integrity Check/Verification/Challenge feature, used by many websites, is denying access to users of non-mainstream browsers like Pale Moon.

User reports began on January 31:

https://forum.palemoon.org/viewtopic.php?f=3&t=32045

This situation occurs at least once a year, and there is no easy way to contact Cloudflare. Their "Submit feedback" tool yields no results. A Cloudflare Community topic was flagged as "spam" by members of that community and promptly locked, with no real solution and no official response from Cloudflare:

https://community.cloudflare.com/t/access-denied-to-pale-moo...

Partial list of other browsers that are being denied access:

Falkon, SeaMonkey, IceCat, Basilisk.

A 2022 Hacker News post about the same issue brought attention to it, and Cloudflare quickly patched the problem:

https://news.ycombinator.com/item?id=31317886

A Cloudflare product manager declared back then: "...we do not want to be in the business of saying one browser is more legitimate than another."

As of now, there has been no official response from Cloudflare, and their tool is still denying access.

ai-christianson ◴[] No.42954365[source]
How many of you all are running bare metal hooked right up to the internet? Is DDoS or any of that actually a super common problem?

I know it happens, but also I've run plenty of servers hooked directly to the internet (with standard *nix security precautions and hosting provider DDoS protection) and haven't had it actually be an issue.

So why run absolutely everything through Cloudflare?

replies(20): >>42954540 #>>42954566 #>>42954576 #>>42954719 #>>42954753 #>>42954770 #>>42954846 #>>42954917 #>>42954977 #>>42955107 #>>42955135 #>>42955479 #>>42956166 #>>42956201 #>>42956652 #>>42957837 #>>42958038 #>>42958248 #>>42963387 #>>42964892 #
matt_heimer ◴[] No.42954977[source]
Yes, [D]DoS is a problem. It's not uncommon for a single person with residential fiber to have more bandwidth than your small site hosted on a 1u box or VPS. Either your bandwidth is rate limited and they can deny service to your site, or your bandwidth is greater but they can still push you over your allocation and cause massive charges.
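
To put rough numbers on that (every figure below is an illustrative assumption, not something from this comment):

    # Back-of-envelope DDoS arithmetic; all figures are assumptions.
    attacker_bps = 1_000_000_000     # one residential fiber line, 1 Gbps upstream
    server_bps = 100_000_000         # a typical small VPS port, 100 Mbps
    overage_usd_per_gb = 0.09        # hypothetical per-GB transfer overage charge

    # A single attacker can saturate the server's port many times over:
    print(attacker_bps / server_bps)                 # 10.0

    # Even below saturation, sustained traffic burns through a transfer cap:
    gb_per_day = attacker_bps / 8 / 1e9 * 86_400     # bits/s -> GB per 24 h
    print(f"{gb_per_day:,.0f} GB/day, ~${gb_per_day * overage_usd_per_gb:,.0f}/day in overage")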

In the past you could ban IPs but that's not very useful anymore.

The distributed attacks tend to be AI companies that assume every site has infinite bandwidth, and their crawlers tend to run out of many different regions at once.
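
A toy sketch of why that breaks per-IP banning (the threshold and logic are illustrative, not anyone's actual setup):

    from collections import Counter

    THRESHOLD = 1000   # requests per window before a ban; purely illustrative
    hits = Counter()
    banned = set()

    def allow(ip: str) -> bool:
        # The classic defense: count requests per source address and ban the noisy ones.
        hits[ip] += 1
        if hits[ip] > THRESHOLD:
            banned.add(ip)
        return ip not in banned

    # Failure mode: a crawler fleet spread across thousands of addresses keeps
    # every individual IP under THRESHOLD while the aggregate load matches one
    # very aggressive client, so nothing ever gets banned.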

Even if you aren't dealing with attacks or outages, Cloudflare's caching features can save you a ton of money.
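
The savings come from plain HTTP caching: if the origin marks responses cacheable, the CDN answers repeat requests from its edge instead of your server. A minimal origin-side sketch using only the Python standard library (the max-age value is an arbitrary choice):

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = b"hello from the origin\n"
            self.send_response(200)
            # Any cache in front (a CDN edge included) may now serve this
            # response for a day without touching the origin:
            self.send_header("Cache-Control", "public, max-age=86400")
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    HTTPServer(("", 8080), Handler).serve_forever()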

For those who haven't used Cloudflare: most sites only need their free tier offering.

It's hard to say no to a free service that provides the features you need.

Source: I went over a decade hosting a site without a CDN before it became too difficult to deal with. Basically, I spent 3 days straight banning IPs at the hosting-company level, tuning various rate-limiting web server modules, and even scaling the hardware to double the capacity. None of it could keep the site online 100% of the time. Within 30 minutes of trying Cloudflare it was working perfectly.

replies(2): >>42955258 #>>42959421 #
Aachen ◴[] No.42959421[source]
> not uncommon for a single person with residential fiber to have more bandwidth than your small site hosted on a 1u box or VPS.

Then self-host from your connection at home and skip paying for the VPS :). That's what I've been doing for over a decade now, and I still have never seen a (D)DoS attack.

50 Mbps has been enough to host various websites, including one site that has allowed several gigabytes of unauthenticated file uploads for most of the time I've been self-hosting. I must say that 100 Mbps is nicer, though, even if not strictly necessary. More is always nicer, but the returns really diminish past 100 (in 2025, for my use case). It's probably different if you host videos, a Tor relay, etc.; I'm just talking about normal websites.
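
For a sense of scale (the page weight below is an assumption, not a figure from this comment):

    # What a 50 Mbps uplink serves, roughly; the 2 MB page weight is assumed.
    uplink_bps = 50_000_000
    page_bytes = 2_000_000           # a fairly heavy page, HTML plus assets

    pages_per_second = uplink_bps / 8 / page_bytes
    print(f"{pages_per_second:.1f} full page loads per second")   # ~3.1

    # Sustained, that is on the order of 270,000 page views per day --
    # far more than most personal or small-business sites ever see.
    print(f"{pages_per_second * 86_400:,.0f} pages/day")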

replies(1): >>42961052 #
lucumo ◴[] No.42961052[source]
> 50 Mbps has been enough to host various websites,

Bandwidth hasn't been a limiting factor for years for me.

But generating dynamic pages can bring just enough load for it to get painful. Just this week I had to blacklist Meta's ridiculously overactive bot, which was sending me more requests per second than all my real users do in an hour. Meta and ClaudeBot have been causing intermittent overloads for weeks now.

They now get 403s because I'm done trying to slow them down.
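
A minimal sketch of that kind of blanket refusal as WSGI middleware (the user-agent substrings are assumptions about how these crawlers identify themselves; check your own access logs before relying on them):

    import re

    BLOCKED = re.compile(r"meta-externalagent|facebookexternalhit|ClaudeBot", re.I)

    def block_bots(app):
        # Wrap any WSGI app; matching user agents get a flat 403 with no
        # page generation, which is the whole point.
        def middleware(environ, start_response):
            ua = environ.get("HTTP_USER_AGENT", "")
            if BLOCKED.search(ua):
                start_response("403 Forbidden", [("Content-Type", "text/plain")])
                return [b"Forbidden\n"]
            return app(environ, start_response)
        return middleware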