
642 points scalewithlee | 7 comments
1. netsharc No.43793866
Huh, a bit like "adult-content" filters that censor Scunthorpe or Wikipedia articles about genitals. Maybe Cloudflare saw a market in selling protection to donkeys who can't protect their webapps from request injection.
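To make the analogy concrete, here is a minimal sketch of how a context-free substring filter produces Scunthorpe-style false positives. The blocklist and function are invented for illustration and are not Cloudflare's actual rule logic:

```python
# Hypothetical sketch of the failure mode being described: a filter that
# scans raw text for "dangerous" substrings, with no notion of context,
# flags benign content the same way crude profanity filters flag "Scunthorpe".
BLOCKED_SUBSTRINGS = ["/etc/hosts", "/etc/passwd", "<script", "union select"]

def naive_waf_block(body: str) -> bool:
    """Return True if the request body contains any blocked substring."""
    lowered = body.lower()
    return any(token in lowered for token in BLOCKED_SUBSTRINGS)

# A blog post *about* Linux configuration trips the rule exactly like a
# real path-traversal payload would:
print(naive_waf_block("Add the entry to /etc/hosts and restart dnsmasq"))  # True
print(naive_waf_block("GET /../../etc/passwd attempt"))                    # True
print(naive_waf_block("A post about gardening"))                           # False
```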
2. tester756 No.43793890
Who knows how many attacks such a "stupid" thing blocks every month?
3. eli No.43793904
You figured all that out just because the headers indicate the site passed through Cloudflare at one point? That's quite a leap!

If Cloudflare had a default rule that made it impossible to write that string on any site with their WAF, wouldn't this be a lot more widespread? Much more likely someone entered a bad rule into Cloudflare, or Cloudflare isn't involved in that rule at all.
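One rough way to test this line of reasoning, assuming you control a couple of Cloudflare-fronted endpoints (the URLs and payload below are hypothetical placeholders), is to replay the blocked string against several sites and see whether the rejection is universal (suggesting a default managed rule) or specific to one site (suggesting a custom rule):

```python
# Sketch only: probe endpoints you own or have permission to test.
import requests

PAYLOAD = {"comment": "see /etc/hosts for details"}  # stand-in for the blocked string
SITES = [
    "https://site-you-control.example/comment",       # hypothetical endpoints
    "https://another-cf-fronted.example/comment",
]

for url in SITES:
    resp = requests.post(url, data=PAYLOAD, timeout=10)
    behind_cf = "cf-ray" in resp.headers  # rough signal that Cloudflare handled the response
    print(f"{url}: HTTP {resp.status_code}, cloudflare={behind_cf}")
```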

5. rainforest No.43794057
I think Cloudflare WAF is a good product compared to other WAFs. By definition, a WAF is intended to layer on validation that properly built applications should already be doing, so it's sort of expected that it will reject valid content that merely looks harmful.

I think you can fairly criticise WAF products and the people who advocate for them (and created the need for them) but I don't think the CF team responsible can really be singled out.
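As an illustration of the application-level validation being referred to, here is a minimal sketch (using Python's sqlite3 purely as a stand-in) where a parameterized query treats an attack-looking string as plain data, which is exactly the context a string-matching WAF rule lacks:

```python
# With a parameterized query, a hostile-looking string is just data,
# so the application does not need a WAF to reject it.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE comments (body TEXT)")

user_input = "'; DROP TABLE comments; -- see /etc/hosts"  # looks like an attack payload

# Placeholders (?) keep the input out of the SQL parser entirely.
conn.execute("INSERT INTO comments (body) VALUES (?)", (user_input,))

print(conn.execute("SELECT body FROM comments").fetchone()[0])
```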

6. kccqzy No.43794219
Unfortunately this is probably a case where the market demands stupidity. The quality engineers don't have a say over market forces.
7. arnaudsm No.43794539
These WAF features are older than LLMs & vibe coding.