
    119 points bavarianbob | 20 comments

    EDIT: Back online?!

    NPM discussion: https://github.com/npm/cli/issues/8203

    NPM incident: https://status.npmjs.org/incidents/hdtkrsqp134s

    Cloudflare messaging: https://www.cloudflarestatus.com/incidents/gshczn1wxh74

    GitHub issue: https://github.com/sindresorhus/camelcase/issues/114

    Anyone experiencing an npm outage that's more than just the referenced camelcase package?

    1. tom_usher ◴[] No.43548817[source]
    Seems to be a change in Cloudflare's managed WAF ruleset - any site using that will have URLs containing 'camel' blocked due to the 'Apache Camel - Remote Code Execution - CVE:CVE-2025-29891' (a9ec9cf625ff42769298671d1bbcd247) rule.

    That rule can be overridden if you're having this issue on your own site.
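
    A minimal sketch of the failure mode, assuming (hypothetically; the actual rule's pattern isn't public) that the CVE signature ends up matching the bare substring "camel" in the request path:

    ```python
    import re

    # Hypothetical, oversimplified WAF signature for the Apache Camel RCE
    # (CVE-2025-29891). This is NOT Cloudflare's actual rule, just an
    # assumption to illustrate how a broad signature false-positives:
    # it matches "camel" anywhere in the request path.
    CAMEL_RULE = re.compile(r"camel", re.IGNORECASE)

    def waf_blocks(path: str) -> bool:
        """Return True if the naive signature would block this request path."""
        return bool(CAMEL_RULE.search(path))

    # A path actually probing Apache Camel matches...
    print(waf_blocks("/org.apache.camel/ExecComponent"))  # True
    # ...but so does a legitimate npm registry package URL.
    print(waf_blocks("/camelcase"))                       # True
    print(waf_blocks("/lodash"))                          # False
    ```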

    replies(3): >>43549123 #>>43550078 #>>43550699 #
    2. cbovis ◴[] No.43549123[source]
    Confirmed here: https://www.cloudflarestatus.com/incidents/gshczn1wxh74
    3. oncallthrow ◴[] No.43550078[source]
    WAFs are so shit
    replies(2): >>43550728 #>>43552419 #
    4. internetter ◴[] No.43550699[source]
    > any site using that will have URLs containing 'camel' blocked

    What engineer at cloudflare thought this was a good resolution?

    replies(2): >>43550780 #>>43553343 #
    5. ronsor ◴[] No.43550728[source]
    WAFs are literally "a pile of regexes can secure my insecure software"
    replies(2): >>43551360 #>>43555585 #
    6. Raed667 ◴[] No.43550780[source]
    I doubt the system is that simple. No one wrote a rule saying `if url.contains("camel") then block()`; it's probably an unintended side effect.
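
    One plausible way such a side effect arises (a guess, not the actual rule; both patterns below are made up for illustration): a signature meant to be anchored to an Apache Camel indicator gets a mis-scoped optional group, and suddenly the bare substring matches on its own:

    ```python
    import re

    # Hypothetical patterns; neither is Cloudflare's real signature.
    # The "intended" rule anchors on an Apache Camel indicator...
    intended = re.compile(r"apache[-._/]?camel", re.IGNORECASE)

    # ...while a seemingly small edit (making the prefix optional)
    # silently turns it into "block anything containing 'camel'".
    overbroad = re.compile(r"(?:apache[-._/]?)?camel", re.IGNORECASE)

    for path in ("/org.apache.camel/exploit", "/camelcase", "/package/camelcase"):
        print(path, bool(intended.search(path)), bool(overbroad.search(path)))
    ```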
    replies(3): >>43551463 #>>43551871 #>>43558525 #
    7. mschuster91 ◴[] No.43551360{3}[source]
    To be fair to WAFs, most are more than just a pile of regexes. Things like detecting bot traffic - be it spammers or AI scrapers - are valuable (ESPECIALLY the AI scraper detection, because unlike search engines these things have zero context recognition or respect for robots.txt and will just happily go on and ingest very heavy endpoints), and the large CDN/WAF providers can do it even better because they can spot shit like automated port scanners, Metasploit or similar skiddie tooling across all the services that use them.

    Honestly what I'd _love_ to see is AWS, GCE, Azure, Fastly, Cloudflare and Akamai banding together to share information about such bad actors, compile evidence, and file abuse reports with their ISPs - or, in case the ISP is a "bulletproof hoster" or sits in certain enemy states, get enforcement actors like governments to disconnect these bad ISPs from the Internet.

    replies(1): >>43553871 #
    8. keithwhor ◴[] No.43551463{3}[source]
    If this is a bet, I'll happily take the other side and give you 4:1 on it.
    replies(1): >>43551525 #
    9. dgfitz ◴[] No.43551525{4}[source]
    Me too.
    10. ycombinatrix ◴[] No.43551871{3}[source]
    Akamai has been doing precisely that for years & years...
    replies(2): >>43551942 #>>43552682 #
    11. ◴[] No.43551942{4}[source]
    12. UltraSane ◴[] No.43552419[source]
    But are they less shit than the shitty software they filter traffic for?
    13. benoau ◴[] No.43552682{4}[source]
    I think you can include advertising/privacy block lists in that vein too, although that allows for the users to locally-correct any issues.
    14. randunel ◴[] No.43553871{4}[source]
    Why would scrapers get blocked, is scraping illegal?
    replies(2): >>43554078 #>>43554375 #
    15. eitland ◴[] No.43554078{5}[source]
    I don't know if it is, but I also don't think we are required to let dumb bots repeatedly assault our web sites if we can find a technical way to get around it.
    16. Xylakant ◴[] No.43554375{5}[source]
    It's very often not, but it's still the website owner's property, and if they so choose, they can show misbehaving guests the door and kindly ask them to remain on the other side (aka block them). Large-scale scraping puts a substantial burden on web properties. I was paged the other night because someone decided it would be a great idea to throw 200,000 rq/s for a few minutes at a publicly available volunteer-run service.
    17. cluckindan ◴[] No.43555585{3}[source]
    They do mitigate known vulnerabilities.
    replies(1): >>43566888 #
    18. isbvhodnvemrwvn ◴[] No.43558525{3}[source]
    Judging by previous outages, it was probably a poorly tested, overcomplicated regex which matched too much.
    19. rcxdude ◴[] No.43566888{4}[source]
    They may mitigate known proofs of concept of vulnerabilities, which often take only a small amount of creativity to work around. At the cost of randomly breaking things.
    replies(1): >>43568248 #
    20. cluckindan ◴[] No.43568248{5}[source]
    That creativity takes time. WAFs are the first line of defence, buying some time for fixing the actual vulnerabilities.