
278 points jwilk | 1 comment
arp242 ◴[] No.44382233[source]
A lot of these "security bugs" are not really "security bugs" in the first place. Denial of service is not resulting in people's bank accounts being emptied or nude selfies being spread all over the internet.

Things like "panics on certain content" like [1] or [2] are "security bugs" now. By that standard anything that fixes a potential panic is a "security bug". I've probably fixed hundreds if not thousands of "security bugs" in my career by that standard.
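To make the point concrete, here's a toy sketch (hypothetical parser, not the actual gomarkdown code from [1]) of the pattern these advisories describe: a library that crashes on certain content, and an application boundary that contains the crash instead of going down.

```python
def parse_heading(line: str) -> str:
    """Toy Markdown-ish parser: assumes text follows the '#' markers."""
    level = len(line) - len(line.lstrip("#"))
    # Bug: indexes past the end when the line is empty or all '#' characters.
    if line[level] != " ":
        raise ValueError("malformed heading")
    return line[level + 1:]

def render_safely(line: str) -> str:
    # The application, not the library, decides whether a crash is a
    # security problem: here it degrades to raw text instead of dying.
    try:
        return parse_heading(line)
    except (ValueError, IndexError):
        return line
```

Whether `parse_heading`'s crash is a "security bug" or just a bug depends entirely on whether anything upstream does what `render_safely` does.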

These barely qualify as "security bugs", yet they're rated "6.2 Moderate" and "7.5 HIGH". To say nothing of the gazillion "high severity" "regular expression DoS" nonsense and whatnot.

And the worst part is all of this makes it so much harder to find actual high-severity issues. It's not harmless spam.

[1]: https://github.com/gomarkdown/markdown/security/advisories/G...

[2]: https://rustsec.org/advisories/RUSTSEC-2024-0373.html

viraptor ◴[] No.44382855[source]
> Denial of service is not resulting in ...

DoS results in whatever the system happens to do. It may well result in bad things happening, for example stopping AV from scanning new files, breaking rate limiting systems to allow faster scanning, hogging all resources on a shared system for yourself, etc. It's rarely a security issue in isolation, but libraries are never used in isolation.

bastawhiz ◴[] No.44383134[source]
An AV system stopping because of a bug in a library is bad, but that's not because the library has a security bug. It's a security problem because the system itself does security. It would be wild if any bug that leads to a crash or a memory leak was a "security" bug because the library might have been used by someone somewhere in a context that has security implications.

A bug in a library that does rate limiting arguably is a security issue because the library itself promises to protect against abuse. But if I make a library for running Lua in redis that ends up getting used by a rate limiting package, and my tool crashes when the input contains emoji, that's not a security issue in my library if the rate limiting library allows emails with punycode emoji in them.

"Hogging all of the resources on a shared system" isn't a security bug, it's just a bug. Maybe an expensive one, but hogging the CPU or filling up a disk doesn't mean the system is insecure, just unavailable.

The argument that downtime or runaway resource use is a security issue, but only if the problem is in someone else's code, is some Big Brained CTO way of passing the buck onto open source software. If that were true, Postgres autovacuuming with its unpleasant default configuration would be up there with Heartbleed.

Maybe we need a better way of alerting downstream users of packages when important bugs are fixed. But jamming these into CVEs and giving them severities above 5 is just alert noise and makes it confusing to understand what issues an organization should actually care about and fix. How do I know that the quadratic time regexp in a string formatting library used in my logging code is even going to matter? Is it more important than a bug in the URL parsing code of my linter? It's impossible to say because that responsibility was passed all the way downstream to the end user. Every single person needs to make decisions about what to upgrade and when, which is an outrageous status quo.
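For what it's worth, the "quadratic time regexp" class of report does describe a real mechanism; a minimal sketch (using Python's backtracking `re` engine, which exhibits the behavior) shows why it exists at all, and why its severity is context-dependent:

```python
import re
import time

# A nested quantifier like (a+)+ forces a backtracking engine to try
# every way of splitting a run of 'a's before concluding there is no
# match, so work grows exponentially with input length on non-matching
# input. Whether that's a 7.5 HIGH or irrelevant depends entirely on
# whether attackers control the input reaching the pattern.
PATTERN = re.compile(r"(a+)+$")

def match_time(n: int) -> float:
    """Time one match attempt against an adversarial input of length n+1."""
    start = time.perf_counter()
    PATTERN.match("a" * n + "b")  # never matches; forces full backtracking
    return time.perf_counter() - start
```

A benign matching input of the same length returns on the first try; each extra `a` in the adversarial input roughly doubles the work. That asymmetry is the whole bug — and the whole reason its real-world severity can't be assigned by the library author alone.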

lmeyerov ◴[] No.44383817[source]
Traditional security follows the CIA triad: Confidentiality (e.g. data leaks), Integrity (e.g. data deletion), and Availability (e.g. site down). SOC2 compliance, for example, typically has you define where you stand on each of these.

Does availability not matter to you? Great. For others it does: a medical device segfaulting or OOMing in an unmanaged way on a config upload is not good. 'Availability' has been a common security concern for maybe 40 years now, from an industry view.
