
103 points by voxadam | 1 comment
pedrozieg No.46215560
CVE counts are such a good example of “what’s easy to measure becomes the metric”. The moment Linux became a CNA and started issuing its own CVEs at scale, it was inevitable that dashboards would show “Linux #1 in vulnerabilities”, with readers not realizing that what changed was the paperwork, not the quality of the code. A mature process whose maintainers actually file CVEs for real bugs looks “less secure” than a project that quietly ships fixes and never bothers with the bureaucracy.

If Greg ends up documenting the tooling and workflow in detail, I hope people copy it rather than the vanity scoring. For anyone running Linux in production, the useful question is “how do I consume linux-cve-announce and map it to my kernels and threat model”, not “is the CVE counter going up”. Treat CVEs like a structured changelog feed, not a leaderboard.
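To make “map it to my kernels” concrete, here is a minimal sketch of that idea: take one of the per-CVE JSON records the kernel CVE team publishes and check whether a given kernel version falls inside an affected range. The field names (containers.cna.affected[].versions[] with version / lessThan / status) are assumptions based on the generic CVE 5.x record format, not something stated in this thread; verify them against the actual files in the announcement feed before relying on this.

  #!/usr/bin/env python3
  # Sketch only: assumes CVE 5.x-style JSON; check field names against the real feed.
  import json
  import platform
  import sys

  def parse_version(s):
      """Turn '6.6.13-arch1' into (6, 6, 13); stop at the first non-numeric part."""
      parts = []
      for tok in s.split("."):
          num = ""
          for ch in tok:
              if ch.isdigit():
                  num += ch
              else:
                  break
          if not num:
              break
          parts.append(int(num))
      return tuple(parts)

  def kernel_is_affected(record, kernel):
      """Return True if any dotted-version 'affected' range covers the given kernel."""
      kv = parse_version(kernel)
      cna = record.get("containers", {}).get("cna", {})
      for product in cna.get("affected", []):
          for v in product.get("versions", []):
              if v.get("status") != "affected":
                  continue
              # Skip git-commit ranges; only compare dotted version numbers here.
              if v.get("versionType") not in ("semver", "custom", None):
                  continue
              start = parse_version(v.get("version", ""))
              if not start:
                  continue
              upper = v.get("lessThan") or v.get("lessThanOrEqual")
              if upper:
                  if start <= kv < parse_version(upper):
                      return True
              elif kv == start:
                  return True
      return False

  if __name__ == "__main__":
      # Usage: check_cve.py CVE-XXXX-YYYYY.json [kernel-version]
      if len(sys.argv) < 2:
          sys.exit("usage: check_cve.py <cve-record.json> [kernel-version]")
      kernel = sys.argv[2] if len(sys.argv) > 2 else platform.release()
      with open(sys.argv[1]) as f:
          record = json.load(f)
      cve_id = record.get("cveMetadata", {}).get("cveId", sys.argv[1])
      verdict = "affected" if kernel_is_affected(record, kernel) else "not listed as affected"
      print(f"{cve_id}: kernel {kernel} is {verdict}")

A real consumer would also want to handle the git-hash ranges and cross-check against your distro’s backports, but the point stands: this is a structured feed you can diff against your own inventory, not a number to compare across projects.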

replies(3): >>46217577 >>46217767 >>46219692
1. im3w1l No.46217577
Well, consider this: two projects with the same number of actual security issues. One project is willing to say “this bug doesn’t affect security” and to take accountability for that statement; the other is not. As a result, the former has a lower count and the latter a higher one. Which is better for a user who values security?

Since the actual number of issues is the same, you might say it doesn’t matter, but I disagree. As a user it is easier to deal with “here are the n issues” than with “here are m things, any n of which are real”.