
1208 points jamesberthoty | 8 comments
1. pragma_x ◴[] No.45267685[source]
So, other packaging ecosystems tend to slow down the rate of change that enters the user's system, partly through the labor of re-packaging other people's software, but also as a deliberate policy. For instance: Ubuntu or Red Hat.

Is anyone doing this in a "security as a service" fashion for JavaScript packages? I imagine a kind of package escrow/repository that only serves known secure packages, and actively removes known vulnerable ones.

replies(2): >>45267827 #>>45273776 #
2. kilobaud ◴[] No.45267827[source]
I've worked in companies that do this internally, e.g., managed pull-through caches implemented via tools like Artifactory, or home-grown "trusted supply chain" automation, i.e., policy enforcement during CI/CD prior to actually consuming a third-party dependency.
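
A rough sketch of the CI/CD policy-gate flavor, checking a lockfile against an internal allowlist before anything gets consumed (the approved-packages.json file and its shape are invented for illustration):

    // policy-gate.ts -- CI step: fail the build if the lockfile pulls in
    // anything not on an internal allowlist. Sketch only: the
    // approved-packages.json file and its shape are made up here.
    import { readFileSync } from "node:fs";

    // e.g. { "lodash": ["4.17.21"], "express": ["4.19.2"] }
    const approved: Record<string, string[]> =
      JSON.parse(readFileSync("approved-packages.json", "utf8"));
    const lock = JSON.parse(readFileSync("package-lock.json", "utf8"));

    const violations: string[] = [];
    // lockfile v2/v3 keys entries by path, e.g. "node_modules/lodash"
    for (const [path, entry] of Object.entries<{ version: string }>(lock.packages ?? {})) {
      if (!path.includes("node_modules/")) continue; // skip the root project entry
      const name = path.split("node_modules/").pop()!; // handles nested and scoped paths
      if (!approved[name]?.includes(entry.version)) {
        violations.push(`${name}@${entry.version}`);
      }
    }

    if (violations.length > 0) {
      console.error("Unapproved dependencies:\n  " + violations.join("\n  "));
      process.exit(1);
    }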

But what you describe is an interesting idea I hadn't encountered before! I assume such a thing would have lower adoption within a relatively fast-moving ecosystem like Node.js though.

The closest thing I can think of (and this isn't strictly what you described) is reliance on Dependabot, Snyk, CodeQL, etc., which, if anything, probably contributes to change-management fatigue that erodes careful review.

replies(3): >>45268007 #>>45270099 #>>45278533 #
3. kjok ◴[] No.45268007[source]
> managed pull-through caches implemented via tools like Artifactory

This is why package malware makes the news while enterprises that mirror package registries go largely unaffected. Building a mirroring solution can be pricey, though, mainly due to high egress bandwidth costs from cloud providers.

4. tom1337 ◴[] No.45270099[source]
How does a pull-through cache prevent this issue? Wouldn’t it also just pull the infected version from the upstream registry?
replies(1): >>45278395 #
5. arccy ◴[] No.45273776[source]
Google has Assured Open Source for Python / Java https://cloud.google.com/security/products/assured-open-sour...

Some other vendors do AI scanning.

I doubt anyone would want to touch js packages with manual review.

replies(1): >>45278369 #
6. pragma_x ◴[] No.45278369[source]
It would take labor, that's for sure. Everything JS is just too massive a landscape to cover with manual review; automation is the way to go here.

I think the bare minimum is heavy use of auditjs (or Snyk, or anything else that works this way), plus maybe a mandatory waiting period (2-4 weeks?) before allowing new packages in. That should ward off the brunt of package churn and give auditjs enough time to catch up to new package vulnerabilities. The key is to not wait too long, so folks can still address CVEs in their software, but also not sit 100% at the bleeding edge.
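
Roughly the kind of check I have in mind, as a sketch (the 21-day threshold and example package are placeholders; assumes Node 18+ for global fetch and the public npm registry's per-version publish timestamps):

    // quarantine-check.ts -- refuse dependencies published too recently.
    const QUARANTINE_DAYS = 21; // somewhere in the suggested 2-4 week window

    async function publishedAt(pkg: string, version: string): Promise<Date> {
      const res = await fetch(`https://registry.npmjs.org/${pkg}`);
      if (!res.ok) throw new Error(`registry lookup failed for ${pkg}`);
      const meta = await res.json();
      return new Date(meta.time[version]); // "time" maps versions to publish dates
    }

    async function check(pkg: string, version: string): Promise<void> {
      const published = await publishedAt(pkg, version);
      const ageDays = (Date.now() - published.getTime()) / 86_400_000;
      if (ageDays < QUARANTINE_DAYS) {
        throw new Error(`${pkg}@${version} is ${ageDays.toFixed(1)} days old; still in quarantine`);
      }
      console.log(`${pkg}@${version} ok (${Math.floor(ageDays)} days old)`);
    }

    // Example: fail the build if a lockfile entry is too fresh.
    check("left-pad", "1.3.0").catch((e) => { console.error(e.message); process.exit(1); });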

7. pragma_x ◴[] No.45278395{3}[source]
I think it's implied that packages can be blocked and/or evicted from said cache administratively. This deliberately breaks builds, and forces engineers to upgrade/downgrade away from bad packages as needed.
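
As a toy sketch of what that looks like in front of the real registry (the hard-coded blocklist entry is just an example; a real cache would also filter version metadata and store tarballs locally):

    // blocking-proxy.ts -- toy pull-through proxy that refuses evicted packages.
    import http from "node:http";

    const UPSTREAM = "https://registry.npmjs.org";
    const blocked = new Set(["event-stream@3.3.6"]); // administratively evicted

    http.createServer(async (req, res) => {
      // Tarball requests look like /<pkg>/-/<pkg>-<version>.tgz
      const m = req.url?.match(/^\/(.+)\/-\/.+-(\d+\.\d+\.\d+[^/]*)\.tgz$/);
      if (m && blocked.has(`${decodeURIComponent(m[1])}@${m[2]}`)) {
        res.writeHead(403).end("blocked by policy"); // deliberately breaks the build
        return;
      }
      const upstream = await fetch(UPSTREAM + req.url); // otherwise pass through
      res.writeHead(upstream.status, {
        "content-type": upstream.headers.get("content-type") ?? "application/octet-stream",
      });
      res.end(Buffer.from(await upstream.arrayBuffer()));
    }).listen(4873, () => console.log("proxy listening on :4873"));

Point npm at it (npm config set registry http://localhost:4873) and the eviction bites at install time instead of in production.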
8. pragma_x ◴[] No.45278533[source]
Exactly. Everyone is doing this, some well, some poorly. Consider Sonatype Nexus and its "repository firewall" product. Their business model _depends_ on everyone not cooperating, so there are likely a ton of folks who would love to pay less to get the same results.

> The closest thing I can think of (and this isn't strictly what you described) is reliance on Dependabot, Snyk, CodeQL, etc., which, if anything, probably contributes to change-management fatigue that erodes careful review.

It's not glamorous work, that's for sure. And yes, it would have to rely heavily on automated scanning to keep up with the absolutely monstrous scale at which npmjs.org operates. Such a team would be the Internet's DevOps in this one specific way, with all the slog and grind that comes with that. But not all heroes wear capes.