pragma_x No.45267685
So, other packaging environments tend to slow down the rate of change that enters the user's system, partly through the labor of re-packaging other people's software, but also as a deliberate policy. Ubuntu and Red Hat, for instance.

Is anyone doing this in a "security as a service" fashion for JavaScript packages? I imagine a kind of package escrow/repository that only serves known-secure packages and actively removes known-vulnerable ones.
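
(To make the idea concrete: below is a minimal sketch of such an escrow in TypeScript for Node 18+. It proxies the public npm registry but serves a tarball only if that exact package version is on a vetted allowlist. The allowlist contents, URL pattern, and port are illustrative assumptions, not any real service.)

    // Sketch: an "escrow" registry proxy that forwards requests to the public
    // npm registry, but serves tarballs only for vetted package versions.
    import * as http from "node:http";

    // Hypothetical vetted set: package name -> allowed versions.
    const vetted: Record<string, Set<string>> = {
      "left-pad": new Set(["1.3.0"]),
    };

    const UPSTREAM = "https://registry.npmjs.org";

    http.createServer(async (req, res) => {
      const url = req.url ?? "/";
      // npm tarball requests look like /<name>/-/<name>-<version>.tgz
      const match = url.match(/^\/(.+?)\/-\/.+?-(\d[^/]*)\.tgz$/);
      if (match) {
        const [, name, version] = match;
        if (!vetted[name]?.has(version)) {
          res.writeHead(403).end("package version not vetted");
          return;
        }
      }
      // Metadata requests and vetted tarballs pass through to the registry.
      const upstream = await fetch(UPSTREAM + url);
      res.writeHead(upstream.status);
      res.end(Buffer.from(await upstream.arrayBuffer()));
    }).listen(4873); // port borrowed from Verdaccio's default, purely illustrative

(Point npm at it with "npm config set registry http://localhost:4873" and anything off the list fails to install.)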

replies(2): >>45267827 >>45273776
kilobaud No.45267827
I've worked at companies that do this internally, e.g. managed pull-through caches implemented with tools like Artifactory, or home-grown "trusted supply chain" automation, i.e. policy enforcement during CI/CD before actually consuming a third-party dependency.
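
(As a rough sketch of that kind of CI/CD gate, in TypeScript: scan an npm v7+ package-lock.json and fail the build if any resolved dependency is on an internal denylist. The two denylist entries are real past incidents, but the shape of the list and the file locations are assumptions for the example.)

    // Sketch of a pre-install policy gate for CI: fail the build if the
    // lockfile resolves any dependency on an internal denylist.
    import { readFileSync } from "node:fs";

    // Hypothetical internal denylist ("name@version" flagged by security).
    const denied = new Set(["event-stream@3.3.6", "ua-parser-js@0.7.29"]);

    const lock = JSON.parse(readFileSync("package-lock.json", "utf8"));

    // npm v7+ lockfiles enumerate every installed package under "packages",
    // keyed by path, e.g. "node_modules/event-stream".
    const violations: string[] = [];
    for (const [path, meta] of Object.entries<any>(lock.packages ?? {})) {
      if (!path) continue; // the "" key is the root project itself
      const name = path.split("node_modules/").pop();
      if (denied.has(`${name}@${meta.version}`)) {
        violations.push(`${name}@${meta.version}`);
      }
    }

    if (violations.length > 0) {
      console.error("Blocked dependencies:", violations.join(", "));
      process.exit(1); // deliberately break the build before install runs
    }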

But what you describe is an interesting idea I hadn't encountered before! I assume such a thing would see lower adoption in a relatively fast-moving ecosystem like Node.js, though.

The closest thing I can think of (and this isn't strictly what you described) is reliance on Dependabot, Snyk, CodeQL, etc., which, if anything, probably contributes to change-management fatigue that erodes careful review.

replies(3): >>45268007 >>45270099 >>45278533
tom1337 No.45270099
How does a pull-through cache prevent this issue? Wouldn’t it also just pull the infected version from the upstream registry?
replies(1): >>45278395
pragma_x No.45278395
I think it's implied that packages can be blocked and/or evicted from said cache administratively. This deliberately breaks builds and forces engineers to upgrade or downgrade away from bad packages as needed.
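
(A minimal sketch of that administrative side, assuming a hypothetical on-disk cache layout and blocklist file rather than any specific product's API:)

    // Sketch: evict a known-bad version from a pull-through cache and record
    // a block so the proxy never re-fetches it from upstream.
    import { existsSync, readFileSync, rmSync, writeFileSync } from "node:fs";

    const CACHE_DIR = "/var/cache/npm-proxy";        // hypothetical location
    const BLOCKLIST = `${CACHE_DIR}/blocklist.json`; // hypothetical format

    function evict(name: string, version: string): void {
      // 1. Delete the cached tarball so it is no longer served.
      rmSync(`${CACHE_DIR}/${name}/${name}-${version}.tgz`, { force: true });

      // 2. Persist the block so the cache refuses to re-fetch it upstream.
      const list: string[] = existsSync(BLOCKLIST)
        ? JSON.parse(readFileSync(BLOCKLIST, "utf8"))
        : [];
      list.push(`${name}@${version}`);
      writeFileSync(BLOCKLIST, JSON.stringify([...new Set(list)], null, 2));
    }

    // Any build that still resolves the evicted version now fails at install
    // time, forcing teams onto a safe release.
    evict("some-compromised-pkg", "9.9.9");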