
1208 points jamesberthoty | 1 comments | | HN request time: 0.96s | source
foxfired ◴[] No.45268501[source]
My problem is that, in the JS ecosystem, every single time you go through a CI/CD pipeline, you redownload everything. We should only download on the first run, with the versions that are known to work. When we manually update a version, only that package should be downloaded again.

I just checked one of our repos right now and it has 981 packages. It's not even realistic to vet the packages or to know which one is compromised. 99% of them are dependencies of dependencies. Where do we even start?
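To see how lopsided that direct-vs-transitive ratio is, you can read it straight out of the lock file: in an npm v3 `package-lock.json`, the `packages[""]` entry lists your direct dependencies and everything else is transitive. A minimal sketch (the `classifyDeps` helper and the inline lock fixture are hypothetical, standing in for a real lock file):

```javascript
// Classify entries in an npm lockfileVersion 3 "packages" map as
// direct vs transitive dependencies of the root project.
function classifyDeps(lock) {
  const direct = new Set(Object.keys(lock.packages[""].dependencies ?? {}));
  let directCount = 0, transitiveCount = 0;
  for (const path of Object.keys(lock.packages)) {
    if (path === "") continue; // the root project itself
    const name = path.replace(/^.*node_modules\//, "");
    // Top-level node_modules entry for a declared dep = direct.
    if (direct.has(name) && path === `node_modules/${name}`) directCount++;
    else transitiveCount++;
  }
  return { directCount, transitiveCount };
}

// Tiny inline fixture standing in for a real package-lock.json.
const lock = {
  lockfileVersion: 3,
  packages: {
    "": { dependencies: { express: "^4.18.0" } },
    "node_modules/express": { version: "4.18.2" },
    "node_modules/body-parser": { version: "1.20.1" },
    "node_modules/qs": { version: "6.11.0" },
  },
};
console.log(classifyDeps(lock)); // { directCount: 1, transitiveCount: 2 }
```

Run the same idea against a real repo's lock file and you'll typically see the 99% transitive split the parent comment describes.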

replies(3): >>45268537 #>>45268574 #>>45268604 #
1. homebrewer ◴[] No.45268604[source]
Generate builder images and reuse them. It shaves minutes off each CI job on the projects I'm working on, and it's non-optional for us because we're far from all major datacenters.
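A minimal sketch of such a builder image (base image and paths are assumptions, not a prescribed setup): copying only the manifests before installing means Docker's layer cache keeps the installed dependencies until the lock file actually changes.

```dockerfile
# Hypothetical builder image: bake dependencies in once, reuse across CI jobs.
FROM node:20-alpine
WORKDIR /app
# Copy only the manifests so this layer stays cached until they change.
COPY package.json package-lock.json ./
# `npm ci` installs exactly the locked versions, nothing newer.
RUN npm ci
```

CI jobs then start `FROM` (or run inside) this image and skip the download entirely unless the lock file changed.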

Or set up a caching proxy, whichever is easier for your org. I've had good experience with Nexus previously; it's pretty heavy but very configurable, and it can delay the availability of new versions and check public vulnerability databases for you.
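Pointing npm at a proxy like that is a one-line config change; a sketch, with the registry URL being a placeholder for your own proxy instance:

```ini
; .npmrc — route all installs through the org's caching proxy
; (nexus.example.com and the repository path are hypothetical)
registry=https://nexus.example.com/repository/npm-proxy/
```

The proxy fetches each package from the public registry once and serves every subsequent CI run from its local cache.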

It's purely an efficiency problem though, nothing to do with security, which is covered by lock files.