1208 points jamesberthoty | 5 comments | | HN request time: 1.108s | source
foxfired ◴[] No.45268501[source]
My problem is that, in the JS ecosystem, every single time you go through a CI/CD pipeline, you redownload everything. We should only download on the first run, with the versions that are known to work. When we manually update a version, then only that package should be downloaded again.

I just checked one of our repos right now and it has 981 packages. It's not even realistic to vet the packages or to know which ones are compromised. 99% of them are dependencies of dependencies. Where do we even get started?

replies(3): >>45268537 #>>45268574 #>>45268604 #
1. brw ◴[] No.45268574[source]
Isn't that what lockfiles are for? By default `npm i` downloads exactly the versions specified in your lockfile, and only resolves the latest versions matching the ranges specified in package.json if no lockfile exists. But CI/CD pipelines should definitely be using `npm ci` instead, which will only install packages from a lockfile and throws an error if it doesn't exist.
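A minimal sketch of what that looks like in a pipeline, assuming GitHub Actions as the CI system (the job layout and Node version are illustrative, not from the comment above; `cache: npm` in setup-node keeps npm's download cache between runs, so only changed packages are fetched):

```yaml
# Illustrative CI job: install strictly from the lockfile and reuse
# npm's cache across runs (keyed on package-lock.json).
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
          cache: npm   # caches ~/.npm between pipeline runs
      - run: npm ci    # errors if package-lock.json is missing or out of sync
```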
replies(1): >>45268667 #
2. touristtam ◴[] No.45268667[source]
That and pin that damn version!
replies(1): >>45269072 #
3. AndreasHae ◴[] No.45269072[source]
It’s still ridiculous to me that version pinning isn’t the default for npm.

The first thing I do in all of my projects is add a .npmrc with save-exact=true
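For reference, a minimal sketch of that setup (the package name and version below are illustrative):

```ini
# .npmrc — make `npm install <pkg>` record exact versions
save-exact=true
```

With this set, `npm install lodash` would write `"lodash": "4.17.21"` into package.json instead of the default caret range `"lodash": "^4.17.21"`.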

replies(1): >>45270773 #
4. silverwind ◴[] No.45270773{3}[source]
save-exact is mostly useless against such attacks because it only works on direct dependencies.
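To illustrate the gap: save-exact only changes what npm writes into your own package.json; transitive dependencies are still resolved from the ranges their maintainers declared. npm's `overrides` field is one way to force a specific version of a nested dependency (package names and versions below are hypothetical):

```json
{
  "dependencies": {
    "some-lib": "2.1.0"
  },
  "overrides": {
    "some-transitive-dep": "1.4.2"
  }
}
```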
replies(1): >>45277561 #
5. electrotype ◴[] No.45277561{4}[source]
Why, though?