
1208 points jamesberthoty | 3 comments
kelnos ◴[] No.45266878[source]
As a user of npm-hosted packages in my own projects, I'm not really sure what to do to protect myself. It's not feasible for me to audit every single one of my dependencies, and every one of my dependencies' dependencies, and so on. Even if I had the time to do that, I'm not a typescript/javascript expert, and I'm certain there are a lot of obfuscated things that an attacker could do that I wouldn't realize was embedded malware.

One thing I was thinking of was sort of a "delayed" mode for updating my own dependencies. The idea is that when I want to update my dependencies, instead of updating to the absolute latest version available of everything, it updates to the newest version that was released at least some configurable amount of time ago. As a maintainer, I could decide that a package that's been out in the wild for at least 6 weeks is less likely to have unnoticed malware in it than one that was released just yesterday.

Obviously this is not a perfect fix, as there's no guarantee that the delay time I specify is enough for any particular package. And I'd want the tool to present me with options sometimes: e.g. if my current version of a dep has a vulnerability, and the fix for it came out a few days ago, I might choose to update to it (better eliminate the known vulnerability than refuse to update for fear of an unknown one) rather than wait until it's older than my threshold.
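The delay policy described above can be sketched against npm's registry metadata, where each package document carries a `time` map of version → ISO-8601 publish timestamp. This is a minimal sketch, not a tool that exists; the function name and the 6-week default are mine:

```python
from datetime import datetime, timedelta, timezone

def newest_aged_version(version_times: dict, min_age_days: int = 42):
    """Pick the most recently published version that is at least
    `min_age_days` old, given npm's `time` map of
    version -> ISO-8601 publish timestamp.
    Returns None if no version is old enough."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=min_age_days)
    aged = [
        (datetime.fromisoformat(ts.replace("Z", "+00:00")), version)
        for version, ts in version_times.items()
        # npm mixes bookkeeping keys into the `time` map; skip them
        if version not in ("created", "modified")
    ]
    aged = [(published, v) for published, v in aged if published <= cutoff]
    return max(aged)[1] if aged else None
```

For the vulnerability-fix exception described above, a real tool would consult an advisory feed before applying the age filter. Recent pnpm versions ship a `minimumReleaseAge` setting along these lines.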

replies(35): >>45266995 #>>45267024 #>>45267360 #>>45267489 #>>45267600 #>>45267697 #>>45267722 #>>45267967 #>>45268218 #>>45268503 #>>45268654 #>>45268764 #>>45269143 #>>45269397 #>>45269398 #>>45269524 #>>45269799 #>>45269945 #>>45270082 #>>45270083 #>>45270420 #>>45270708 #>>45270917 #>>45270938 #>>45272063 #>>45272548 #>>45273074 #>>45273291 #>>45273321 #>>45273387 #>>45273513 #>>45273935 #>>45274324 #>>45275452 #>>45277692 #
wvh ◴[] No.45273513[source]
As a security guy, for years you get laughed out of the room for suggesting that devs limit their dependencies and not download half of the internet while building. You are an obstruction to making profit. And obviously reading the code does very little, since modern (and especially JavaScript) code just glues together frameworks and libraries, and there's no way a single human being is going to read a couple million lines of code.

There are no real solutions to the problem, except for reducing exposure somewhat by limiting yourself to a mostly frozen subset of packages that are hopefully vetted more stringently by more people.

replies(9): >>45273591 #>>45274145 #>>45274168 #>>45274297 #>>45275495 #>>45275734 #>>45276496 #>>45277631 #>>45279275 #
999900000999 ◴[] No.45274297[source]
The "solution" would be using a language with a strong standard library and then having a trusted 3rd party manually audit any approved packages.

THEN use Artifactory on top of that.

That's boring and slow though. Whatever, I want my packages and I want them now. Part of the issue is that the whole industry is built upon goodwill and hope.

Some 19 year old hacked together a new front end framework last week, better use it in prod because why not.

Occasionally I want to turn off my brain and just buy some shoes. The Timberland website made that nearly impossible last week. When I gave up on logging in for free shipping and just paid full price, I got an email a few days later saying they had run out of shoes.

Alright. I guess Amazon is dominant for a reason.

replies(5): >>45274427 #>>45274782 #>>45274799 #>>45275228 #>>45279075 #
silverliver ◴[] No.45274427[source]
This is the right answer. I'm willing to stick my neck out and assert that languages with a "minimal" standard library are defective by design. The argument that standard-library APIs get stuck is moot with approaches like Rust's editions or "strict mode".

Standard libraries should include everything needed to interact with modern systems. This means HTTP parsing, HTTP requests, and JSON parsing. Some languages are excellent here (like Python), some are halfway there (like Go), and some are just broken (like Rust).
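As a concrete illustration of the "batteries included" position, the common fetch-and-parse case needs zero third-party packages in Python; everything below is stdlib (the URL is a placeholder, not a real endpoint):

```python
import json
from urllib.request import urlopen

def fetch_json(url: str) -> dict:
    """HTTP GET a URL and decode the JSON body, stdlib only."""
    with urlopen(url) as resp:
        return json.load(resp)

# The decoding half works on any JSON text, again with no dependencies:
payload = json.loads('{"name": "left-pad", "version": "1.3.0"}')
```

The same task in the ecosystems being criticized typically pulls in an HTTP client and a JSON library, each with its own transitive dependency tree, which is exactly the attack surface under discussion.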

External libraries are for niche or specialized functionality, not for functionality used by most modern software. To bury your head in the sand and insist otherwise is madness, and leads to ridiculous outcomes like this.

replies(7): >>45275250 #>>45275311 #>>45275318 #>>45275441 #>>45275539 #>>45276844 #>>45277579 #
999900000999 ◴[] No.45276844[source]
It's not an easy problem to solve.

Doing it the right way would create friction: developers might need to actually understand what the code is doing rather than pulling in random libraries.

Try explaining to your CTO that development will slow down to verify the entire dependency chain.

I'm thinking more of C# or Java. If Microsoft or Oracle provides a library, you can hope it's safe.

You *could* have a development ecosystem called Safe C# which only comes with vetted libraries and doesn't allow anything else.

I'm sure other solutions already exist though.

replies(2): >>45277609 #>>45278764 #
1. pjmlp ◴[] No.45277609{3}[source]
Why?

This is standard practice in most places I have worked: CI/CD is only allowed to use internal repos, and libraries are only added after clearance.

replies(1): >>45278265 #
2. brazzy ◴[] No.45278265[source]
Except that "clearance" invariably consists of bureaucratic rubber stamping and actually decreases security by making it harder and slower to fix newly discovered vulnerabilities.
replies(1): >>45278308 #
3. pjmlp ◴[] No.45278308[source]
Depends on the skills of the respective DevOps security team.

There are also tools that break CI/CD based on CVE reports from existing dependencies.
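Such a CI gate can be as simple as parsing `npm audit --json` and failing the build above a severity threshold. A sketch under the assumption that the report carries npm's `metadata.vulnerabilities` summary counts (the gate function and threshold are mine):

```python
import json
import subprocess
import sys

FAIL_LEVELS = ("high", "critical")

def audit_gate(report: dict) -> bool:
    """Return True if the build should fail, based on the
    severity counts in an `npm audit --json` report."""
    counts = report.get("metadata", {}).get("vulnerabilities", {})
    return any(counts.get(level, 0) > 0 for level in FAIL_LEVELS)

if __name__ == "__main__":
    out = subprocess.run(["npm", "audit", "--json"],
                         capture_output=True, text=True)
    if audit_gate(json.loads(out.stdout or "{}")):
        sys.exit("blocked: high/critical CVEs in dependency tree")
```

npm can do this natively with `npm audit --audit-level=high`; the sketch just makes the mechanics visible, and the same shape works for any scanner that emits a machine-readable report.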