
A critique of package managers

(www.gingerbill.org)
109 points gingerBill | 2 comments
hliyan No.45168463
I don't know what the solution to this problem is, but I do remember a time (around 20 years ago) when this wasn't a big problem. I was working on a fairly large C++ system (each module between 50k and 100k LOC). The process for using libraries was:

1) Have problem that feels too complicated to hand-code.

2) Go on the Internet/forums and find a library. The library is usually a small, flat collection of atomic functions.

3) A senior engineer vets the library and approves it for use.

4) Download the stable version: the header file and the lib file for our platform (on rare occasions, build it from source).

5) Place the .h file in the header path, and the lib file in the lib path; update the Makefile.

6) #include the header and call functions.

7) Update deployment scripts (bash) to scp the lib file to the target environment, or in some cases use static linking.

8) Subscribe to a mailing list and very occasionally receive news of a breaking change that requires a rebuild.

This may sound like a lot of work, but somehow, it was a lot less stressful than dealing with NPM and node_modules today.
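
For concreteness, here is a sketch of what steps 5 and 6 amounted to, using zlib as a stand-in for the kind of small, flat library we'd pull in (the actual libraries and paths varied, but the shape was the same):

    // Step 5 was typically two Makefile lines:
    //   CXXFLAGS += -I/usr/local/include
    //   LDFLAGS  += -L/usr/local/lib -lz
    // Step 6 is just an #include and a function call.
    #include <cstdio>
    #include <cstring>
    #include <zlib.h>

    int main() {
        const char* msg = "hello";
        uLong crc = crc32(0L, Z_NULL, 0);  // get the CRC seed value
        crc = crc32(crc, reinterpret_cast<const Bytef*>(msg),
                    static_cast<uInt>(std::strlen(msg)));
        std::printf("zlib %s, crc32 = %lx\n", zlibVersion(), crc);
        return 0;
    }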

replies(1): >>45175540 #
saulpw No.45175540
I think the main thing that makes this workable is "The library is usually a small, flat collection of atomic functions."

I find that the real hell is transitive dependencies: you as a developer can reasonably vet a single layer of 10-30 standalone libraries. But if those libraries depend on other libraries, and so on, it balloons into hundreds or thousands of dependencies, and then you're sunk.

For what it's worth, I don't think much of this is essential complexity. Often a library is complicated because it supports 10 different ways of being used, but any given user only needs 1 of those ways. If everyone is only using 10% of thousands of transitive dependencies, the overall result is incredibly complicated, yet the same functionality could have been built with 10-100% more short-term effort.

Sure, "it took twice as long to develop, but at least we don't have 10x the dependencies" is a hard sell to management (and often to ourselves), but that's because we usually choose to ignore the costs of depending on software we don't understand and don't control. We think we're cleverly avoiding having to maintain and secure the libraries we outsourced, but most open-source developers aren't doing a great job of that anyway.

Often it really is easier to develop something from scratch, rather than learn and integrate a library. Not always though, of course.
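
A toy illustration of that trade-off (my example, not a real dependency): if trimming strings is all you need, a dozen hand-rolled lines beat adopting a whole string-utilities library and its dependency tree.

    // Hand-rolled trim: the "10% we actually use" of a hypothetical
    // string-utilities library, written from scratch.
    #include <cstdio>
    #include <string>

    std::string trim(const std::string& s) {
        const char* ws = " \t\r\n";
        auto begin = s.find_first_not_of(ws);
        if (begin == std::string::npos) return "";  // all whitespace
        auto end = s.find_last_not_of(ws);
        return s.substr(begin, end - begin + 1);
    }

    int main() {
        std::printf("[%s]\n", trim("  padded  ").c_str());  // prints [padded]
        return 0;
    }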

replies(1): >>45178166 #
1718627440 No.45178166
In C and C++ you don't need the transitive dependencies for compilation; you only need the headers of the direct dependencies. As for linking, they are only needed when linking dynamically, which was much less prevalent 20 years ago.
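
A hypothetical sketch (libfoo and libbar are made-up names): libfoo internally uses libbar, but your translation unit sees only foo.h, so compiling it needs nothing from libbar at all.

    // Hypothetical: libfoo uses libbar internally, but this file never
    // mentions bar, so compiling needs only the direct dependency's header:
    //
    //   c++ -c main.cpp -I/opt/foo/include
    //
    // With a self-contained static libfoo.a, the link line stays flat too:
    //
    //   c++ main.o -L/opt/foo/lib -lfoo -o app
    //
    // whereas with dynamic linking, libbar.so must also be present
    // wherever the program runs.
    #include <foo/foo.h>  // direct dependency only; bar never appears here

    int main() {
        return foo_frobnicate(7);  // foo may call into libbar internally
    }
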
replies(1): >>45184679 #
saulpw No.45184679
It's not about compilation; it's about interactions and leaky abstractions.
replies(1): >>45187316 #
1718627440 No.45187316
This means the problem is more the quality of the library itself than the package manager/dependency resolver/build system. You can have leaky abstractions just as easily when all you add is a single prebuilt static library with nothing else going on.