
272 points by abdisalan | 1 comment
RadiozRadioz ◴[] No.42183900[source]
I call this phenomenon "node rot". Judging by the comments here, it seems like a universal experience.

My favorite is the way that Python projects rot. Not only does Python's setuptools give you all the fun that node-gyp does, but the common practice of versioning packages as packagename>=1.25.5 means you're almost guaranteed breakages, since pip installs newer versions of packages than the ones the project was built with.
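
For instance, exact pins survive far better than open-ended ranges (package names and versions here are made up for illustration):

    # requirements.txt with open-ended constraints: pip resolves to whatever
    # is newest at install time, so the environment drifts from the original.
    requests>=2.25.0
    numpy>=1.19.0

    # The same file with exact pins (e.g. the output of `pip freeze`):
    # the same versions install years later, as long as the wheels still exist.
    requests==2.28.2
    numpy==1.24.4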

replies(4): >>42184392 #>>42185344 #>>42191174 #>>42201978 #
FireInsight ◴[] No.42201978[source]
Oh, one of the worst forms of torture is definitely trying to get a random Python AI project from GitHub running locally. There's almost always a conflict between the versions of Python, CUDA, PyTorch, and a hodgepodge of pip and conda packages. Publishing a requirements.txt is the bare minimum most projects manage, but that's usually not enough to reconstruct the environment. The ecosystem should just standardize on declaratively prebuilt container environments or something.
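
Something like a fully pinned Dockerfile is one way to capture the whole stack declaratively; the base image tag and versions below are illustrative assumptions, not a tested recipe:

    # Pin the CUDA/cuDNN toolkit and OS via the base image tag (illustrative)
    FROM nvidia/cuda:11.8.0-cudnn8-runtime-ubuntu22.04

    # Pin the Python toolchain
    RUN apt-get update && apt-get install -y python3.10 python3-pip

    # Pin PyTorch against the matching CUDA build, then the rest of the deps
    RUN pip3 install torch==2.0.1 --index-url https://download.pytorch.org/whl/cu118
    COPY requirements.txt .
    RUN pip3 install -r requirements.txt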

Granted, my experience is mostly from the GPT-2 era, so I'm not sure if it's still this painful.

replies(1): >>42213882 #
phatskat ◴[] No.42213882[source]
Don’t know if this would help your case or not, but jart’s llamafile seems like it would be useful

[6] https://github.com/Mozilla-Ocho/llamafile
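
Going by the project README, the whole thing ships as a single self-contained executable, so usage is roughly this (the file name is just their example model):

    # Download a .llamafile, mark it executable, and run it;
    # the weights, llama.cpp, and a local web UI are all bundled inside.
    chmod +x llava-v1.5-7b-q4.llamafile
    ./llava-v1.5-7b-q4.llamafile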