It's the only argument I can think of; something like Go would be goated for this use case, in principle.
Re-running `cargo install <crate>` will do that. Or install `cargo-update`, then you can bulk update everything.
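For example, the flow looks roughly like this (cargo-update provides the `cargo install-update` subcommand):

```sh
# one-time: install the updater extension
cargo install cargo-update
# then bulk-update every cargo-installed binary
cargo install-update -a
```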
And it works hella better than using pip in a global python install (you really want pipx/uvx if you're installing python utilities globally).
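For reference, a minimal pipx flow (the tool name is just a placeholder):

```sh
# each tool gets its own isolated venv
pipx install some-python-tool
# update everything pipx manages in one go
pipx upgrade-all
```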
IIRC you can install Go stuff with `go install`, dunno if you can update via that tho.
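FWIW, re-running `go install` with `@latest` rebuilds the tool and overwrites the old binary in your GOBIN, so it doubles as the update command (module path below is a placeholder):

```sh
go install example.com/cmd/sometool@latest
```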
A single, pre-compiled binary is convenient for the user's first install only.
It's not.
It's convenient for CI, for deployment, for packaging, for running multiple versions. It's extremely simple to update (just replace the binary with another one).
Now, e.g. "just replacing one file with another" may not have convenience commands like `npm update`, but it's not hard.
My point is that a pre-compiled binary is far more convenient for *everyone involved in the delivery pipeline*, including the end user. Especially for delivering updates.
As someone who's packaged JavaScript (Node), Ruby, Go, and Rust tools as .debs, snaps, and rpms: packaging against a dynamic runtime (node, ruby, rvm, etc.) is a giant PITA that will break on a significant number of users' machines, and will probably break on everyone's machine at some point. Whereas packaging that binary is as simple as it can get: most such packages need only one dependency that everyone and his dog already has: libc.
(Aside from the fact that allowing "use pip" completely defeats the purpose of all these other mechanisms, so it's a poster-child example of security theater.)
Just `wget -O ~/.local/bin/gemini-cli https://ci.example.com/assets/latest/gemini-cli` (or the curl version thereof). It can pick the file off GitHub, some CI's assets, a package repo, a simple FTP server, an HTTP file server, over SSH, from a local cache, etc. It's so simple that one doesn't need a package manager. So there commonly is no package manager.
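Roughly the curl version, plus the one extra step neither tool does for you (same placeholder URL as above):

```sh
# fetch the binary, following redirects, fail loudly on HTTP errors
curl -fL -o ~/.local/bin/gemini-cli https://ci.example.com/assets/latest/gemini-cli
# mark it executable
chmod +x ~/.local/bin/gemini-cli
```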
Yet in this thread people are complaining that "a single binary" is hard to manage/update/install because there's no package manager to do that with. It's not there because managing/updating/installing is so simple that you don't need a package manager!
You might not know the reason people use package managers. Installing this "simple" way makes it quite difficult to update and remove compared to using package managers. And although those steps are also "simple", it's quite a mess to manage packages manually instead of relying on such battle-tested systems.
People use package managers for the following:
- to manage dependencies
- to update stuff to a specific version or the latest version
- to downgrade stuff
- to install stuff
- to remove stuff
Any of these, except for the dependency management, is a single command, or easy to do manually, with a single compiled binary. They are so simple that they can easily be built into the tool. Or handled by your OS's package manager. Or with a "shell script" that the vendor can provide (instead of, or next to, the precompiled binary).
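As a sketch, assuming a statically linked tool and a placeholder download URL, the whole lifecycle is:

```sh
# install, update, or downgrade: fetch whichever version you want
wget -O ~/.local/bin/tool https://example.com/releases/tool-2.1.0
chmod +x ~/.local/bin/tool
# remove
rm ~/.local/bin/tool
```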
I did not say manually; you inferred that, but I never meant it. On the contrary: because it's so simple, automating it, or having your distro, OS, or package manager do it for you, is trivial. As opposed to that awful "curl example.com/install.sh | sudo sh", or those horrible built-in updaters (that always start nagging when I open the app, the one moment I don't want to be bothered by updates because I need the app right now).
The only reason one would then need a package manager is to manage dependencies. But precompiled binaries like Go's or Rust's are typically statically compiled, so they have no (or at most one) dependency.
Imagine the ease of a single ".tar.gz" or so that includes the correct Python version, all pip packages, all env vars, and config files, and is executable. If you distribute that, what do you still need pip for? If you distribute that, how simple would turning it into a .deb, snap, dmg, flatpak, AppImage, brew package, etc. be? (Answer: a lot easier than doing this for the "directory of .py files". A LOT.)
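To illustrate with a hypothetical bundle (all names invented): unpack it and run it, nothing else to set up.

```sh
tar -xzf mytool-1.2.3.tar.gz    # contains its own Python, deps, and config
./mytool-1.2.3/mytool --help    # launcher sets env vars and execs the bundled interpreter
```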
pip is there so you don't need to do that. In the deployment world, you really want one version per system for everything and to know that everything is in sync. To get that, the solution was a distribution of software and a tool to manage it. We then extended that to programming-language ecosystems, and pip is part of the result.
But for workstations, a lot of people want the latest, so the next solution was to abstract the programming-language ecosystem from the distribution (and you may not have a choice in the case of macOS). So what we get is directory-restricted interactions (go, npm, ...) or shell magic so that the tooling thinks it's the system (virtualenv, ...).
It's a neat trick, but the only reason to do so is if you want to distribute a compiled version of the software to customers. If the user has access to the code, it's better to adapt the software to the system (repositories, Flatpak, ...) or build a system around it (VMs, containers, ...).
The easiest is running "sudo apt update && sudo apt upgrade" and having my whole system updated, instead of writing some script to get it done from some GitHub releases page and hoping that it's not hijacked.
Having a sensible project is what makes it easy down the line (including not depending on GNU libc if not needed, as some people use musl). And I believe it's easy to set up a repository if your code is proprietary (you just need to support the most likely distributions, like Ubuntu, Fedora, SUSE's Tumbleweed, ...).