My good uv experience: I recently tried installing some tensor/CUDA Python packages. Plain pip just failed; uv pip actually returned WHY it failed.
It definitely felt like magic.
I am interested in how they're going to make money eventually, but right now it's working for me.
Does anyone have an idea about how they're going to monetize?
With that said — it’s uv or die for me
Among the many things it’s improved, scripting with Python finally just works, without the pain of some odd env issue.
Before uv I was doing everything in a devcontainer on my Mac since that was easiest, but uv is so fast that I now skip that unless I need some native libraries for Linux.
I wish Python would provide an "official" solution to each problem (like in Rust: there's cargo, end of story), or at least an official document describing the current best practice for doing things.
For the last year or so, I've been trying to provide an alternative guide that stays abreast of the best options and provides simple guides: https://pydevtools.com/.
E.g.
uv init --script foo.py
uv add --script foo.py httpx
cat foo.py
...
dependencies = ['httpx']
...
Then on another machine: uv run foo.py
# creates a virtual env, reads foo.py to see httpx is a dependency, installs in the ephemeral venv then runs the script
The above is from memory, typed on a phone, so there may be minor syntax issues, but the point I tried to make is that we can now roughly emulate the convenience of statically compiled binaries a la Go.
That said, the people left on the CPython team generally have a low regard for bloat-free, correct, and fast solutions, so external solutions are most welcome.
---
time uv
real 0m0.005s
user 0m0.000s
sys 0m0.004s
---
time npm
real 0m0.082s
user 0m0.068s
sys 0m0.020s
---
time pip
real 0m0.320s
user 0m0.179s
sys 0m0.031s
As a rule, you should never meddle with the globally installed Python, because so many packages will look for the system-installed Python and use it; better to let your package manager handle it.
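A quick stdlib way to check which interpreter you're actually on (illustrative, not uv-specific): in a virtual environment, `sys.prefix` points at the venv while `sys.base_prefix` points at the interpreter it was created from.

```python
import sys

# In a venv, sys.prefix differs from sys.base_prefix;
# on the system/global Python the two are equal.
in_venv = sys.prefix != sys.base_prefix
print("in a venv" if in_venv else "on the system/global Python")
print("interpreter:", sys.executable)
```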
They spend a lot of time improving Python itself, and then you have pip, which is just a way to install packages and that's it; it's not a package manager, nor a Python version manager.
From what I can tell, uv doesn’t (unlike poetry) assist with venvs whatsoever.
What is a trivial «poetry run» becomes the same venv-horrors of Python fame when I use uv and «uv run».
Based on that, your comment strikes me as the polar opposite of my experience (which is why I still resort to poetry).
Care to outline how you use uv to solve venv issues, since from what I can tell, uv explicitly doesn’t?
I’m very curious.
Here are a couple of links to discussions about it on HN:
First, you can move that script to a different machine and do `uv run {script}`, no need to recreate a venv or provide install instructions (I believe uv will now even grab an appropriate version of Python if you don't have it?). This comes from PEP 723, and multiple tools support doing this, such as hatch.
Second, when you "add" a requirement instead of "install" a requirement, it is managed with knowledge of all the requirements that were added before. For example, if I `pip install foo` and then `pip install bar`, pip does not consider foo or its dependencies as required when installing bar, so it's possible to break `foo` by installing completely incompatible dependencies. But when you "add foo" and then "add bar" with uv (and other declarative tools, like Poetry), your environment gets updated to take everything into account.
If managing Python dependencies is second nature to you then these might seem like extra concepts to keep in your head, but lots of people do find these useful because they find they can think less about Python dependencies.
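A toy sketch of the difference described above (made-up packages and a deliberately crude compatibility check, not pip's or uv's actual resolver):

```python
# Made-up package requirements for the illustration.
REQUIRES = {
    "foo": {"lib": ">=2"},  # foo needs lib 2 or newer
    "bar": {"lib": "<2"},   # bar needs lib older than 2
}

def compatible(spec_a: str, spec_b: str) -> bool:
    # Crude stand-in for real version-specifier intersection.
    return {spec_a, spec_b} != {">=2", "<2"}

def pip_style_install(env: dict, pkg: str) -> None:
    # "install": only considers the new package's requirements,
    # overwriting whatever was pinned before.
    env.update(REQUIRES[pkg])
    env[pkg] = "installed"

def uv_style_add(declared: set, pkg: str) -> dict:
    # "add": re-resolves against everything declared so far.
    declared.add(pkg)
    specs: dict = {}
    for p in sorted(declared):
        for dep, spec in REQUIRES[p].items():
            if dep in specs and not compatible(specs[dep], spec):
                raise RuntimeError(
                    f"{p} needs {dep}{spec}, conflicting with {dep}{specs[dep]}"
                )
            specs[dep] = spec
    return specs

env: dict = {}
pip_style_install(env, "foo")
pip_style_install(env, "bar")
print("pip-style env:", env)  # lib is now "<2", silently breaking foo

declared: set = set()
uv_style_add(declared, "foo")
caught = False
try:
    uv_style_add(declared, "bar")
except RuntimeError as exc:
    caught = True
    print("uv-style add refused:", exc)
```

The imperative path ends with `lib` pinned to `<2` even though foo still needs `>=2`; the declarative path surfaces the conflict instead of installing it.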
And how does that work on Windows, which to my knowledge doesn’t even support shebangs?
As such they do not currently support C extensions, nor running arbitrary code during the build process. I imagine they will add features slowly over time, but with the continued philosophy of the simple and common cases should be zero configuration.
For Python experts who don't have special needs from a build backend, I would recommend flit_core, the simplest and most stable build backend, or hatchling, very stable and with lots of features. While uv_build is great, it does mean that users building (but not installing) your project need to be able to run native code, rather than pure Python. But this is a pretty small edge case that for most people won't be an issue.
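For reference, opting into flit_core is just the standard PEP 517/518 `[build-system]` table in `pyproject.toml`:

```toml
[build-system]
requires = ["flit_core>=3.2,<4"]
build-backend = "flit_core.buildapi"
```

Swapping backends later is a matter of changing these two lines (plus any backend-specific configuration).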
Before (analogous to go mod init):
python -m venv venv
source venv/bin/activate
python -m pip install -U pip
pip install httpx
pip freeze > requirements.txt
nvim foo.py
# find a way to share foo.py and requirements.txt
On another machine (still the before scenario, this time analogous to maybe go run):
python -m venv venv
source venv/bin/activate
python -m pip install -U pip
pip install -r requirements.txt
python foo.py
In the after scenario: uv run foo.py
That's it. Comparable to ./my-go-binary
I never learned python the way I wanted to because for years I would first look at the excruciating transition from v2 to v3 and just not see a point of entry for a newb like me.
Now the same thing is happening with tooling for v3. pip? pipenv? python pip? python3 pip? I don't freakin' know. Now there's uv, and I'm kinda excited to try again.
That said, I've checked Anaconda's site, and while it used to be "Anaconda [Python] Commercial Distribution", "On-Prem repositories", "Cloud notebooks and training"... during the last year they've changed the product name to "Anaconda AI Platform", and now it's all about "The operating system for AI", "Tools for the Complete AI Lifecycle". Eeeeh, no thanks.
The note "Currently, the default build backend for uv init is hatchling. This will change to uv in a future version." makes it seem like it's not yet stable, or at least that they're still not encouraging it. (And also so they'll implement the `pip download` functionality I'd like!)
They don’t. That’s a sign that the local system is severely broken, and should be rebuilt to be stable. uv will still work in that case, but you’re going to constantly hit other points of friction on a mismanaged system which will waste time.
Personally, I can't think of anything from Go's build system that I miss now. The languages are very different, for sure, but I guess we're talking about the build system only.
Want to profile your Go? pprof is built in (to be fair, Python has had cProfile forever, but the Go version's output is more convenient to read).
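For comparison, the cProfile route on the Python side looks like this (a minimal sketch with a made-up workload):

```python
import cProfile
import io
import pstats

def busy() -> int:
    # Made-up workload for the example.
    return sum(i * i for i in range(100_000))

profiler = cProfile.Profile()
profiler.enable()
result = busy()
profiler.disable()

# Sort by cumulative time and show the top 5 entries.
buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(5)
print(buf.getvalue())
```

Serviceable, but you assemble the pieces yourself; `go tool pprof` hands you the same workflow (plus flame graphs) out of the box.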
Want to run some tests, or better yet some benchmarks? A good take on the problem space is just built in. You can safely go with the default and don't need to spend mental tax credits on selecting the best benchmarking lib from the ecosystem.
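The closest stdlib analogue for benchmarks on the Python side is `timeit` (a minimal sketch; nowhere near as ergonomic as `go test -bench`):

```python
import timeit

# Runs the statement `number` times and returns total elapsed seconds.
elapsed = timeit.timeit("sorted(range(1000))", number=1_000)
print(f"1000 runs of sorted(range(1000)): {elapsed:.4f}s")
```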
Stuff like go fmt is just taken for granted, but even in the Python world there are still formatters floating around that aren't Black (or Black-compatible, like Ruff); probably the most common setup on GitHub even today in Python is no formatter at all.
Can go on and on - go generate (maybe a tiny bit less relevant with generics being available today?), go tool, go vet, ...