
Fun with uv and PEP 723

(www.cottongeeks.com)
618 points deepakjois | 10 comments
1. korijn No.44370971
There's no lockfile or anything with this approach right? So in a year or two all of these scripts will be broken because people didn't pin their dependencies?

I like it though. It's very convenient.

replies(3): >>44370992 #>>44370993 #>>44371436 #
2. rahimnathwani No.44370992
PEP 723 allows you to specify version numbers for direct dependencies, but of course indirect dependencies aren't guaranteed to be the same.
3. js2 No.44370993
> There's no lockfile or anything with this approach right?

There are options to both lock the dependencies and limit by date:

https://docs.astral.sh/uv/guides/scripts/#locking-dependenci...

https://docs.astral.sh/uv/guides/scripts/#improving-reproduc...
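From memory of those docs (check them for the exact current syntax), the reproducibility knob is a date cutoff embedded in the same inline metadata block, and locking is a separate uv command; this is a config fragment, not runnable logic:

```python
# /// script
# dependencies = [
#     "requests",
# ]
# [tool.uv]
# exclude-newer = "2025-01-01T00:00:00Z"  # ignore releases published after this date
# ///
# Separately, `uv lock --script myscript.py` writes a lockfile next to the
# script, so later runs resolve to the exact same versions.
```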

4. zahlman No.44371436
> So in a year or two all of these scripts will be broken because people didn't pin their dependencies?

People act like this happens all the time but in practice I haven't seen evidence that it's a serious problem. The Python ecosystem is not the JavaScript ecosystem.

replies(1): >>44371650 #
5. nomel No.44371650
I think it's because you don't maintain much python code, or use many third party libraries.

An easy way to prove that this is the norm is to take some existing code you have now, update its dependencies to their latest versions, and watch everything break. You don't see the problem because those dependencies pin, or tightly restrict, their own dependency versions, which hides the frequency of the problem from you. You'll also see that, in their issue trackers, they've closed all sorts of version-related bugs.

replies(1): >>44372195 #
6. zahlman No.44372195
> An easy way to prove that this is the norm is to take some existing code you have now, and update to the latest versions your dependencies are using

I have done this many times and watched everything fail to break.

replies(1): >>44372559 #
7. nomel No.44372559
Are you sure you're reading what I wrote fully? Getting pip, or any of the installers, to ignore all version requirements, including those declared by the dependencies themselves, required modifying source the last time I tried.

I’ve had to modify code this week due to changes in some popular libraries. A recent example: NumPy 2.0 broke most code that used numpy. It changed the C side (full interpreter crashes with trimesh) and removed or moved common functions, like array.ptp(). SciPy moved a bunch of stuff lately, and fully removed some image-related things.

If you think Python libraries are somehow stable over time, you just don’t use many.

replies(1): >>44372686 #
8. zahlman No.44372686
... So if the installer isn't going to ignore the version requirements, and thereby install an unsupported package that causes a breakage, then there isn't a problem with "scripts being broken because people didn't pin their dependencies". The packages listed in the PEP 723 metadata get installed by an installer, which resolves the listed (unpinned) dependencies to concrete ones (including transitive dependencies), following rules specified by the packages.

I thought we were talking about situations in which following those rules still leads to a runtime fault. Which is certainly possible, but in my experience a highly overstated risk. Packages that say they will work with `foolib >= 3` will very often continue to work with foolib 4.0, and the risk that they don't is commonly-in-the-Python-world considered worth it to avoid other problems caused by specifying `foolib >=3, <4` (as described in e.g. https://iscinumpy.dev/post/bound-version-constraints/ ).

The real problem is that there isn't a good way (from the perspective of the intermediate dependency's maintainer) to update the metadata after you find out that a new version of a (further-on) dependency is incompatible. You can really only upload a new patch version (or one with a post-release segment in the version number) and hope that people haven't pinned their dependencies so strictly as to exclude the fix. (Although they shouldn't be doing that unless they also pin transitive dependencies!)

That said, the end user can add constraints to Pip's dependency resolution by just creating a constraints file and specifying it on the command line. (This was suggested as a workaround when Setuptools caused a bunch of legacy dependencies to explode - not really the same situation, though, because that's a build-time dependency for some packages that were only made available as sdists, even pure-Python ones. Ideally everyone would follow modern practice as described at https://pradyunsg.me/blog/2022/12/31/wheels-are-faster-pure-... , but sometimes the maintainers are entirely MIA.)

> Numpy 2.0 is a very recent example that broke most code that used numpy.

This is fair to note, although I haven't seen anything like a source that would objectively establish the "most" part. The ABI changes in particular are only relevant for packages that were building their own C or Fortran code against Numpy.

replies(1): >>44372750 #
9. nomel No.44372750
> `foolib >= 3` will very often continue to work with foolib 4.0,

Absolute nonsense. It's an industry standard that major versions are reserved for breaking changes. This is why you never see >= in any sane requirements list; you see `foolib == 3.*`. For anything you want to keep working for a reasonable amount of time, you see `== 3.4.*`, because deprecations often still happen within major versions, breaking all code that used those functions.
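For concreteness, the difference between these specifier styles is easy to see with a toy version comparison (hand-rolled here for illustration; real installers implement the full PEP 440 specifier grammar, and `foolib` is hypothetical):

```python
def parse(v: str) -> tuple[int, ...]:
    """Toy version parser: "3.4.1" -> (3, 4, 1). Not PEP 440-compliant."""
    return tuple(int(p) for p in v.split("."))

# "foolib >= 3": matches every future release, including breaking majors.
print(parse("4.0") >= parse("3"))      # True -> 4.0 gets installed

# "foolib == 3.*": stops at the major boundary.
print(parse("4.0")[0] == 3)            # False -> 4.0 excluded

# "foolib == 3.4.*": also shields against in-major deprecation churn.
print(parse("3.5.0")[:2] == (3, 4))    # False -> 3.5 excluded
```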

replies(1): >>44373090 #
10. zahlman No.44373090
Breaking changes don't break everyone. For many projects, only a small fraction of users are broken at any given time. Firefox is on version 139 (similarly Chrome and other web browsers); how many times have you had to reinstall your plugins and extensions?

For that matter, have you seen any Python unit tests written before the Pytest 8 release that were broken by it? I think even ones that I wrote in the 6.x era would still run.

For that matter, the Python 3.x bytecode changes with every minor revision, things get removed from the standard library following a deprecation schedule, and there's a tendency in the ecosystem to drop support for EOL Python versions just to not have to think about it - but tons of (non-async) new code would likely work as far back as 3.6. It's not hard to avoid the := operator or the match statement (f-strings are far more pervasive than either).

On the flip side, you can never really be sure what will break someone. Semver is an ideal, not reality (https://hynek.me/articles/semver-will-not-save-you).

And lots of projects are on calver anyway.