Capping at the major version level is just common sense, IMO. It almost shouldn't even be possible to leave the major version out when declaring package dependencies. A new major version might as well be an entirely unrelated library/application from the viewpoint of compatibility, so it might as well be part of the package's name. Not unlike how the Python major version is part of the name of the interpreter.

Major versions going EOL and unmaintained is unfortunate, but that's not a purely technical problem. Releasing a new major version and breaking compatibility with existing users is as much a social decision as a technical one.

I'm not sold on the "semver doesn't work anyway" angle here either, although I admit that it's not perfect.

This whole problem exists because in Python you cannot specify interfaces and say "I need a function called f that takes an x of type T and returns a U", so instead you encode that indirectly, like "I know version x.y.z works, so I'll just require that".

Any other way risks runtime errors. And to people about to mention type hints in Python: those aren't enforced by the interpreter either, so mismatches still only surface at runtime.
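A minimal sketch of that point (f is a made-up function, not from any real library): the annotation describes a contract, but nothing checks it, so a broken promise only matters once a caller relies on it at runtime.

```python
def f(x: int) -> str:
    """Annotated to return str, but the interpreter never checks this."""
    return x * 2  # actually returns an int

result = f(3)
# No error at definition or call time; the "-> str" promise was never enforced.
print(type(result).__name__)  # int
```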

People keep using these hyper-dynamic languages and then running into the robustness issues and scaling limitations brought on by that very dynamism. It makes me mad and sad.

The most sane thing I've found is:

- Pin all application versions
- Don't pin or set upper bounds in libraries. Lower bounds may work.
- Use automation to continuously upgrade and test new versions of everything
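Concretely, that split looks something like this (package names and versions here are just placeholders):

```
# Application: requirements.txt — exact pins, repeatable installs
requests==2.31.0
urllib3==2.0.7

# Library: pyproject.toml — lower bound only, no upper cap
[project]
dependencies = ["requests>=2.20"]
```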

If you just pin, you fall behind and eventually it becomes expensive to catch up. If you don't pin, you lose repeatability. If you don't automate, the upgrade work doesn't happen reliably.

I agree; I think upper bound constraints go against what is commonly accepted and used in the Python ecosystem. What I try to do on my projects now is to always have a nightly CI test step; in theory, if an updated package breaks my package, it will be caught fairly quickly.
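As a sketch, a scheduled CI job along these lines does the trick (assuming GitHub Actions and pytest; adjust to your own setup). An unpinned install pulls the latest releases of every dependency each night:

```
# Hypothetical GitHub Actions workflow: re-run the test suite nightly
on:
  schedule:
    - cron: "0 4 * * *"   # every night at 04:00 UTC
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: pip install -e .[test]   # unpinned install picks up fresh deps
      - run: pytest
```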
This was previously posted by the same user in 2022: https://news.ycombinator.com/item?id=29507681

And by another user 54 days ago: https://news.ycombinator.com/item?id=39486552

The general rule I use is that libraries should not specify upper bounds on dependencies but applications should.

I use Poetry for all my projects, but I agree that it exacerbates the issue somewhat with its default npm-style version syntax.
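For anyone unfamiliar: Poetry's default caret operator caps at the next major version, so a cap sneaks in unless you opt out (requests here is just an example):

```
[tool.poetry.dependencies]
# Default caret syntax — means >=2.28,<3.0, an implicit upper cap:
requests = "^2.28"
# A plain lower bound avoids the cap:
# requests = ">=2.28"
```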

I struggled to find this post again when searching on hn.algolia.com, so I'll write a comment that'll help others find it: this very long article describes the pitfalls of capping dependencies in Python libraries. It's specific to Python, and it discusses the bad precedent that Poetry is setting.
GNU programs have been doing this for decades w.r.t. Autoconf and Automake versions (which you run into when you need to "bootstrap" them to work on them rather than just build them as a downstream user).
Of course you should. But only when necessary.