One pain point I've really felt recently with Python is in the deploy step. pip installing dependencies with a requirements.txt file seems to be the recommended way, but it's far from easy. Many dependencies (such as PyTables) don't install their own dependencies automatically, so you are left with a fragile one-by-one process for getting library code in place.
It was all fine once I got it worked out, but it would be so much nicer to provide a requirements.txt file and have pip figure out the ordering and dependency stuff. That, and being able to install binary packages in a simple way from pip, would make me much happier with python (no more waiting 10 minutes for numpy or whatever to compile).
As far as actual language features go however, I still find python a joy to work in.
Yeah the situation with Python is kind of crap to begin with. The main problem with requirements.txt is that it doesn't lock the SHAsums of the packages being depended on.
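pip does have an opt-in hash-checking mode that covers exactly this, for what it's worth. A sketch of what a hash-locked requirements.txt looks like (the digest below is a placeholder, not a real hash; `pip-compile --generate-hashes` from pip-tools can produce real ones):

```
# With --require-hashes, pip refuses anything unpinned or with a
# mismatched digest, transitive deps included.
requests==2.28.1 \
    --hash=sha256:<digest goes here>
```

You then install with `pip install --require-hashes -r requirements.txt`.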
The Python ecosystem also doesn't seem to have a sensible solution for dependency declaration in general. It's kind of remarkable that anything works at all in Python land, it's a lot of "let's hope it works often enough". To be fair it does work out a lot of the time, but when that fails you end up coming to people like me.
I should do some research and see if I can figure out the best way to do this in Python, but I don't use Python at all personally or professionally.
FWIW, there are Python package managers that don't install dependencies globally, such as Poetry (and before that Pipenv, although I really wouldn't recommend it). It's a shame most people (and projects) still use requirements.txt, though.
python's packaging, especially compared to the state of the art which cargo represents, is really, really poor. documenting direct dependencies is just a start. requirements.txt is somehow simplistic and complex at the same time (remember the special options, which thankfully aren't used too much), and you can list dependencies in setup.py, too.
there's the Pipfile initiative which gives me much hope, though :)
Yeah, the current state of Python is the worst. Especially in machine learning, it's common that a project only supports one of conda or pip installation, which are often but not always compatible. Storing all dependencies in "requirements.txt" is a good idea, but it hasn't been standardized across the Python community. IMO all the 2 -> 3 upgrade difficulties have slowed down the development of the Python tooling ecosystem.
Python dependency management is a hard problem, but better than most languages [citation needed]. And `pip` and `setup.py` have emerged over several years, with several influences merged in (remember distutils?).
Honestly, I wish you'd picked a different tag-line though (riffing on `requests` no doubt). Unlike `requests`, your solution only works for a subset of deployment situations, because - as already pointed out - `setup.py` and `requirements.txt` are for different things.
One of the best examples of this I've seen is using both to deploy to a server with no internet connectivity. For development the dependencies are installed from `setup.py`. Then, before deploying, all dependencies are downloaded via `pip download`. Put the dependencies on the server and, finally, use `requirements.txt` with `--no-index` and `--find-links` to install. Definitely an interesting setup, but needs must. Unfortunately, your solution doesn't support `--no-index`, `--find-links` and a few others.
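The air-gapped workflow described above can be sketched like this (assumes internet on the build box; `six` is just a small example package standing in for real deps):

```shell
# Build box: write pinned deps, then fetch them (and their deps) locally.
echo "six==1.16.0" > requirements.txt
python3 -m pip download -r requirements.txt -d wheelhouse

# Copy requirements.txt + wheelhouse/ to the offline server, then install
# from the local directory only, never touching PyPI:
python3 -m pip install --no-index --find-links wheelhouse -r requirements.txt
```

`--no-index` disables PyPI entirely, so any dependency missing from the wheelhouse fails loudly instead of being fetched silently.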
You may want to have a look at tools like `pbr` (Python Build Reasonableness) [1], which has an interesting way of dealing with some hard problems. It also shows how to use `setup_requires` so you don't have to have `requirements.txt` hanging around in your repo.
Take my upvote. This has helped us a ton, and it's so nice that it resolves dependencies. The only issue we're running into is that we don't use it to manage dependencies for our internal packages (we only use it at the application level). I've been advocating we change that so setup.py simply reads in the generated requirements.txt/requirements-dev.txt.
The big win seems to be the lock file - like Cargo in Rust, or yarn in the JS world. It's really, really hard to lock down dependencies reliably in Python, especially when you are talking dependencies of the primary packages you are installing (and their dependencies, ...).
One solution at the moment is to run 'pip freeze' and put that in as your requirements file, but that very much feels like an 'and now I have a different problem' solution.
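The `pip freeze` approach is one command; the "different problem" is that it pins your whole environment, not just what you actually asked for:

```shell
# Pins everything currently installed, transitive deps included --
# you lose the distinction between direct deps and their dependencies.
python3 -m pip freeze > requirements.txt
```

pip-tools is the usual step up: you keep loose top-level deps in a `requirements.in` and `pip-compile` resolves and pins the full tree into `requirements.txt` for you, which gets you most of the way to a proper lock file.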
It's not crazy at all. You use requirements.txt to keep track of the dependencies needed for development, and you put the dependencies needed to build and install a package into setup.py, where the packaging script can get them.
These are two different things, because they do two different jobs.
Personally I've never really seen the advantage of any of these tools over having a requirements.txt or, slightly fancier and more modern, using a pyproject.toml file.
e.g. in a recent project the entire package configuration is this pyproject.toml file:
```toml
[project]
name = "vera"
version = "0.0.0"
requires-python = ">=3.10"
authors = [
    { name = "Doxin" }
]
dependencies = ["moderngl==5.7.3", "pysdl2==0.9.14", etc etc]

[project.optional-dependencies]
dev = ["pre-commit==2.20.0"]

[tool.setuptools]
packages = ["vera"]
```
Requirements feels like a dirty hack, but it does work fine. It has ==, ~=, and >= for version numbers, as well as letting you flag dependencies for different target OSes, etc. And then you can add setup.py if you need custom steps. But yes, it feels dirty to maintain requirements.txt, requirements-dev.txt, etc.
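For reference, a sketch of those specifier styles in a requirements file (package choices are arbitrary examples):

```
numpy==1.24.0                          # exact pin
requests~=2.28                         # compatible release: >=2.28, <3.0
pandas>=1.5                            # minimum version only
pywin32>=300; sys_platform == "win32"  # environment marker: Windows only
```

The `; ...` part is a standard environment marker, which is how you flag per-OS (or per-Python-version) dependencies in a single file.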
Poetry is the most common solution that I've seen in the wild. You spec everything using pyproject.toml and then "poetry install" and it will manage a venv of its own. But you still need to tell people to "pip install poetry" as step 0 which is annoying.
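For context, a minimal Poetry-style pyproject.toml looks something like this (project name and dependencies here are made up):

```toml
[tool.poetry]
name = "demo-app"
version = "0.1.0"
description = ""
authors = ["Someone <someone@example.com>"]

[tool.poetry.dependencies]
python = "^3.10"
requests = "^2.28"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```

`poetry install` then resolves these loose constraints, writes the exact pins to poetry.lock, and installs everything into a venv it manages itself.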
If you don't care about deploying python files, and rather just the final product, I'd recommend either nuitka or pyinstaller. These are for bundling your project into an executable without a python runtime needed (--onefile type of options for single file output). Neither supports cross compilation though.
Re: Dependencies - There are at least two well known, well supported, rock solid ways of managing dependencies that are in very common use in the python deployment world.
Some combination of requirements.txt (which lets you dial in, with great precision, each of the libraries you need, and is trivially created in < 50msec with "pip freeze")
1. Containers - That's it. You control everything in it.
2. virtualenv - Every environment comes with its own version of python and its own set of packages and versions of those packages. Add virtualenvwrapper and you can create/switch trivially between them.
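A minimal sketch of approach 2, using the stdlib venv module (virtualenv predates it and works much the same; virtualenvwrapper layers the quick create/switch commands on top):

```shell
# Create an isolated environment in ./.venv and activate it.
python3 -m venv .venv
. .venv/bin/activate              # Windows: .venv\Scripts\activate

# This pip lives inside .venv; installs don't touch the system python.
python -m pip --version
# python -m pip install -r requirements.txt   # deps land in .venv only

deactivate
```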
It's been at least 2 years since I've run into an issue with python and dependencies that wasn't solved by both of those approaches.
As someone new to using Python professionally after having used it here and there over the course of 15+ years, I’ve run into exactly this problem. It’s pretty standard for a language these days to bundle the dependency manager and build tooling. Python still does this via shell infection. And since there are 5 different ways to do it, it can leave someone trying to figure out what the right vibe is in 2023 spending hours reading about the pros and cons of everything. And all that just to land back on venv+pip+requirements.txt.
Python needs a cargo. Is Poetry it? I’ve been meaning to try it…
I felt the same way about only needing requirements.txt until I managed a project that needed to be compatible across many versions of python, and some of its dependencies were renamed from one version to another. I highly recommend you take a look at the hypermodern python guide https://cjolowicz.github.io/posts/hypermodern-python-01-setu...
I feel like this is a pretty non-issue so long as you document both that the dependency is required, and how to install it. It's just much easier to install than Python!