Python dependency management using pip-tools

Victor Miti · Jun 4, 2021 · 6 min read

pip has been around for quite some time (since 2008) as Python's default package-management system. In fact, Python 2 >=2.7.9 and Python 3 >=3.4 ship with pip already included.

In the recent past, the Python community has witnessed the emergence of newer package-management systems in an attempt to mitigate some of pip's shortcomings, and also to bring to Python some cool features typical of other package-management systems such as JavaScript's npm and PHP's Composer. Pipenv and Poetry are popular pip alternatives. Actually, the Python Packaging Authority (PyPA) recommends Pipenv for managing library dependencies when developing Python applications. They still provide suggestions for alternatives, if Pipenv doesn't work well for you. pip-tools is one such alternative, and it is, at the time of writing this post, my preferred package-management tool for Python.

Well, I'm not going to make a comparison of the available package-management systems in Python. Such comparisons have already been made by others (for example this one from modelpredict.com). I will simply highlight my own experience, and how pip-tools works well for me.

Now, for a long time, I relied on pip to manage my Python dependencies. However, I started getting frustrated (the kind of frustrations Kenneth Reitz describes in this short post on his blog) with the standard pip workflow. I started looking for answers, and after hearing so much about Pipenv, I decided to give it a try. It was like love at first sight! The key features I instantly fell in love with are:

  • the use of a Pipfile and Pipfile.lock to separate abstract dependency declarations from the last tested combination.
  • the fact that Pipenv also automagically managed my Python virtual environment.

Well, this love didn't last long, because I soon noticed that it took too long to download, install and lock dependencies. Apparently, I'm not the only one who faced this problem (see, for example, issue #2873 on GitHub, issue #4430 and this StackOverflow question). I therefore decided that Pipenv wasn't for me, and started looking for an alternative. I often make software adoption decisions based on how popular the software is, and on that basis, Poetry would have been the next step. However, when I had an initial quick glance at it, I noticed that it was completely different from what my brain was accustomed to, and I wasn't willing to invest time and effort into learning something completely new, only to potentially suffer another heartbreak later, as was the case with Pipenv. So I skipped Poetry and decided to try something else.

One of the things that drew me to pip-tools, just from looking at the README on the project's repo, is that it builds on top of pip, so I still get to use some pip options and the good old requirements.txt, which I was already familiar with. The only new things were the requirements.in file and two new commands, pip-compile and pip-sync. That's it! I jumped onto the pip-tools bandwagon and have never looked back. I've been using it for over a year now, and I don't intend to abandon it anytime soon.

So, what does my typical Python development workflow with pip-tools look like? Well, whenever I start a new project,

  1. I create a virtual environment (I use virtualenvwrapper for this): mkvirtualenv name_of_env.
  2. I update pip (pip install --upgrade pip) and install pip-tools (pip install pip-tools).
  3. I create a requirements.in file at the project root, and specify my dependencies. A minimal example for, say, a Wagtail project, is shown below.
  4. Next, I run pip-compile requirements.in, which compiles a pinned requirements.txt file from my dependencies.
  5. Finally, I run pip-sync, which updates my virtual environment to reflect exactly what's in requirements.txt (a complete session is sketched after the example below).
Here's an example of a requirements.in file:

# core
django>=3.1,<4.0
wagtail>=2.12,<3.0
django-compressor
django-debug-toolbar
django-environ
django-extensions
django-intl-tel-input
django-leaflet
django-model-utils
django-recaptcha
django-user-agents
psycopg2
wagtailfontawesome
werkzeug

# dev
black
bpython
commitizen
doc8
flake8
isort[requirements_deprecated_finder]
pip-tools
powerline-status
pre-commit

# test
fake-useragent
faker-e164
pytest-cov
pytest-django
pytest-dotenv
pytest-factoryboy
pytest-logger
pytest-mock
pytest-sugar
pytest-xdist
wagtail-factories @ https://github.com/wagtail/wagtail-factories/archive/master.zip
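
Putting steps 1–5 together, a typical first session looks something like this (just a sketch, assuming virtualenvwrapper is already installed and configured; the project name is purely illustrative):

mkvirtualenv my_wagtail_site        # step 1: create & activate a virtual environment
pip install --upgrade pip           # step 2: update pip ...
pip install pip-tools               # ... and install pip-compile and pip-sync
# step 3: create requirements.in at the project root (see the example above)
pip-compile requirements.in         # step 4: generate a pinned requirements.txt
pip-sync                            # step 5: make the virtualenv match requirements.txt exactly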

Both files are committed to version control, which makes it easy to track dependencies throughout the lifecycle of the project. If I need to add another dependency, I simply add it manually to my requirements.in and run pip-compile requirements.in followed by pip-sync.
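
For example, to bring in django-taggit (purely an illustrative choice of package), I'd do something like:

echo "django-taggit" >> requirements.in   # or just edit requirements.in by hand
pip-compile requirements.in               # re-pin requirements.txt
pip-sync                                  # install the new package into the virtualenv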

If pip-compile finds an existing requirements.txt file that fulfils the dependencies, then no changes will be made, even if updates are available.

To force pip-compile to update all packages in an existing requirements.txt, run pip-compile --upgrade.

To update a specific package to the latest version, or to a specific version, use the --upgrade-package or -P flag.
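
For instance (django and wagtail here are just illustrative package names taken from the example above):

pip-compile --upgrade                  # bump everything to the latest versions requirements.in allows
pip-compile --upgrade-package django   # bump only django
pip-compile -P wagtail==2.13           # move wagtail to a specific version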

One of the things I particularly like about pip-tools is that the generated requirements.txt file shows you "where each package came from". What do I mean? Well, pip-tools itself has click and pep517 as its underlying dependencies, so these also get installed whenever you install pip-tools. Your requirements.txt file will indicate that these are pip-tools' dependencies. Here's an example requirements.txt file generated from a requirements.in file with pip-tools and commitizen specified as dependencies:

#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile requirements.in
#
argcomplete==1.12.3
    # via commitizen
click==8.0.1
    # via pip-tools
colorama==0.4.4
    # via commitizen
commitizen==2.17.8
    # via -r requirements.in
decli==0.5.2
    # via commitizen
jinja2==2.11.3
    # via commitizen
markupsafe==2.0.1
    # via jinja2
packaging==20.9
    # via commitizen
pep517==0.10.0
    # via pip-tools
pip-tools==6.1.0
    # via -r requirements.in
prompt-toolkit==3.0.18
    # via questionary
pyparsing==2.4.7
    # via packaging
pyyaml==5.4.1
    # via commitizen
questionary==1.9.0
    # via commitizen
termcolor==1.1.0
    # via commitizen
toml==0.10.2
    # via pep517
tomlkit==0.7.2
    # via commitizen
wcwidth==0.2.5
    # via prompt-toolkit

# The following packages are considered to be unsafe in a requirements file:
# pip

I think this is super cool! It would probably come in handy in situations where you introduce a new dependency and it breaks your project.

Something to watch out for when using pip-tools is when your project is used across different Python environments (say, multiple operating systems or multiple Python versions). While the same requirements.in can be used as the source file for all environments, you have to run pip-compile in each Python environment separately to generate a requirements.txt valid for that particular environment. Therefore, for each Python environment, you might want to use the {env}-requirements.txt naming format (for example: win32-py3.7-requirements.txt, ubuntu20.04-py3.6-requirements.txt, etc.)
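
pip-compile's --output-file (or -o) option makes this straightforward. A sketch, run separately in each target environment (the filename just follows the naming convention above):

pip-compile --output-file=ubuntu20.04-py3.6-requirements.txt requirements.in
pip-sync ubuntu20.04-py3.6-requirements.txt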

Well, this is how I use pip-tools to manage dependencies in my Python projects. If you've never used it before, I'd suggest you give it a try and see if it's a good fit for you. Also, be sure to check out the README on GitHub; there are a couple more features that I haven't talked about (simply because I haven't used them yet!) that you might find interesting.


Background image by Fakurian Design on Unsplash

 