
Replace custom pip freeze script with pip-tools #207

Merged · 2 commits merged into master on Jan 21, 2020
Conversation

@lfdebrux (Contributor) commented Jan 7, 2020

In the past we've avoided using out-of-the-box solutions for Python dependency resolution because a) they haven't been very mature and b) we've had lots of issues with version conflicts. See [1], [2] for details. Instead, we've been using a custom Python script that under the hood runs pip freeze and saves the output to requirements.txt.

This script works well for us, but it doesn't integrate well with other tools. On the other hand pip-tools as of 2020 seems to be well-supported by its maintainers and other tools; for instance, GitHub's automated update service Dependabot supports requirements.in files.

At the same time, we've been better at keeping our dependencies up-to-date and fixing version conflicts, so there is no longer any issue caused by using pip-tools to generate our requirements.txt file. I think it's time to give pip-tools another chance :)

This PR is a straw-person to explore using pip-tools instead. It replaces our script dmutils.repoutils.freeze_requirements with pip-compile, and changes our requirements-app.txt file to be requirements.in to meet pip-tools convention. There are also separate requirements-dev.in and requirements-dev.txt files for dev requirements, following the layered requirements example in the pip-tools README.

As part of this, I've also loosened/removed the constraints in the parent requirements files, so we end up with something like the spec/lockfile pattern in tools such as Bundler and NPM. This appears to be the best way of using pip-tools.

In my mind the advantages are a cleaner interface for maintaining requirements files, and better integration with tools in the future. We could also remove our own custom code! There are a couple of extra steps we could take, such as asking pip-compile to save hashes, and using pip-sync to install requirements in our local development virtualenvs.
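A minimal sketch of the layered layout described above (the file contents are illustrative, not taken from this repo; the `-c` constraint line follows the pip-tools layered-requirements convention):

```text
# requirements.in — direct app dependencies, loosely constrained
Flask
requests>=2.0

# requirements-dev.in — dev layer, constrained by the compiled app lockfile
-c requirements.txt
pytest
```

Running `pip-compile requirements.in` writes a fully pinned `requirements.txt`; `pip-compile requirements-dev.in` then writes a `requirements-dev.txt` consistent with it.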

@lfdebrux force-pushed the ldeb-use-pip-tools branch 2 times, most recently from 3914dd1 to ed476eb on January 8, 2020
@lfdebrux lfdebrux mentioned this pull request Jan 8, 2020
@katstevens (Contributor) left a comment:

I like this proposal!

One annoying problem we had with freeze-requirements was it getting confused when trying to pull in an update to the dmutils freeze function itself. Would we run into any issues updating pip-tools?

@lfdebrux (Contributor, Author) replied:

> I like this proposal!
>
> One annoying problem we had with freeze-requirements was it getting confused when trying to pull in an update to the dmutils freeze function itself. Would we run into any issues updating pip-tools?

That's a good question, and I'm not sure of the answer. Looking through the open issues for pip-tools I can't see anything like this.

However, I do note that the pip-tools README says, under "Example usage for pip-sync":

> pip-sync will not upgrade or uninstall packaging tools like setuptools, pip, or pip-tools itself. Use pip install --upgrade to upgrade those packages.

which is interesting.

That said, pip-tools does follow semantic versioning, and the requirements file format is specified by pip, so I can't see that changing in a breaking way. I suspect that if we do run into problems in practice we can always delete the existing requirements.txt and regenerate it without too much hassle.

@lfdebrux (Contributor, Author) commented Jan 13, 2020

One thing to note: I don't know how well PyUp works with pip-tools (this open issue from 2017 seems to suggest not at all). So I think it makes more sense to adopt pip-tools and Dependabot together.

Alternatively, we could keep our current process and run pip-tools whenever a PyUp PR comes in.

@risicle (Contributor) left a comment:

WFM. Only downside is that it doesn't keep the direct dependencies at the top of the file verbatim, which is something I liked.

@risicle (Contributor) commented Jan 20, 2020

Am I right in thinking we no longer have the test step that forces a re-freeze if the .in files are out of sync? Guess that's another downside.

@lfdebrux (Contributor, Author) replied:

> Am I right in thinking we no longer have the test step that forces a re-freeze if the .in files are out of sync? Guess that's another downside.

That's correct. We could add it using a Make rule that looks at the timestamps of requirements.txt and requirements.in.
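A sketch of such a Make rule (a hypothetical fragment, assuming `pip-compile` is on the PATH; recipe lines must be tab-indented):

```make
# requirements.txt is considered stale whenever requirements.in is newer,
# so `make test` transparently re-freezes before running the tests
requirements.txt: requirements.in
	pip-compile --output-file requirements.txt requirements.in

test: requirements.txt
	pytest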

@risicle (Contributor) commented Jan 20, 2020

Please god no not timestamps.

@risicle (Contributor) commented Jan 21, 2020

We could write the hash of the last-frozen .in files somewhere and check it matches at make test time...
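A rough sketch of that hash check (file names and messages are assumptions, not what the repo does; it relies on GNU coreutils `sha256sum`, and uses a temp dir with a stand-in spec file so the example is self-contained):

```shell
set -eu
cd "$(mktemp -d)"
printf 'Flask\n' > requirements.in          # stand-in for the real spec file

# At freeze time, record a hash of the .in file alongside the lockfile:
sha256sum requirements.in > requirements.in.sha256

# At `make test` time, complain if the .in file changed since the last freeze:
if sha256sum --check --quiet requirements.in.sha256; then
    echo "requirements.in matches the last freeze"
else
    echo "requirements.in changed; re-run pip-compile" >&2
fi
```

Unlike timestamps, the hash only changes when the file contents actually change, so checkouts and touched files don't trigger spurious re-freezes.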

The commit message reads:

In the past we've avoided using out-of-the-box solutions for Python dependency resolution because a) they haven't been very mature and b) we've had lots of issues with version conflicts. See [1], [2] for details. Instead, we've been using a custom Python script that under the hood runs `pip freeze` and saves the output to `requirements.txt`.

This script works well for us, but it doesn't integrate well with other tools. On the other hand [`pip-tools`](https://github.com/jazzband/pip-tools) as of 2020 seems to be well-supported by its maintainers and other tools; for instance, GitHub's automated update service [Dependabot](https://dependabot.com) supports `requirements.in` files.

At the same time, we've been better at keeping our dependencies up-to-date and fixing version conflicts, so there is no longer any issue caused by using `pip-tools` to generate our `requirements.txt` file. I think it's time to give `pip-tools` another chance :)

This commit replaces our script `dmutils.repoutils.freeze_requirements` with `pip-compile`, and changes our `requirements-app.txt` file to be `requirements.in` to meet `pip-tools` convention. There are also separate `requirements-dev.in` and `requirements-dev.txt` files for dev requirements, following the [layered requirements example in the `pip-tools` README][3].

[1]: https://gds-way.cloudapps.digital/manuals/programming-languages/python/python.html#applications
[2]: Crown-Commercial-Service/digitalmarketplace-api@95ac122
[3]: https://github.com/jazzband/pip-tools#workflow-for-layered-requirements
@lfdebrux (Contributor, Author) replied:

> We could write the hash of the last-frozen .in files somewhere and check it matches at make test time...

Is this a feature that lots of people want? @katstevens @benvand what are your thoughts?

@katstevens (Contributor) replied:

> We could write the hash of the last-frozen .in files somewhere and check it matches at make test time...
>
> Is this a feature that lots of people want? @katstevens @benvand what are your thoughts?

Sounds good to me, but we can introduce it in a separate PR?

@risicle (Contributor) commented Jan 21, 2020

Yeah I don't think it's immediately critical.

@lfdebrux lfdebrux merged commit e716194 into master Jan 21, 2020
@lfdebrux lfdebrux deleted the ldeb-use-pip-tools branch January 21, 2020 12:54