feat: relax dependencies in pyproject.toml and add a requirements.txt #2669
base: main
Conversation
@microsoft-github-policy-service agree [company="Preset Inc."]
@@ -0,0 +1,8 @@
# This file was autogenerated by uv via the following command:
#    uv pip compile pyproject.toml -o requirements.txt
greenlet==3.1.1
What I'm worried about here is that when, e.g., greenlet 3.1.2 comes out and includes a breaking change or a bug for some reason, our CI would still run with 3.1.1 and never install the bad version.
Why is the requirements.txt file needed at all in this case? To provide reproducible builds?
> our CI would still run with 3.1.1
Right, this is currently the case. But now you can enable @dependabot, which will auto-submit PRs periodically as new versions of dependencies come out. Those PRs will run your CI, so you can confirm that bumping is safe.
> To provide reproducible builds?
Yes, so you get reproducible builds, and you let people who use your library choose their version within the specified range in pyproject.toml. Say in our case for Apache Superset, we have multiple other dependencies that rely on greenlet, each one specifying a known version range.
About the topic of maintaining the range: clearly you'll be running CI only against what's in the pinned file. If/when people report something like "playwright doesn't work on top of greenlet>=5.0.0", as a maintainer you typically want to go and edit the known working range in pyproject.toml, meaning we'd change the rule to greenlet>=3.1.1,<5.0.0, and dependabot would respect that rule, only submitting PRs to bump within the specified range.
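For reference, enabling Dependabot for this flow only takes a small config file. A minimal sketch, assuming a weekly schedule and the default repo layout (neither of which is specified in this PR):

```yaml
# .github/dependabot.yml -- sketch, not part of this PR
version: 2
updates:
  - package-ecosystem: "pip"   # covers requirements.txt and other pip manifests
    directory: "/"             # where the manifests live
    schedule:
      interval: "weekly"       # assumed cadence
```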
Yeah, let's do something like greenlet>=3.1.1,<4.0.0. For pyee something similar, so it's only 12.x for now and we can increase it if there is a new version at some point, just to make sure 13.x does not break us. typing_extensions is backed by the Python team; I think usually you don't depend on a max version there.
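Put together, those suggested ranges would land in pyproject.toml roughly like this. A sketch only: the pyee upper bound and the unversioned typing_extensions entry follow the comment above rather than the actual diff:

```toml
[project]
# Illustrative ranges based on the discussion above -- not the final diff
dependencies = [
    "greenlet>=3.1.1,<4.0.0",
    "pyee>=12.1.1,<13.0.0",   # stay on 12.x until 13.x is verified
    "typing_extensions",      # no upper bound, per the comment above
]
```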
@@ -13,8 +13,8 @@ license = {text = "Apache-2.0"}
dynamic = ["version"]
requires-python = ">=3.9"
dependencies = [
"greenlet==3.1.1",
"pyee==12.1.1",
"pyee>=12.1.1",
Make sure to relax the version in the meta.yaml as well for our conda customers.
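For context, a sketch of how the relaxed pins might be mirrored in the conda recipe's run requirements; the exact package list and bounds here are illustrative, not the actual meta.yaml:

```yaml
# meta.yaml -- sketch of relaxed run requirements only (not the full recipe)
requirements:
  run:
    - greenlet >=3.1.1,<4.0.0
    - pyee >=12.1.1,<13.0.0
```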
mmmmh yeah it's not great for this to not be DRY. I'm not familiar enough with conda to know the best practices on that side of the fence. Here's a related GPT thread discussing the different options/pros/cons -> https://chatgpt.com/share/67476ce2-0288-8010-bc05-e20ff72d15c0
I don't like the fact that conda doesn't seem to be compatible with Dependabot. Is it a requirement to support conda? It seems to be fading: https://theregister.com/2024/08/08/anaconda_puts_the_squeeze_on/
Yes, I'd prefer to not drop it for now. Updating it manually on the conda side is good enough for us.
@mistercrunch could you sync your branch with main, please?
This branch is out-of-date with the base branch.
Branch updated from 7c562c1 to 69df9d6.
@mistercrunch please read the following Contributor License Agreement (CLA). If you agree with the CLA, please reply with the following information.
@A1exKH I just rebased the PR
@@ -20,3 +20,4 @@ service_identity==24.2.0
twisted==24.11.0
types-pyOpenSSL==24.1.0.20240722
types-requests==2.32.0.20241016
uv==0.5.4
I'd prefer not to introduce a new package for this. What do we lose if we don't use it? AFAIK it's only used to update the command in the requirements.txt?
I needed something to compile the requirements; it's either this (uv) or pip-tools or maybe poetry (?). Unless conda has options to do this, but I think conda is slowing down, and uv is the future of package management in Python, replacing/improving-on/consolidating pip, virtualenv, pyenv, conda, ...
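For concreteness, the two compile options being weighed would be invoked roughly like this. The uv line is the command already recorded in the requirements.txt header; the pip-tools line is an assumed equivalent, shown only for comparison and not used in this PR:

```sh
# uv, as used in this PR (command from the requirements.txt header)
uv pip compile pyproject.toml -o requirements.txt

# Rough pip-tools equivalent, for comparison only
pip-compile pyproject.toml -o requirements.txt
```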
Why do we need to 'compile' the requirements? Maybe I'm missing something. I was thinking about updating them manually (or Dependabot does it) - since there are only two, we should be able to deal with it? They get automatically installed using pip install -r requirements.txt anyway?
I see - so both dependencies, pyee and greenlet, have no dependencies themselves - this is where I got confused. uv helps us to pin the transitive dependencies (of which there aren't any yet). In the future there might be some. Is this correct?
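As a purely hypothetical illustration of that point: if one of the dependencies ever gained transitive dependencies, the compiled file would pin those alongside the direct ones, e.g.:

```text
# Hypothetical compiled output if pyee ever grew a transitive dependency
greenlet==3.1.1
pyee==12.1.1
some-transitive-dep==1.2.3   # hypothetical package pulled in by pyee
```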
Sorry for being slow! I think once the feedback is applied we can merge it. Please also adjust meta.yml so it has the same ranges as pyproject.toml. Thanks!
@mistercrunch please sync your branch with the main branch to solve this problem:
"This branch is out-of-date with the base branch"
Would suggest disabling that GH feature forcing rebases - it isn't scalable and forces rebasing all PRs on every merge... But happy to rebase
Branch updated from 69df9d6 to a893a3c.
Related to this: #2666

- Relaxed the dependency pins in pyproject.toml and compiled a requirements.txt with uv, as it seems it's what the cool kids are using nowadays
- Updated the workflows under .github/workflows/ so that CI would install the pinned files

NOTE: I added uv to the local-requirements.txt file, though I'd recommend moving to optional-dependencies in pyproject.toml, maybe as unpinned there, and generating a requirements-dev.txt or similar from there. Though not super important.
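A CI step installing the pinned files might look roughly like this; a sketch only, where the step name and the exact pip invocation are assumptions rather than the actual workflow change in this PR:

```yaml
# Sketch of a workflow step installing the pinned requirements (names illustrative)
- name: Install pinned dependencies
  run: |
    python -m pip install --upgrade pip
    pip install -r requirements.txt -r local-requirements.txt
```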