The kubeflow tests started failing all of a sudden. When I looked at the logs, I noticed that the pip resolver keeps backtracking, downloading one version of the kfp package after another until it finds one that works:
Collecting kfp
Downloading kfp-1.8.9.tar.gz (296 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 296.3/296.3 kB 63.3 MB/s eta 0:00:00
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Downloading kfp-1.8.8.tar.gz (296 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 296.2/296.2 kB 53.7 MB/s eta 0:00:00
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Downloading kfp-1.8.7.tar.gz (295 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 295.5/295.5 kB 46.4 MB/s eta 0:00:00
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Downloading kfp-1.8.6.tar.gz (266 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 266.5/266.5 kB 51.2 MB/s eta 0:00:00
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Downloading kfp-1.8.5.tar.gz (264 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 264.3/264.3 kB 45.8 MB/s eta 0:00:00
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Downloading kfp-1.8.4.tar.gz (252 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 252.2/252.2 kB 43.4 MB/s eta 0:00:00
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Downloading kfp-1.8.3.tar.gz (249 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 249.4/249.4 kB 50.4 MB/s eta 0:00:00
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Downloading kfp-1.8.2.tar.gz (248 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 248.8/248.8 kB 47.5 MB/s eta 0:00:00
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Downloading kfp-1.8.1.tar.gz (248 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 248.5/248.5 kB 29.9 MB/s eta 0:00:00
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Downloading kfp-1.8.0.tar.gz (243 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 243.5/243.5 kB 46.1 MB/s eta 0:00:00
(and the list keeps going)

The problem is that our CI environment has far too many dependencies: since we need to test against all the platforms, we have to install a lot of packages, and the pip resolver struggles to find a combination of versions that satisfies every requirement we have.
Solution
We should create multiple jobs, one for testing each platform.
First, we should define extras_require entries for each platform (i.e. dev-airflow, dev-kubeflow, dev-argo, dev-aws).
Note: dev-aws should run the AWS Batch and AWS Lambda tests.
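Something like this (a minimal sketch; the extras names follow the proposal above, but every package list and the project name below are placeholders to be filled in by splitting the current DEV list):

```python
from setuptools import setup, find_packages

# Generic test requirements shared by all platforms (placeholder contents)
DEV = ["pytest"]

# Per-platform test requirements (package names are illustrative placeholders)
DEV_KUBEFLOW = ["kfp"]
DEV_ARGO = ["argo-workflows"]
DEV_AIRFLOW = ["apache-airflow"]
DEV_AWS = ["boto3"]  # dev-aws covers both the AWS Batch and AWS Lambda tests

setup(
    name="our-package",  # placeholder project name
    version="0.1",
    packages=find_packages(),
    extras_require={
        "dev": DEV,
        "dev-kubeflow": DEV + DEV_KUBEFLOW,
        "dev-argo": DEV + DEV_ARGO,
        "dev-airflow": DEV + DEV_AIRFLOW,
        "dev-aws": DEV + DEV_AWS,
    },
)
```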
where DEV_* are the requirements for testing that particular platform and DEV the generic ones. To obtain those, we essentially need to split the existing list of DEV requirements.
Once that's organized, we need to create new GitHub Actions workflows: one that runs the generic tests, plus one each for kubeflow, argo, airflow, and aws, and test each one against multiple versions of Python.
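A rough sketch of what one of these workflows could look like (the file name, trigger, and Python versions are assumptions, not decisions):

```yaml
# .github/workflows/kubeflow.yml (hypothetical name/location)
name: kubeflow

on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ['3.8', '3.9', '3.10']  # adjust to the versions we support
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v3
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        run: pip install ".[dev-kubeflow]"
      - name: Run kubeflow tests
        run: pytest tests/kubeflow
```

The argo, airflow, aws, and generic workflows would follow the same pattern, changing only the extra they install and the pytest invocation.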
The tests are already more or less organized; for example, to run the kubeflow tests:
pytest tests/kubeflow
We repeat the same for the other platforms.
Now, to run the "base" tests we essentially need to run everything except kubeflow, argo, airflow, and aws; we can use pytest's --ignore option for that.
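For example (assuming the platform tests live under tests/kubeflow, tests/argo, tests/airflow, and tests/aws):
pytest --ignore=tests/kubeflow --ignore=tests/argo --ignore=tests/airflow --ignore=tests/aws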
https://docs.pytest.org/en/7.1.x/example/pythoncollection.html