Move integration testing feedstock to tests directory #171
Conversation
Codecov Report — All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

    @@ Coverage Diff @@
    ##             main     #171      +/-   ##
    ==========================================
    - Coverage   95.93%   95.53%   -0.40%
    ==========================================
      Files          14       14
      Lines         492      493       +1
    ==========================================
    - Hits          472      471       -1
    - Misses         20       22       +2

☔ View full report in Codecov by Sentry.
Currently looking at the integration tests now 👀

Alrighty, this is all up and working.
Force-pushed from 40f988b to 5907220 ([pre-commit.ci] for more information, see https://pre-commit.ci)
LGTM 🥳 😉
I've noticed DataFlow jobs are slower than Flink jobs, so we often have some fail just because they are still "Pending" when they surpass the timeout. That's okay; we just rerun the failed tests until the failures disappear.
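The rerun-until-done pattern described above amounts to polling the job state until it reaches a terminal state or a deadline passes. A minimal sketch, assuming hypothetical state names modeled on Dataflow's `JOB_STATE_*` values; `fetch_state`, `clock`, and `sleep` are injected here purely so the loop is easy to exercise without real jobs:

```python
import time

# Assumed terminal states, modeled on Dataflow's JOB_STATE_* values.
TERMINAL_STATES = {"JOB_STATE_DONE", "JOB_STATE_FAILED", "JOB_STATE_CANCELLED"}


def wait_for_job(fetch_state, timeout=600, interval=5,
                 clock=time.monotonic, sleep=time.sleep):
    """Poll fetch_state() until a terminal state or timeout.

    Raises TimeoutError if the job is still pending/running at the deadline,
    which is exactly the failure mode that forces a test rerun.
    """
    deadline = clock() + timeout
    state = fetch_state()
    while state not in TERMINAL_STATES:
        if clock() >= deadline:
            raise TimeoutError(f"job still {state!r} after {timeout}s")
        sleep(interval)
        state = fetch_state()
    return state
```

Injecting the clock also makes the timeout branch testable without actually waiting.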
@@ -72,6 +78,15 @@ def test_dataflow_integration():

    # okay, time to start checking if the job is done
    show_job = f"gcloud dataflow jobs show {job_id} --format=json".split()
    show_job_errors = [
reviewer note: let's use this command to dump logs when a job fails so we don't have to use the GCP console
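The reviewer's suggestion could look something like the sketch below: reuse the `gcloud dataflow jobs show` command from the diff to print the job's JSON description into CI output on failure. The `run` parameter is injected only for testability; everything except the `show_job` command line itself is an assumption:

```python
import json
import subprocess


def dump_failed_job(job_id: str, run=subprocess.run):
    """Fetch a Dataflow job's JSON description via gcloud and return it,
    so a failing test's logs carry the details without opening the console."""
    # Same command shape as `show_job` in the test above.
    show_job = f"gcloud dataflow jobs show {job_id} --format=json".split()
    proc = run(show_job, capture_output=True, text=True)
    if proc.returncode != 0:
        # gcloud itself failed; surface its stderr instead.
        return {"error": proc.stderr}
    return json.loads(proc.stdout)
```

In the test this would be called from the failure branch, e.g. `print(dump_failed_job(job_id))` before raising.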
    # dictobj runs do not generate any datasets b/c they are not recipes
    # so we've asserted what we can already, just move on
    if recipes_version_ref.endswith("dictobj"):
        return
reviewer note: this couldn't actually have been producing output for 0.10.x, so I think this is fine
     pfr_version = parse_version(version("pangeo-forge-recipes"))
     if pfr_version >= parse_version("0.10"):
-        recipe_version_ref = str(pfr_version)
+        recipe_version_ref = "0.10.x"
reviewer note: given that the integration/unit tests only exercise the beginning and end of the supported version range, I made the unilateral decision to just have 0.10.x listed in test-data, which I think works fine
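The decision above collapses every installed `pangeo-forge-recipes >= 0.10` onto the single `0.10.x` test-data ref. A minimal sketch of that mapping, using a stdlib tuple comparison as a stand-in for `packaging`'s `parse_version` (the pass-through for older versions is an assumption; the real code may map them to other refs):

```python
def recipes_version_ref(pfr_version: str) -> str:
    """Map an installed pangeo-forge-recipes version to a test-data ref.

    Any 0.10-or-newer version uses the single "0.10.x" feedstock fixture;
    older versions pass through unchanged (assumption for illustration).
    """
    # Compare only (major, minor); pre-release suffixes are ignored here.
    major_minor = tuple(int(p) for p in pfr_version.split(".")[:2])
    return "0.10.x" if major_minor >= (0, 10) else pfr_version
```

This keeps the feedstock fixtures pinned per version *range* rather than per release, matching the "only test the ends of the range" approach.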
…m:pangeo-forge/pangeo-forge-runner into npz/feature/internal-integration-fixtures
for more information, see https://pre-commit.ci
# We expect `recipe` to be 1) a beam PTransform, or 2) a string that leverages
# `dict_object:`; see `tests/test-data/gpcp-from-gcs/feedstock-0.10.x-dictobj/meta.yaml`
# as an example
if isinstance(recipe, PTransform):
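The two recipe shapes described in that comment suggest a small type dispatch. A hedged sketch, assuming a `module:name`-style payload after the `dict_object:` marker (the real `meta.yaml` semantics may differ) and falling back to a stub class when Beam isn't installed:

```python
try:
    from apache_beam import PTransform  # the real dependency in this project
except ImportError:  # minimal stand-in so the sketch runs without Beam
    class PTransform:
        pass


def classify_recipe(recipe):
    """Dispatch on the two expected recipe kinds (sketch).

    Returns ("ptransform", recipe) for Beam objects, or
    ("dict_object", payload) for `dict_object:`-style strings.
    The string format here is an assumption for illustration.
    """
    if isinstance(recipe, PTransform):
        return ("ptransform", recipe)
    if isinstance(recipe, str) and recipe.startswith("dict_object:"):
        return ("dict_object", recipe.split(":", 1)[1])
    raise ValueError(f"unsupported recipe: {recipe!r}")
```

The explicit `ValueError` fallback is one way to make the "zero idea how this passes on main" class of surprise fail loudly instead of silently.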
note to reviewer: zero idea how the unit tests for dict-object recipes are passing on main, b/c we need the conditional here or they fail like they do on this PR 🤷
another question @cisaacstern: we have a lot of special logic for dict_object recipes, but we don't seem to test list recipes. Is the list version deprecated in favor of the dict version?
…m:pangeo-forge/pangeo-forge-runner into npz/feature/internal-integration-fixtures
@cisaacstern: this is finally ready to go (will rebase my other one)
Move the integration-testing feedstock into this repo to avoid doing two-phase updates where possible.
Closes #143