fix: filter unexpected pruns in e2e MultiplePR #1978
base: main
Conversation
PR Overview
This PR fixes a failing test by filtering out PipelineRuns that do not match the expected naming pattern before asserting the count.
- Removed direct assertion on the full list of PipelineRuns.
- Added filtering of PipelineRuns based on a specific prefix before performing the count check.
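The filtering described above can be sketched roughly as follows. This is an illustrative, self-contained sketch, not the actual test code; the real test operates on Tekton `PipelineRun` objects rather than plain names, and `filterByPrefix` is a hypothetical helper name.

```go
package main

import (
	"fmt"
	"strings"
)

// filterByPrefix keeps only the names matching the expected test prefix,
// dropping unrelated PipelineRuns (such as the stray cancelled
// "no-match-*" run seen in the flake) before the count is asserted.
func filterByPrefix(names []string, prefix string) []string {
	var kept []string
	for _, n := range names {
		if strings.HasPrefix(n, prefix) {
			kept = append(kept, n)
		}
	}
	return kept
}

func main() {
	runs := []string{
		"prlongrunnning-fqmf-1-eyle-9tv54",
		"no-match-ng84l", // unrelated cancelled run, should not be counted
		"prlongrunnning-piwd-2-zmdb-587lp",
	}
	kept := filterByPrefix(runs, "prlongrunnning-")
	fmt.Println(len(kept)) // only the matching runs are counted
}
```

The count assertion then runs against the filtered slice instead of the full list returned by the namespace listing.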
Reviewed Changes
| File | Description |
| --- | --- |
| test/github_pullrequest_concurrency_multiplepr_test.go | Updated test to filter PipelineRuns matching the required naming pattern prior to asserting the count |
Copilot reviewed 1 out of 1 changed files in this pull request and generated 1 comment.
Force-pushed from 0323ee8 to 8e14a3a
it was a flake here as well 🥲, triggered it again
@chmouel this time
hahah yeah, i don't think that works, i still wonder what's going on, but that fix was definitely accurate since we had the cancel-pr in there... which got filtered
@chmouel I've seen this error message from an old failed E2E run (prettified output using ChatGPT 🙂):
See above, there are 4 PLRs for the naming pattern.
The namespace the pipelineruns are in is created uniquely for this test using a random suffix. If this pipelinerun is still around, would that suggest that if the PLR was not from this test then there was a namespace random-suffix collision? If so, it may be a good idea to increase the length of the random suffix.
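If suffix collisions were the cause, lengthening the suffix would shrink the collision space exponentially. A minimal sketch of what a longer suffix generator could look like; `randomSuffix` and the `pac-e2e-ns-` prefix are illustrative assumptions, not the project's actual helper:

```go
package main

import (
	"crypto/rand"
	"fmt"
	"math/big"
)

const letters = "abcdefghijklmnopqrstuvwxyz"

// randomSuffix returns n random lowercase letters. With n letters the
// namespace collision space is 26^n, so going from, say, 5 to 8
// characters makes two concurrent runs picking the same namespace
// name far less likely.
func randomSuffix(n int) string {
	b := make([]byte, n)
	for i := range b {
		idx, _ := rand.Int(rand.Reader, big.NewInt(int64(len(letters))))
		b[i] = letters[idx.Int64()]
	}
	return string(b)
}

func main() {
	// hypothetical namespace name for a test run
	fmt.Println("pac-e2e-ns-" + randomSuffix(8))
}
```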
@aThorp96 as you can see, the namespace is deleted on every test run.
I think i'll just say 9 or 10 PipelineRun results is acceptable to qualify it as a success, wdyt?
@zakisk in that case then, somehow the PLR is getting kicked off for this test; the namespace is created for this test and deleted afterward. How pressing is getting this fixed? If it's time sensitive we can merge this and/or update the test to expect either 9 or 10 runs. But if time permits I think it would be better to investigate and resolve how the PLR is making it into the test namespace. What do you think @chmouel?
i'd like to see someone fix this and figure it out, but this has hindered our velocity and wasted resources; it's up to you if you think you can get to the bottom of it
Fair enough. I'll add a GH issue or Jira to take a look
at the moment the 9-or-10 check is fine, but of course we need a solution!
We are now accepting 9 or 10 pipelineruns since the flakiness is between those two numbers. Not ideal, but not the worst thing compared to all the reruns we have to do.

Test was failing due to counting unrelated PipelineRuns. Fixed by filtering results to only count PipelineRuns matching the test's naming pattern.

error:

```
github_pullrequest_concurrency_multiplepr_test.go:144: assertion failed: 10 (int) != 9 (allPipelinesRunAfterCleanUp int): we should have had 9 kept after cleanup, we got 10: [
no-match-ng84l [{Succeeded False {2025-03-05 13:41:06 +0000 UTC} Cancelled PipelineRun "no-match-ng84l" was cancelled}]
prlongrunnning-fqmf-1-eyle-9tv54 [{Succeeded True {2025-03-05 13:41:41 +0000 UTC} Succeeded Tasks Completed: 1 (Failed: 0, Cancelled 0), Skipped: 0}]
prlongrunnning-fqmf-2-smfw-xmc6j [{Succeeded True {2025-03-05 13:41:56 +0000 UTC} Succeeded Tasks Completed: 1 (Failed: 0, Cancelled 0), Skipped: 0}]
prlongrunnning-fqmf-3-oowj-5tk8v [{Succeeded True {2025-03-05 13:42:11 +0000 UTC} Succeeded Tasks Completed: 1 (Failed: 0, Cancelled 0), Skipped: 0}]
prlongrunnning-piwd-1-kpby-fhz29 [{Succeeded True {2025-03-05 13:40:57 +0000 UTC} Succeeded Tasks Completed: 1 (Failed: 0, Cancelled 0), Skipped: 0}]
prlongrunnning-piwd-2-zmdb-587lp [{Succeeded True {2025-03-05 13:41:11 +0000 UTC} Succeeded Tasks Completed: 1 (Failed: 0, Cancelled 0), Skipped: 0}]
prlongrunnning-piwd-3-zdht-7p4f6 [{Succeeded True {2025-03-05 13:41:26 +0000 UTC} Succeeded Tasks Completed: 1 (Failed: 0, Cancelled 0), Skipped: 0}]
prlongrunnning-zunz-1-gctg-gxh4q [{Succeeded True {2025-03-05 13:40:12 +0000 UTC} Succeeded Tasks Completed: 1 (Failed: 0, Cancelled 0), Skipped: 0}]
prlongrunnning-zunz-2-bpsb-sfvr5 [{Succeeded True {2025-03-05 13:40:26 +0000 UTC} Succeeded Tasks Completed: 1 (Failed: 0, Cancelled 0), Skipped: 0}]
prlongrunnning-zunz-3-hdth-gn9gw [{Succeeded True {2025-03-05 13:40:41 +0000 UTC} Succeeded Tasks Completed: 1 (Failed: 0, Cancelled 0), Skipped: 0}]
]
```

Signed-off-by: Chmouel Boudjnah <[email protected]>
Force-pushed from 8e14a3a to 7d6bf91
Test was failing due to counting unrelated PipelineRuns. Fixed by filtering
results to only count PipelineRuns matching the test's naming pattern.
The issue is that we have 10 PipelineRuns after cleanup but expected 9.
The extra one appears to be no-match-ng84l which was cancelled and doesn't follow the naming pattern (prlongrunnning-*).
The calculation in the test is:
3 PRs × 3 PipelineRuns = 9
3 PRs × 1 retest × 3 PipelineRuns = 9
Total: 18 PipelineRuns
After maxKeepRun=1: 18/2 = 9 PipelineRuns
Since the extra PipelineRun has a completely different naming pattern and was cancelled, it's likely not part of the test scenario but either from a previous test run or created by another process.
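The relaxed "9 or 10" acceptance discussed above could look something like the sketch below. This is a hypothetical illustration of the idea, not the actual test assertion; `checkKeptRuns` is an invented name.

```go
package main

import "fmt"

// checkKeptRuns accepts either the expected count or expected+1, to
// tolerate the known flake where one stray PipelineRun survives cleanup.
func checkKeptRuns(got, expected int) error {
	if got != expected && got != expected+1 {
		return fmt.Errorf("expected %d or %d PipelineRuns after cleanup, got %d",
			expected, expected+1, got)
	}
	return nil
}

func main() {
	// 10 runs when 9 are expected: tolerated as the known flake
	fmt.Println(checkKeptRuns(10, 9) == nil)
	// 11 runs would still fail the check
	fmt.Println(checkKeptRuns(11, 9) == nil)
}
```

The trade-off is explicit in the conversation: this unblocks CI now, while the root cause of the extra PipelineRun is tracked separately.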
Changes
Submitter Checklist
📝 Ensure your commit message is clear and informative. Refer to the How to write a git commit message guide. Include the commit message in the PR body rather than linking to an external site (e.g., Jira ticket).
♽ Run make test lint before submitting a PR to avoid unnecessary CI processing. Consider installing pre-commit and running pre-commit install in the repository root for an efficient workflow.
✨ We use linters to maintain clean and consistent code. Run make lint before submitting a PR. Some linters offer a --fix mode, executable with make fix-linters (ensure markdownlint and golangci-lint are installed).
📖 Document any user-facing features or changes in behavior.
🧪 While 100% coverage isn't required, we encourage unit tests for code changes where possible.
🎁 If feasible, add an end-to-end test. See README for details.
🔎 Address any CI test flakiness before merging, or provide a valid reason to bypass it (e.g., token rate limitations).
If adding a provider feature, fill in the following details:
(update the documentation accordingly)