General CI failure due to Azure error "No hosted parallelism has been purchased or granted" #41025
Thanks for reporting! This looks like an almost total CI failure. Maintainer @dpryan79 is aware and has filled out the form. See Gitter. |
Some background on this error message: it's weird that it happens now and not two years ago when this was rolled out. |
getting this error this morning: #47804 |
For this year's update: I submitted the request to reinstate parallelism on Friday, and an hour ago I got a reply from Azure saying the request is "completed", but I'm not seeing any successful builds yet. I asked what sort of timeline we can expect for being able to resume jobs but haven't heard back yet. |
@daler I guess it has been discussed in the past, but has the Bioconda team considered using GitHub Actions instead of Azure as CI for linux-x86_64 and osx-x86_64? |
Thanks @daler! Shame, it's still not working 36 hours later. We should put in a reminder for early May 2025 to write Azure preemptively 🙃 |
I've pinned this issue as I've seen a couple of dupes popping up |
Any idea whether GitHub Actions would be faster or slower than Azure? |
It has to be tested, but #46775 shows that the osx-arm64 build is 2.0-2.5 times faster than on Azure.
There is no Linux ARM64 build in that PR, but CircleCI is usually also around 2 times faster than the Azure builds. However, both GitHub Actions and CircleCI are less used than Azure; at the moment most PRs add load only to Azure, and this might affect the build speed there. |
What's the reason Azure is used instead of free GitHub Actions? The concurrency limit of 20? Are standard runners too small? Or something else? |
Is there an update on when this will be fixed? |
No! |
@daler is there an Azure support email address that we can get everyone to spam with a standard message to try to get them to notice? Or maybe we can do something on Twitter via https://x.com/AzureSupport? Feels like we have a lot of folks waiting for this, so we could try to use our numbers to apply a little pressure :) |
Bugging people is to no avail, I fear 😝. Instead, it is nowadays likely just a matter of decent prompt engineering... 😏 |
Not against bugging people, but remember this is Azure DevOps, a different product than the Azure Cloud: https://learn.microsoft.com/en-us/azure/devops/user-guide/provide-feedback |
We are currently in communication with the DevOps team and making progress. Will update as we learn more. |
It seems the issue has been resolved! |
Nice that CI is back! |
Don't have that button, but Bioconda bot just pushed another commit, so the CI retriggered. Should work, I think? |
A bit of a post-mortem here: Through "contact mining" (contacts of contacts) we were able to find someone at Azure DevOps with some information. While the timing was suspicious, roughly a year after the last time this happened, they said that what happened this time was that a job from this repo triggered a warning in their system. It turns out it was something silly: SnpEff phones home to a now-unregistered URL, "dnaminer.com", and that was enough for Azure DevOps to flag us as cryptomining (or something?). Anyway, this was addressed by @rpetit3 in #48007, and as you've seen they've reinstated us.

Will this happen again? Who knows. It's unlikely they'll reveal everything they look for, because that would give an advantage to those looking to abuse the free compute. So we could always stumble into another red flag. But we at least have a contact we can ping in the future. In the meantime, we plan on consolidating all the various CI platforms into a more consistent configuration to allow easier swapping of systems, to be more resilient in the future.

A comment above asked why we're using Azure DevOps in the first place, so here's some explanation. Our rationale for splitting across systems was to spread the load. The main jobs run on Azure DevOps. Bulk jobs (the lengthy Bioconductor updates, Python migrations, etc.) run on CircleCI. The autobump bot runs on CircleCI too. The comment bot runs on GitHub Actions because we need the tight coupling of GitHub and GitHub Actions; if many jobs were run on GitHub Actions, where the comment bot also runs, the bot's responses could get delayed by concurrency limits. We have not formally tested load balancing across CI systems, and it didn't matter much anyway until we had this DevOps failure. But now we know we need some sort of hot-swap mechanism for when this happens again; I think we'd happily pay a laggy-comment-bot cost to keep builds running. |
Fascinating write up, thanks @daler! |
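For readers curious what "easier swapping of systems" from the post-mortem above could look like in practice, here is a minimal sketch of a GitHub Actions workflow that mirrors a Linux recipe build. This is not Bioconda's actual CI configuration; the workflow name, action versions, channel setup, branch name, and bioconda-utils flags are all assumptions, so check the project's real pipelines and `bioconda-utils --help` before relying on any of it.

```yaml
# Hypothetical fallback workflow, NOT bioconda-recipes' actual CI config.
name: linux-build-fallback

on:
  workflow_dispatch:   # manual trigger, e.g. while Azure DevOps is unavailable
  pull_request:

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0   # full history so changed recipes can be diffed

      - uses: conda-incubator/setup-miniconda@v3
        with:
          miniforge-version: latest
          channels: conda-forge,bioconda

      - name: Install bioconda-utils
        shell: bash -el {0}
        run: conda install -y bioconda-utils

      - name: Lint and build changed recipes
        shell: bash -el {0}
        run: |
          # Flags are illustrative; the exact options depend on the pinned
          # bioconda-utils version.
          bioconda-utils lint recipes config.yml --git-range master HEAD
          bioconda-utils build recipes config.yml --git-range master HEAD --docker --mulled-test
```

The point is only that the same `bioconda-utils` entry points could, in principle, be driven from more than one CI system, which is what would make a hot swap feasible.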
Hi,
There seems to be an issue running checks within pull requests, with the Azure pipeline reporting the following during the lint step:
"##[error]No hosted parallelism has been purchased or granted. To request a free parallelism grant, please fill out the following form https://aka.ms/azpipelines-parallelism-request"
Here is a link to the most recent example:
https://dev.azure.com/bioconda/bioconda-recipes/_build/results?buildId=33719&view=results
I am not very familiar with bioconda, so please let me know if this is an issue with my specific pull request - but I see similar errors in other recent PRs.
Thank you,
Rauf