MPI support via package extensions (weak dependency) #1062
Conversation
There is a problem with the current approach to define
Just exporting a type from it is basically the current approach HDF5.jl takes with
I implemented a workaround for the issue in 30cf866. I am not completely happy with it, but it is the best I can think of right now to keep compatibility with Julia v1.6–v1.8.
Package extensions are only supported in Julia v1.9 and newer, so older versions keep using Requires.jl. Since the compat bump to Julia v1.6 is included in #1061, we can wait until the other PR is merged before continuing with this PR.
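For context, here is a minimal sketch of what such an extension module could look like; the file and module name `ext/MPIExt.jl` are illustrative, not necessarily what this PR uses:

```julia
# ext/MPIExt.jl — on Julia v1.9+ this module is loaded automatically
# whenever both HDF5 and MPI are present in the active environment.
module MPIExt

using HDF5
using MPI

# Illustrative only: the MPI-specific glue code (e.g. methods taking an
# MPI.Comm for collective file access) would be defined here.
function __init__()
    @debug "HDF5 MPI extension loaded"
end

end # module
```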
fd590b3 to 30cf866 (compare)
This should be coordinated with JuliaPackaging/Yggdrasil#6551.
@mkitti What exactly needs to be coordinated?
@eschnett As far as I know, we have never loaded an MPI-enabled HDF5_jll. For this package extension to work properly, we should check whether the loaded JLL package is actually MPI-capable.
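As a rough illustration, such a runtime check could be built on `HDF5.has_parallel()`, assuming that helper reports whether the loaded libhdf5 was built with MPI support:

```julia
using HDF5

# Only enable the MPI-specific code paths if the loaded HDF5 library
# actually provides parallel (MPI) support.
if HDF5.has_parallel()
    @info "Loaded HDF5_jll is MPI-capable; the MPIO driver can be used"
else
    @warn "Loaded HDF5_jll was built without MPI support; staying with serial I/O"
end
```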
As far as I understand, the current approach of HDF5.jl is as follows
That's still the same approach when rewriting it as a package extension, so I am not sure I understand. Do you suggest waiting until JuliaPackaging/Yggdrasil#6551 and #1061 are merged before merging this PR?
The first issue is awareness that both changes are coming down the pipeline. We should now be able to test against the new HDF5_jll builds. The problem is that we are changing multiple components at once, so we need to do integrated testing to ensure that changing either component alone, or both together, does not create unexpected issues. Otherwise, it will get complicated to diagnose future problems.
Ok. I will leave this PR open. Please let me know when you think it's a good time to revisit it.
I'm mainly just triaging here. It's @simonbyrne who really needs to review this carefully.
Should we just ditch Julia 1.3 support at this point? MPI.jl has a minimum Julia version of 1.6 (the current LTS).
Yes, I think this would be reasonable. This compat bump is included in #1061, so we can wait until the other PR is merged before continuing with this PR.
We're just waiting for #1061 to be merged, since it updates the minimum Julia version and the CI tests accordingly.
Interesting. Could you check whether this is true for HDF5 v0.16.14 and/or HDF5_jll? I do suspect this is MPI-related.
Not sure what this means exactly, but the screenshot is for
Yes, it's caused by the way MPI is handled at the moment (I assume it's OmniPackage.jl depending on Trixi.jl, which depends on both MPI.jl and HDF5.jl).
I suspect that the increase in driver load time is a recent regression. The new HDF5_jll builds make MPI always available, so we may be loading the driver every time.
I'm having some trouble reproducing, but perhaps I am missing something.
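For what it's worth, a minimal way to compare load times is the following (run each block in a fresh Julia session; exact timings will of course differ per machine):

```julia
# Baseline: HDF5 alone.
@time using HDF5

# In a separate fresh session: HDF5 together with MPI, which is where the
# MPI-related glue code (and thus any extra latency) would show up.
@time using MPI
@time using HDF5
```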
Could we get CI to pass?
CI fails on Julia v1.3 since Preferences.jl requires Julia v1.6; everything else passes here. I didn't want to update the lower bound on Julia here since that's already done in #1061.
It's on JuliaComputing/OmniPackage.jl#67.
@mkitti I think this is ready for a review.
What is going on with the nightly CI failure? I'll take a closer look this weekend.
@simonbyrne, would you like another look?
Looks good to me!
Great, thanks!
This setup keeps using Requires.jl for Julia v1.8 and older but switches to package extensions for Julia v1.9.
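Roughly, the fallback path can be sketched as follows; the MPI.jl UUID and file layout here are assumptions on my part, not necessarily the exact code in this PR:

```julia
# In HDF5.jl's top-level __init__(): Julia v1.9+ loads ext/MPIExt.jl through
# the package-extension mechanism; older versions fall back to Requires.jl.
using Requires: @require

function __init__()
    @static if !isdefined(Base, :get_extension)
        # The UUID below is MPI.jl's registered package UUID.
        @require MPI = "da04e1cc-30fd-572f-bb4f-1f8673147195" begin
            include(joinpath(@__DIR__, "..", "ext", "MPIExt.jl"))
        end
    end
end
```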
With Julia v1.8.5, I get (for both this PR and the current release of HDF5.jl)
With Julia v1.9.0-rc1 and the current release of HDF5.jl, I get
With Julia v1.9.0-rc1 and this PR, I get
This avoids some compilation time whenever MPI and HDF5 are loaded together.
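As a usage sketch once everything is in place (assuming the `h5open(filename, mode, comm)` method that HDF5.jl provides when MPI support is active):

```julia
using MPI, HDF5

MPI.Init()
comm = MPI.COMM_WORLD

# Open the file collectively with the MPIO driver; this only works if the
# loaded libhdf5 was built with parallel (MPI) support.
file = h5open("data.h5", "w", comm)
file["x"] = collect(1:10)   # dataset creation is a collective operation in parallel HDF5
close(file)

MPI.Finalize()
```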