Port Rosenbrock solver to MPI-parallelism #33

Open
amartinhuertas opened this issue Feb 14, 2022 · 0 comments
The task is to run the main Rosenbrock time stepper function (https://github.com/gridapapps/GridapGeosciences.jl/blob/master/src/ShallowWaterRosenbrock.jl#L8) from a parallel driver. The parallel driver does not exist yet; we can take the Full Newton driver as a reference. This is likely to fail at first, as some of the lines in the Rosenbrock solver may not be prepared to run in both environments (we know it works in a serial environment, but we have not checked it in a parallel one). Once we identify where it fails, we can try to understand why and fix it.

The good news is that we have the sequential driver to compare results against; its output is our target. We are not building something from scratch.

@amartinhuertas can provide support along the way

Note that I have already done part of this work. For example, the Rosenbrock time stepper no longer calls lu, lu!, or ldiv! (the Julia interfaces); I replaced these calls with their counterparts in the Gridap solver interfaces (which GridapPETSc implements). See, e.g., https://github.com/gridapapps/GridapGeosciences.jl/blob/master/src/ShallowWaterRosenbrock.jl#L121, and the sketch of that interface below.
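
As a reference, here is a minimal sketch of the Gridap.Algebra linear-solver interface (symbolic_setup / numerical_setup / solve!) that replaces the lu/lu!/ldiv! calls; the toy matrix and vectors are only for illustration:

```julia
using Gridap.Algebra
using SparseArrays

A = sparse([4.0 1.0; 1.0 3.0])   # toy system, stands in for the Jacobian
b = [1.0, 2.0]
x = zeros(2)

solver = LUSolver()              # serial default; a PETSc solver implements the same interface
ss = symbolic_setup(solver, A)   # analyse the sparsity pattern
ns = numerical_setup(ss, A)      # factorise (replaces lu / lu!)
solve!(x, ns, b)                 # back-substitution (replaces ldiv!)

# If the matrix values change but the sparsity pattern stays fixed,
# refactorise in place instead of building a new setup:
numerical_setup!(ns, A)
solve!(x, ns, b)
```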

This works in the serial case because, e.g., jacobian_matrix_solver defaults to lu: https://github.com/gridapapps/GridapGeosciences.jl/blob/master/src/ShallowWaterRosenbrock.jl#L78. For the parallel case, a PETSc solver could be passed instead (see the sketch below).
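
A hedged sketch of what swapping the solver could look like for the parallel run; the keyword jacobian_matrix_solver is the one linked above, while the stepper call itself is left as a hypothetical placeholder and the PETSc options are just an example:

```julia
using GridapPETSc

options = "-ksp_type gmres -pc_type jacobi -ksp_rtol 1.0e-8"  # example options only
GridapPETSc.with(args=split(options)) do
  petsc_solver = PETScLinearSolver()  # implements the Gridap solver interface
  # Hypothetical call; the remaining arguments of the Rosenbrock stepper are elided:
  # shallow_water_rosenbrock_time_stepper(...; jacobian_matrix_solver = petsc_solver)
end
```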

I would start by running the parallel program on a single MPI task. Once we know that works, I would move to more than one task.
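
For instance, assuming MPI.jl is available and the (not-yet-written) parallel driver lives in a file called parallel_driver.jl, the first test could look like this sketch, bumping -n 1 to a larger task count once the single-task run matches the sequential results:

```julia
using MPI

# Launch the driver on a single MPI task first; increase -n once it works.
MPI.mpiexec() do cmd
  run(`$cmd -n 1 julia --project=. parallel_driver.jl`)
end
```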
