The task is to run the main Rosenbrock time stepper function (https://github.com/gridapapps/GridapGeosciences.jl/blob/master/src/ShallowWaterRosenbrock.jl#L8) from a parallel driver. While the parallel driver does not exist yet, we can take the Full Newton driver as a reference. This is likely to fail at first, as some of the lines in the Rosenbrock solver may not be prepared to run in both environments (we know that it works in a serial environment, but we have not checked it in a parallel one). Once we identify where it fails, we can try to understand why and fix it.
The good news is that we have the sequential driver to compare results against; its output is our target. We are not building something from scratch.
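For reference, a parallel driver would typically follow the GridapDistributed/PartitionedArrays skeleton. The sketch below is hypothetical: the number of parts and the stepper call are placeholders (it assumes the MPI back end of a recent PartitionedArrays release), not the actual API of the future driver.

```julia
# Hypothetical skeleton of a parallel driver, assuming the
# PartitionedArrays MPI back end; the stepper call is a placeholder.
using PartitionedArrays
using GridapDistributed
using GridapGeosciences

with_mpi() do distribute
  nparts = 1                                  # start with a single MPI task
  ranks  = distribute(LinearIndices((nparts,)))
  # Build the distributed model and FE spaces as in the Full Newton driver,
  # then call the Rosenbrock stepper with the same arguments as the
  # sequential driver, e.g.:
  # shallow_water_rosenbrock_time_stepper(model, order, degree, ...)
end
```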
@amartinhuertas can provide support along the way
Note that I have partially done this work. For example, the Rosenbrock time stepper no longer calls `lu`, `lu!`, or `ldiv!`, which are Julia interfaces. Instead, I replaced these calls with their counterparts in the Gridap solver interfaces (which GridapPETSc implements). See, e.g., https://github.com/gridapapps/GridapGeosciences.jl/blob/master/src/ShallowWaterRosenbrock.jl#L121
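For context, the Gridap solver interface splits a linear solve into a symbolic setup and a numerical setup phase, so that any `LinearSolver` implementation (including GridapPETSc's) can be plugged in behind the same calls. A minimal, self-contained illustration with a small sparse system:

```julia
# Minimal illustration of the Gridap.Algebra linear solver interface.
# Any LinearSolver (e.g., GridapPETSc's PETScLinearSolver) follows the
# same protocol, which is why the stepper now uses it instead of lu/ldiv!.
using Gridap.Algebra
using SparseArrays

A = sparse([4.0 1.0; 1.0 3.0])
b = [1.0, 2.0]

solver = LUSolver()                 # serial LU, matching the old behaviour
ss = symbolic_setup(solver, A)      # analyse the sparsity pattern
ns = numerical_setup(ss, A)         # factorise A
x  = similar(b)
solve!(x, ns, b)                    # x ≈ A \ b

# If A changes but keeps its sparsity pattern (as across Rosenbrock
# stages), the factorisation can be updated in place and reused:
numerical_setup!(ns, A)
solve!(x, ns, b)
```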
This works in the serial case because, e.g., `jacobian_matrix_solver` defaults to LU: https://github.com/gridapapps/GridapGeosciences.jl/blob/master/src/ShallowWaterRosenbrock.jl#L78
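In the parallel case, the same argument could instead be given a PETSc-backed solver. A sketch, assuming GridapPETSc's `PETScLinearSolver` (the PETSc options string is illustrative; MUMPS is only available if PETSc was built with it):

```julia
# Sketch: swapping in a PETSc solver via GridapPETSc.
# The options string is illustrative, not a recommendation.
using GridapPETSc

options = "-ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps"
GridapPETSc.with(args=split(options)) do
  petsc_solver = PETScLinearSolver()
  # Pass `petsc_solver` as the jacobian_matrix_solver argument of the
  # Rosenbrock stepper inside this block.
end
```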
I would start by running the parallel program on just a single MPI task. Once we know it works, I would move to more than one task.
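To launch on a single task, the driver can be run through `mpiexec` from MPI.jl (the driver file name below is a placeholder):

```julia
# Launch the (hypothetical) parallel driver on one MPI task first;
# increase -n only once the single-task run matches the sequential results.
using MPI

mpiexec() do cmd
  run(`$cmd -n 1 julia --project=. ShallowWaterRosenbrockParallelDriver.jl`)
end
```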