Misc pending tasks associated with refactoring in branch release-0.2 #39
Comments
Let me document here a spare set of issues while I go through the refactoring.
Hi @amartinhuertas! I have opened a new branch https://github.com/gridap/GridapDistributed.jl/tree/release-0.2 from the current state. We can use this branch as code that "is working" and do intermediate developments somewhere else.
@amartinhuertas I have classified the pending tasks as high/medium/low priority. (Move tasks from one tier to another if you find it necessary.) For me, we can merge the branch once the high-priority tasks are fixed. The remaining low/medium-priority ones can be moved to separate issues.
Ok. I have added a new high-priority task. Agreed.
@amartinhuertas I am adding a nonlinear test and I have found a bug associated with the in-place assembly functions.
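For context, "in-place assembly" refers to the pattern in the sketch below, shown with the serial Gridap assembler API (an illustrative sketch only, assuming the v0.2-era signatures; this is not the actual GridapDistributed code nor the fix, and the problem setup is made up):

```julia
# Illustrative sketch of in-place (re)assembly with the serial Gridap API.
using Gridap

model = CartesianDiscreteModel((0, 1, 0, 1), (4, 4))
reffe = ReferenceFE(lagrangian, Float64, 1)
V = TestFESpace(model, reffe; dirichlet_tags="boundary")
U = TrialFESpace(V, 0.0)
Ω  = Triangulation(model)
dΩ = Measure(Ω, 2)

a(u, v) = ∫( ∇(v)⋅∇(u) )*dΩ

assem   = SparseMatrixAssembler(U, V)
matdata = collect_cell_matrix(U, V, a(get_trial_fe_basis(U), get_fe_basis(V)))

A = assemble_matrix(assem, matdata)  # allocate and assemble once
assemble_matrix!(A, assem, matdata)  # later: re-assemble reusing the same storage
```

In a nonlinear solve, the Jacobian and residual are re-assembled at every iteration, which is why the in-place assemble_XXX! variants matter there.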
Yes, indeed I added …
I have added the nonlinear example. In particular, …
Hi @amartinhuertas, @fverdugo, what are the developments needed for periodic BCs?
At least, the ghost layer has to be implemented taking periodicity into account. And perhaps other developments.
Hi @amartinhuertas and @fverdugo, the last task of the list can be checked off once PR #74 is accepted. Also, I suggest adding another task to the list: support …
Great @oriolcg! Thanks for your work!
Sure. Task added.
I indeed implemented this in GridapDistributed.jl v0.1.0. See https://github.com/gridap/GridapDistributed.jl/blob/v0.1.0/src/ZeroMeanDistributedFESpaces.jl and https://github.com/gridap/GridapDistributed.jl/blob/v0.1.0/test/ZeroMeanDistributedFESpacesTests.jl. I guess it should be a matter of re-writing this using the new code organization in v0.2.0.
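For reference, in serial Gridap a zero-mean-constrained space is built roughly as in the sketch below (assuming the ReferenceFE/TestFESpace keywords of the v0.2-era serial API); the distributed v0.2.0 version would have to impose the same constraint globally across all parts:

```julia
# Sketch (serial Gridap): a pressure-like space whose mean value is constrained
# to zero; a distributed counterpart must apply this constraint globally.
using Gridap

model = CartesianDiscreteModel((0, 1, 0, 1), (4, 4))
reffe = ReferenceFE(lagrangian, Float64, 1)
Q = TestFESpace(model, reffe; conformity=:L2, constraint=:zeromean)
```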
Merging into master: PR #50

High priority

- pvtu format (@amartinhuertas): PR Support parallel file formats WriteVTK.jl#1, PVTU files #46
- FESpacesTests.jl, namely the assemble_XXX! functions do not work (@fverdugo): Nonlinear test (and misc associated fixes) #44
- MPI.COMM_WORLD in get_part_ids(b::MPIBackend,...: PartitionedArrays/PartitionedArrays.jl#33, Duplicating communicator in MPI backend PartitionedArrays/PartitionedArrays.jl#35
- @asserts locally triggered from the processors may lead to deadlock! We should call mpi_abort whenever an AssertionError exception is triggered (see the sketch after this list): PR Improving prun() for MPIBackend. Adding prun_debug() for debugging purposes. PartitionedArrays/PartitionedArrays.jl#39

Medium priority

- lu! for PSparseMatrix following the same strategy as with \
- NLSolve.jl solver trace only in the master MPI rank leads to a deadlock.

Low priority

- interpolate_everywhere not available in GridapDistributed.jl, only interpolate. Solved in PR Dirichlet values #74
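Regarding the assertion/deadlock item above, the idea is along the following lines (a minimal MPI.jl sketch, not the actual prun()/prun_debug() implementation from the PartitionedArrays PR):

```julia
# Minimal sketch: if an assertion fails on a single rank, the remaining ranks
# may block forever inside a collective call; aborting the whole communicator
# avoids the deadlock.
using MPI

function run_guarded(f)
  try
    f()
  catch e
    if e isa AssertionError
      @error "AssertionError on rank $(MPI.Comm_rank(MPI.COMM_WORLD)): $(e.msg)"
      MPI.Abort(MPI.COMM_WORLD, 1)  # terminates all ranks of the job
    end
    rethrow()
  end
end

MPI.Init()
run_guarded() do
  @assert MPI.Comm_size(MPI.COMM_WORLD) >= 1
end
MPI.Finalize()
```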