MPI parallelization broken for triangular grids
I tried to run a test (porousmediumflow/2p/implicit/incompressible/test_2p_incompressible_tpfa) in parallel on triangular grids on our cluster, but the simulation failed and the solver output contains NaN values. There are no such issues with a structured YaspGrid in parallel, nor with serial runs on triangular grids. Maybe it's my fault; the only change I made was setting the grid property to UGGrid in problem.hh, roughly as sketched below.
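A minimal sketch of that change, assuming the DuMux property macros of that era; the type tag name `TwoPIncompressibleTpfa` is a placeholder for whatever tag the test actually defines:

```cpp
// Sketch of the grid property change in problem.hh.
// "TwoPIncompressibleTpfa" is a placeholder type tag name; adjust it to the
// type tag actually defined by the test.
#include <dune/grid/uggrid.hh>

namespace Dumux {
namespace Properties {

// switch from the structured Dune::YaspGrid to the unstructured Dune::UGGrid
SET_TYPE_PROP(TwoPIncompressibleTpfa, Grid, Dune::UGGrid<2>);

} // end namespace Properties
} // end namespace Dumux
```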
Please find attached the modified problem.hh, the grid file, the input file, and the log file.
Release 2.12 runs fine in parallel on triangular grids on the same cluster, using the same grid and test/porousmediumflow/2p/implicit/test_cc2p. So there might be a general problem with triangular grids that can easily be fixed; maybe the parallel data exchange is somehow broken? A simple partition check is sketched after this paragraph.
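To rule out an obviously broken partitioning before blaming the data exchange, here is a small diagnostic one could add to the test. This is my own sketch (printPartitionInfo is a hypothetical helper, not part of the attached files), using only standard Dune grid-view calls:

```cpp
// Hypothetical diagnostic: print per-rank partition sizes after load
// balancing to check that the UGGrid decomposition and ghost layer look sane.
#include <cstddef>
#include <iostream>

#include <dune/grid/common/gridenums.hh>
#include <dune/grid/common/partitionset.hh>
#include <dune/grid/common/rangegenerators.hh>

template<class GridView>
void printPartitionInfo(const GridView& gridView)
{
    std::size_t numInterior = 0, numGhost = 0;
    for (const auto& element : elements(gridView, Dune::Partitions::all))
    {
        if (element.partitionType() == Dune::InteriorEntity)
            ++numInterior;
        else if (element.partitionType() == Dune::GhostEntity)
            ++numGhost;
    }

    std::cout << "rank " << gridView.comm().rank() << "/" << gridView.comm().size()
              << ": " << numInterior << " interior, " << numGhost
              << " ghost elements" << std::endl;
}
```

If every rank reports a plausible share of the elements and a nonzero ghost layer, the partitioning itself is probably fine and the NaNs would point at the communicated data instead.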
Attachments: problem.hh, mesh.dgf, test_2p.input, dumux-log.out
Serial run log: dumux-log.out