# dumux issues
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues
Updated: 2018-08-30T07:15:33Z

## #561: BrineCO2 phaseIdx assertion in test_fluidsystems
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/561
Updated: 2018-08-30T07:15:33Z · Author: Simon Emmert

Our fluidsystems test fails (line 159) because the assertion ``assert(restrictPhaseIdx_ < 0 || restrictPhaseIdx_ == phaseIdx);`` fails for the pressure.
I guess this has to do with the brine adapter, but I could not fix it, and it did not make perfect sense to me right away. Does someone have an idea? If not, this should be a perfect topic for the dumux day.

Milestone: 3.0

## #567: test_vtkreader_2d3d fails due to assert
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/567
Updated: 2018-08-29T09:16:41Z · Author: Timo Koch <timokoch@math.uio.no>
```
test_vtkreader_2d3d: /data/src/dune-alugrid/dune/alugrid/impl/serial/gitter_mgb.cc:759: void ALUGrid::MacroGridBuilder::finalize(): Assertion `((hface3_GEO *)(*i).second)->ref == 2' failed.
```
Maybe something is wrong with the grid file? The test passes, however, if assertions are disabled.

Milestone: 3.0

## #620: Most tests with Neumann no-flow and velocity output fail
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/620
Updated: 2018-11-28T17:42:01Z · Author: Timo Koch <timokoch@math.uio.no>

!1319 changed the velocity output at Neumann no-flow boundaries. See buildbot for the affected tests. The references need to be adjusted (there should only be a difference in the velocity field, so only change that field in the reference solution).

Milestone: 3.0 · Assignee: Martin Schneider

## #621: test_2p_fracture_nogravity_mpfa fails due to assertion
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/621
Updated: 2018-11-28T10:47:05Z · Author: Timo Koch <timokoch@math.uio.no>

```
Initializing of the connectivity map took 0.0138908 seconds.
test_2p_fracture_mpfa: /data/src/dumux/dumux/discretization/cellcentered/mpfa/localfacedata.hh:78: LocalIndexType Dumux::InteractionVolumeLocalFaceData::scvfLocalOutsideScvfIndex() const [with GridIndexType = unsigned int; LocalIndexType = unsigned char]: Assertion `isOutside_' failed.
[behandla:25835] *** Process received signal ***
```

Milestone: 3.0 · Assignee: Dennis Gläser

## #628: Weird results of `test_md_boundary_darcy1p3c_stokes1p3c_horizontal`
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/628
Updated: 2020-03-26T19:27:10Z · Author: Timo Koch <timokoch@math.uio.no>

On my system with macOS and clang, the test produces completely different results, not just tiny differences. For example, in the darcy domain I have a mixture of 30% CO2 and 70% H2 (in some parts of the domain the other way around), while in the reference there is almost 100% H2.
[test_md_boundary_darcy1p3c_stokes1p3c_horizontal_darcy-00015.vtu](/uploads/841810caef76db6400770806c8caba29/test_md_boundary_darcy1p3c_stokes1p3c_horizontal_darcy-00015.vtu)
[test_md_boundary_darcy1p3c_stokes1p3c_horizontal_stokes-00015.vtu](/uploads/25d61113f8d4f30ccc3bcbc72fe9b22a/test_md_boundary_darcy1p3c_stokes1p3c_horizontal_stokes-00015.vtu)
@heck, @kweis Can you explain those results? Did you see something similar?

Milestone: 3.2 · Assignee: Timo Koch

## #629: Box dfm tests fail with dune master
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/629
Updated: 2018-12-04T14:05:56Z · Author: Timo Koch <timokoch@math.uio.no>

See BuildBot.

Milestone: 3.0 · Assignee: Dennis Gläser

## #638: test_md_facet_1p1p_box_convergence fails with clang
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/638
Updated: 2018-12-19T15:30:47Z · Author: Timo Koch <timokoch@math.uio.no>

Milestone: 3.0 · Assignee: Dennis Gläser

## #640: A few tests fail without a direct solver present
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/640
Updated: 2018-12-20T16:10:15Z · Author: Timo Koch <timokoch@math.uio.no>

It could be the case that AMG uses a different smoother in that case. The tests can probably be made more stable by decreasing the Newton tolerance. See buildbot's minimal build setups.

## #641: test_2p_incompressible_tpfa_oilwet fails with clang 6.0.0
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/641
Updated: 2019-02-27T14:39:32Z · Author: Simon Emmert

The test fails for me when I build with clang 6.0.0 on Ubuntu.
We should think of an improved test for the future.

```
239: Data differs in parameter: S_aq
239: Difference is too large: 1.73% -> between: 0.259405 and 0.254914
239: Info for S_aq: max_abs_parameter_value=1.0 and min_abs_parameter_value=0.226833.
239:
239: Data differs in parameter: S_napl
239: Difference is too large: 100.00% -> between: 9.99547e-06 and 0.0
239: Info for S_napl: max_abs_parameter_value=0.773167 and min_abs_parameter_value=0.0.
239:
239: Data differs in parameter: mob_aq
239: Difference is too large: 3.16% -> between: 114.914 and 111.283
239: Info for mob_aq: max_abs_parameter_value=1000.0 and min_abs_parameter_value=10.849.
239:
239: Data differs in parameter: mob_napl
239: Difference is too large: 99.50% -> between: 0.00189577 and 9.41958e-06
239: Info for mob_napl: max_abs_parameter_value=1157.57 and min_abs_parameter_value=0.0.
239:
239: Data differs in parameter: pc
239: Difference is too large: 99.55% -> between: 0.000934299 and 4.21307e-06
239: Info for pc: max_abs_parameter_value=2529.62 and min_abs_parameter_value=0.0.
1/1 Test #239: test_2p_incompressible_tpfa_oilwet ...***Failed 1.63 sec
```

## #642: High residuals in poromechanics test el1p/el2p
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/642
Updated: 2019-01-02T14:52:45Z · Author: Timo Koch <timokoch@math.uio.no>

Milestone: 3.1 · Assignee: Dennis Gläser

## #667: Failing tests due to !1507
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/667
Updated: 2019-02-28T15:14:00Z · Author: Timo Koch <timokoch@math.uio.no>

The MR !1507 changed the input file which is used by several tests but only adjusted one reference solution. Instead, the parameter should be adjusted in the CMakeLists for that specific test only.

Milestone: 3.1

## #673: Tracer box facet multidomain test fails. Make more robust
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/673
Updated: 2019-03-21T13:18:47Z · Author: Timo Koch <timokoch@math.uio.no>

```
Fuzzy comparison...
Comparing /data/src/dumux/test/references/test_md_facet_tracertracer_box_tracer_bulk-reference.vtu and /data/build/dumux/test/multidomain/facet/tracer_tracer/test_md_facet_tracertracer_box_tracer_bulk-00010.vtu
... with a maximum relative error of 0.01 and a maximum absolute error of 1.5e-07*max_abs_parameter_value.
Data differs in parameter: X^tracer_0
Difference is too large: 103.05% -> between: 6.35046e-09 and -1.93402e-10
Info for X^tracer_0: max_abs_parameter_value=0.00357894 and min_abs_parameter_value=0.0.
Data differs in parameter: x^tracer_0
Difference is too large: 103.05% -> between: 3.81028e-07 and -1.16041e-08
Info for x^tracer_0: max_abs_parameter_value=0.214736 and min_abs_parameter_value=0.0.
Fuzzy comparison...
Comparing /data/src/dumux/test/references/test_md_facet_tracertracer_box_tracer_lowdim-reference.vtp and /data/build/dumux/test/multidomain/facet/tracer_tracer/test_md_facet_tracertracer_box_tracer_lowdim-00010.vtp
... with a maximum relative error of 0.01 and a maximum absolute error of 1.5e-07*max_abs_parameter_value.
Data differs in parameter: X^tracer_0
Difference is too large: 125.11% -> between: 1.44095e-07 and -5.73891e-07
Info for X^tracer_0: max_abs_parameter_value=0.000397662 and min_abs_parameter_value=0.0.
Data differs in parameter: x^tracer_0
Difference is too large: 125.11% -> between: 8.64572e-06 and -3.44334e-05
Info for x^tracer_0: max_abs_parameter_value=0.0238597 and min_abs_parameter_value=0.0.
Test #58: test_md_facet_tracertracer_box .............................***Failed 10.39 sec
```

Milestone: 3.1 · Assignee: Dennis Gläser
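For context: the percentages in these failure reports come from the fuzzy comparison in runtest.py. A minimal sketch of the criterion the quoted numbers are consistent with (my reconstruction, not the actual script, which lives in dumux's bin/testing): a pair of values counts as different only if the absolute difference exceeds 1.5e-07 times the field's maximum absolute value and the difference relative to the larger magnitude exceeds the relative tolerance of 0.01.

```python
def fuzzy_diff(a, b, max_abs_param, rel_tol=0.01, abs_tol_factor=1.5e-7):
    """Return the relative difference between a and b if it exceeds both
    tolerances, else None. Reconstruction of the criterion implied by the
    log output; not the real dumux implementation."""
    abs_diff = abs(a - b)
    # absolute floor: ignore differences tiny compared to the field's values
    if abs_diff <= abs_tol_factor * max_abs_param:
        return None
    # difference relative to the larger magnitude of the two values
    rel_diff = abs_diff / max(abs(a), abs(b))
    return rel_diff if rel_diff > rel_tol else None
```

Plugging in the X^tracer_0 pair from the log (6.35046e-09 vs. -1.93402e-10, with max_abs_parameter_value 0.00357894) yields a relative difference of about 1.03, matching the reported 103.05%.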
## #674: Test make geometry fails sporadically
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/674
Updated: 2019-10-02T14:58:43Z · Author: Timo Koch <timokoch@math.uio.no>

From time to time the make geometry test fails. It does so only sporadically, so there seems to be some random component.

```
testing for non axis-aligned quadrilateral
testing for quadrilateral with normal in z direction
testing for quadrilateral with normal in y direction
testing for quadrilateral with normal in x direction
Dune::InvalidStateException [permutatePointsAndTest:/data/src/dumux/test/common/geometry/test_makegeometry.cc:69]: Area of quadrilateral after permuation of input points is wrong
Test #9: test_makegeometry ................***Failed 0.23 sec
```

Milestone: 3.1 · Assignee: Kilian Weishaupt

## #678: [parallel/amg] Some parallel or amg tests fail with dune master
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/678
Updated: 2019-06-25T12:30:48Z · Author: Timo Koch <timokoch@math.uio.no>

The same tests pass with dune 2.6.

## #685: Appl lens2pexercise3 in dumux-lecture fails after commit updating the timeloop
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/685
Updated: 2019-04-06T11:01:54Z · Author: Timo Koch <timokoch@math.uio.no>

Possible bug in the timeloop for large time step sizes? Fails since 2fbb615bc761c7c3fa860ced5901204eb15b5975.

Milestone: 3.1

## #695: Failing non-equil tests after merging !1496
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/695
Updated: 2019-05-04T13:49:31Z · Author: Timo Koch <timokoch@math.uio.no>

After merging !1496, `test_1p2c_nonequilibrium_tpfa` and `test_mpnc_kinetic_box` now fail on buildbot.
In the case of `test_1p2c_nonequilibrium_tpfa`, an fv geometry cache is accessed which is not in the stencil.
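The assertion that fires in the log ("Could not find the scv/scvf! Make sure to properly bind this class!") is the usual bind-before-access contract of element-local views: lookups are valid only for entities collected while binding the current element. A toy Python model of that invariant (my illustration, not DuMuX code):

```python
class LocalView:
    """Toy model of an element-local cache: lookups are valid only for
    indices collected during bind(), mirroring the contract behind
    'Make sure to properly bind this class!'."""
    def __init__(self):
        self._indices = None

    def bind(self, stencil):
        # collect the grid indices belonging to the current element's stencil
        self._indices = list(stencil)

    def local_index(self, grid_index):
        assert self._indices is not None and grid_index in self._indices, \
            "Could not find the scv/scvf! Make sure to properly bind this class!"
        return self._indices.index(grid_index)
```

The crash in this test then corresponds to asking such a view for a grid index that was never part of the bound stencil.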
```
Test command: /data/src/dumux/bin/testing/runtest.py "--script" "fuzzy" "--files" "/data/src/dumux/test/references/test_1p2c_nonequilibrium_tpfa-reference.vtu" "/data/build/dumux/test/porousmediumflow/1pnc/implicit/nonequilibrium/test_1p2c_nonequilibrium_tpfa-00044.vtu" "--command" "/data/build/dumux/test/porousmediumflow/1pnc/implicit/nonequilibrium/test_1p2c_nonequilibrium_tpfa params.input -Problem.Name test_1p2c_nonequilibrium_tpfa" "--zeroThreshold" "{"velocity_liq (m/s)_1":1e-15}"
Test timeout computed to be: 300
You idiot! You signed the order to destroy Earth!
- Douglas Adams, HGttG
Reading parameters from file params.input.
The H2O-N2 fluid system was configured with the following policy:
- use H2O density as liquid mixture density: true
- use ideal gas density: true
- use N2 viscosity as gas mixture viscosity: true
- use N2 heat conductivity as gas mixture heat conductivity: true
- use ideal gas heat capacities: true
-------------------------------------------------------------------------
Initializing tables for the H2O fluid properties (20000 entries).
Temperature -> min: 2.731e+02, max: 6.231e+02, n: 100
Pressure -> min: 0.000e+00, max: 2.000e+07, n: 200
-------------------------------------------------------------------------
problem uses mole fractions
Writing output for problem "test_1p2c_nonequilibrium_tpfa". Took 8.889e-01 seconds.
Assemble: r(x^k) = dS/dt + div F - q; M = grad rtest_1p2c_nonequilibrium_tpfa: /data/src/dumux/dumux/discretization/cellcentered/tpfa/fvelementgeometry.hh:536: const LocalIndexType Dumux::CCTpfaFVElementGeometry::findLocalIndex(Dumux::CCTpfaFVElementGeometry::GridIndexType, const std::vector::GridIndex>&) const [with GG = Dumux::CCTpfaFVGridGeometry > >, false, Dumux::CCTpfaDefaultGridGeometryTraits > >, Dumux::DefaultMapperTraits > >, Dune::MultipleCodimMultipleGeomTypeMapper > >, Dune::Impl::MCMGFailLayout>, Dune::MultipleCodimMultipleGeomTypeMapper > >, Dune::Impl::MCMGFailLayout> > > >; Dumux::CCTpfaFVElementGeometry::LocalIndexType = unsigned int; Dumux::CCTpfaFVElementGeometry::GridIndexType = unsigned int; typename Dumux::IndexTraits::GridIndex = unsigned int]: Assertion `it != indices.end() && "Could not find the scv/scvf! Make sure to properly bind this class!"' failed.
[behandla:19637] *** Process received signal ***
[behandla:19637] Signal: Aborted (6)
[behandla:19637] Signal code: (-6)
[behandla:19637] [ 0] /lib/x86_64-linux-gnu/libpthread.so.0(+0x12890)[0x7f3fc9448890]
[behandla:19637] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0xc7)[0x7f3fc9083e97]
[behandla:19637] [ 2] /lib/x86_64-linux-gnu/libc.so.6(abort+0x141)[0x7f3fc9085801]
[behandla:19637] [ 3] /lib/x86_64-linux-gnu/libc.so.6(+0x3039a)[0x7f3fc907539a]
[behandla:19637] [ 4] /lib/x86_64-linux-gnu/libc.so.6(+0x30412)[0x7f3fc9075412]
[behandla:19637] [ 5] /data/build/dumux/test/porousmediumflow/1pnc/implicit/nonequilibrium/test_1p2c_nonequilibrium_tpfa(+0xd0faf)[0x55c4b4287faf]
[behandla:19637] [ 6] /data/build/dumux/test/porousmediumflow/1pnc/implicit/nonequilibrium/test_1p2c_nonequilibrium_tpfa(_ZNK5Dumux41NonEquilibriumLocalResidualImplementationINS_10Properties4TTag35OnePTwoCThermalNonequilibriumCCTpfaELb0EE11computeFluxERKNS_36OnePTwoCThermalNonequilibriumProblemIS3_EERKN4Dune6EntityILi0ELi2EKNS9_8YaspGridILi2ENS9_22EquidistantCoordinatesIdLi2EEEEENS9_10YaspEntityEEERKNS_23CCTpfaFVElementGeometryINS_20CCTpfaFVGridGeometryINS9_8GridViewINS9_25DefaultLeafGridViewTraitsISF_EEEELb0ENS_31CCTpfaDefaultGridGeometryTraitsISP_NS_19DefaultMapperTraitsISP_NS9_35MultipleCodimMultipleGeomTypeMapperISP_NS9_4Impl14MCMGFailLayoutEEESV_EEEEEELb0EEERKNS_28CCTpfaElementVolumeVariablesINS_21CCGridVolumeVariablesINS_38CCTpfaDefaultGridVolumeVariablesTraitsIS6_NS_43NonEquilibriumVolumeVariablesImplementationINS_27OnePNCVolumeVariablesTraitsINS9_11FieldVectorIdLi4EEENS_12FluidSystems11OnePAdapterINS19_5H2ON2IdNS19_18H2ON2DefaultPolicyILb1EEEEELi0EEENS_24NonEquilibriumFluidStateIdS1F_EENS_12SolidSystems15InertSolidPhaseIdNS_10Components8ConstantILi1EdEEEENS_15InertSolidStateIdS1N_EEdNS_25NonEquilibriumModelTraitsINS_30OnePNCUnconstrainedModelTraitsINS_17OnePNCModelTraitsILi2ELb1ELi2EEEEELb0ELb1ELi1ELi1ELNS_18NusseltFormulationE1ELNS_19SherwoodFormulationE0EEEEENS_21OnePNCVolumeVariablesIS1Y_EELb0ELb1ELi1EEEEELb0EEELb0EEERKNS_26CCTpfaSubControlVolumeFaceISP_NS_31CCTpfaDefaultScvfGeometryTraitsISP_EEEERKNS_31CCTpfaElementFluxVariablesCacheINS_28CCTpfaGridFluxVariablesCacheIS6_NS_44PorousMediumFluxVariablesCacheImplementationIS3_LNS_20DiscretizationMethodE2EEENS_50PorousMediumFluxVariablesCacheFillerImplementationIS3_LS2G_2EEELb0ENS_26CCTpfaDefaultGridFVCTraitsIS6_S2H_S2J_EEEELb0EEE+0x1713)[0x55c4b431b873]
[behandla:19637] [ 7] /data/build/dumux/test/porousmediumflow/1pnc/implicit/nonequilibrium/test_1p2c_nonequilibrium_tpfa(_ZNK5Dumux15CCLocalResidualINS_10Properties4TTag35OnePTwoCThermalNonequilibriumCCTpfaEE8evalFluxERKNS_36OnePTwoCThermalNonequilibriumProblemIS3_EERKN4Dune6EntityILi0ELi2EKNS9_8YaspGridILi2ENS9_22EquidistantCoordinatesIdLi2EEEEENS9_10YaspEntityEEERKNS_23CCTpfaFVElementGeometryINS_20CCTpfaFVGridGeometryINS9_8GridViewINS9_25DefaultLeafGridViewTraitsISF_EEEELb0ENS_31CCTpfaDefaultGridGeometryTraitsISP_NS_19DefaultMapperTraitsISP_NS9_35MultipleCodimMultipleGeomTypeMapperISP_NS9_4Impl14MCMGFailLayoutEEESV_EEEEEELb0EEERKNS_28CCTpfaElementVolumeVariablesINS_21CCGridVolumeVariablesINS_38CCTpfaDefaultGridVolumeVariablesTraitsIS6_NS_43NonEquilibriumVolumeVariablesImplementationINS_27OnePNCVolumeVariablesTraitsINS9_11FieldVectorIdLi4EEENS_12FluidSystems11OnePAdapterINS19_5H2ON2IdNS19_18H2ON2DefaultPolicyILb1EEEEELi0EEENS_24NonEquilibriumFluidStateIdS1F_EENS_12SolidSystems15InertSolidPhaseIdNS_10Components8ConstantILi1EdEEEENS_15InertSolidStateIdS1N_EEdNS_25NonEquilibriumModelTraitsINS_30OnePNCUnconstrainedModelTraitsINS_17OnePNCModelTraitsILi2ELb1ELi2EEEEELb0ELb1ELi1ELi1ELNS_18NusseltFormulationE1ELNS_19SherwoodFormulationE0EEEEENS_21OnePNCVolumeVariablesIS1Y_EELb0ELb1ELi1EEEEELb0EEELb0EEERKNS_31CCTpfaElementFluxVariablesCacheINS_28CCTpfaGridFluxVariablesCacheIS6_NS_44PorousMediumFluxVariablesCacheImplementationIS3_LNS_20DiscretizationMethodE2EEENS_50PorousMediumFluxVariablesCacheFillerImplementationIS3_LS2A_2EEELb0ENS_26CCTpfaDefaultGridFVCTraitsIS6_S2B_S2D_EEEELb0EEERKNS_26CCTpfaSubControlVolumeFaceISP_NS_31CCTpfaDefaultScvfGeometryTraitsISP_EEEE+0x2e3)[0x55c4b431bf03]
[behandla:19637] [ 8] /data/build/dumux/test/porousmediumflow/1pnc/implicit/nonequilibrium/test_1p2c_nonequilibrium_tpfa(_ZNK5Dumux15FVLocalResidualINS_10Properties4TTag35OnePTwoCThermalNonequilibriumCCTpfaEE17evalFluxAndSourceERKN4Dune6EntityILi0ELi2EKNS5_8YaspGridILi2ENS5_22EquidistantCoordinatesIdLi2EEEEENS5_10YaspEntityEEERKNS_23CCTpfaFVElementGeometryINS_20CCTpfaFVGridGeometryINS5_8GridViewINS5_25DefaultLeafGridViewTraitsISB_EEEELb0ENS_31CCTpfaDefaultGridGeometryTraitsISL_NS_19DefaultMapperTraitsISL_NS5_35MultipleCodimMultipleGeomTypeMapperISL_NS5_4Impl14MCMGFailLayoutEEESR_EEEEEELb0EEERKNS_28CCTpfaElementVolumeVariablesINS_21CCGridVolumeVariablesINS_38CCTpfaDefaultGridVolumeVariablesTraitsINS_36OnePTwoCThermalNonequilibriumProblemIS3_EENS_43NonEquilibriumVolumeVariablesImplementationINS_27OnePNCVolumeVariablesTraitsINS5_11FieldVectorIdLi4EEENS_12FluidSystems11OnePAdapterINS17_5H2ON2IdNS17_18H2ON2DefaultPolicyILb1EEEEELi0EEENS_24NonEquilibriumFluidStateIdS1D_EENS_12SolidSystems15InertSolidPhaseIdNS_10Components8ConstantILi1EdEEEENS_15InertSolidStateIdS1L_EEdNS_25NonEquilibriumModelTraitsINS_30OnePNCUnconstrainedModelTraitsINS_17OnePNCModelTraitsILi2ELb1ELi2EEEEELb0ELb1ELi1ELi1ELNS_18NusseltFormulationE1ELNS_19SherwoodFormulationE0EEEEENS_21OnePNCVolumeVariablesIS1W_EELb0ELb1ELi1EEEEELb0EEELb0EEERKNS_31CCTpfaElementFluxVariablesCacheINS_28CCTpfaGridFluxVariablesCacheIS12_NS_44PorousMediumFluxVariablesCacheImplementationIS3_LNS_20DiscretizationMethodE2EEENS_50PorousMediumFluxVariablesCacheFillerImplementationIS3_LS28_2EEELb0ENS_26CCTpfaDefaultGridFVCTraitsIS12_S29_S2B_EEEELb0EEERKNS_22CCElementBoundaryTypesE+0x341)[0x55c4b431c3e1]
[behandla:19637] [ 9] /data/build/dumux/test/porousmediumflow/1pnc/implicit/nonequilibrium/test_1p2c_nonequilibrium_tpfa(_ZNK5Dumux20FVLocalAssemblerBaseINS_10Properties4TTag35OnePTwoCThermalNonequilibriumCCTpfaENS_11FVAssemblerIS3_LNS_10DiffMethodE0ELb1EEENS_16CCLocalAssemblerIS3_S6_LS5_0ELb1EEELb1EE17evalLocalResidualEv+0xd6)[0x55c4b431ccb6]
[behandla:19637] [10] /data/build/dumux/test/porousmediumflow/1pnc/implicit/nonequilibrium/test_1p2c_nonequilibrium_tpfa(_ZN5Dumux16CCLocalAssemblerINS_10Properties4TTag35OnePTwoCThermalNonequilibriumCCTpfaENS_11FVAssemblerIS3_LNS_10DiffMethodE0ELb1EEELS5_0ELb1EE31assembleJacobianAndResidualImplERN4Dune10BCRSMatrixINS8_11FieldMatrixIdLi4ELi4EEESaISB_EEERNS_27NonEquilibriumGridVariablesIS3_EE+0x74d)[0x55c4b433ef4d]
[behandla:19637] [11] /data/build/dumux/test/porousmediumflow/1pnc/implicit/nonequilibrium/test_1p2c_nonequilibrium_tpfa(_ZNK5Dumux11FVAssemblerINS_10Properties4TTag35OnePTwoCThermalNonequilibriumCCTpfaELNS_10DiffMethodE0ELb1EE9assemble_IZNS5_27assembleJacobianAndResidualINS_18PartialReassemblerIS5_EEEEvRKN4Dune11BlockVectorINSA_11FieldVectorIdLi4EEESaISD_EEEPKT_EUlRKNSA_6EntityILi0ELi2EKNSA_8YaspGridILi2ENSA_22EquidistantCoordinatesIdLi2EEEEENSA_10YaspEntityEEEE_EEvOSI_+0x132)[0x55c4b4340b52]
[behandla:19637] [12] /data/build/dumux/test/porousmediumflow/1pnc/implicit/nonequilibrium/test_1p2c_nonequilibrium_tpfa(main+0xeb3)[0x55c4b4282c53]
[behandla:19637] [13] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xe7)[0x7f3fc9066b97]
[behandla:19637] [14] /data/build/dumux/test/porousmediumflow/1pnc/implicit/nonequilibrium/test_1p2c_nonequilibrium_tpfa(_start+0x2a)[0x55c4b4286a2a]
[behandla:19637] *** End of error message ***
63/186 Test #63: test_1p2c_nonequilibrium_tpfa .....................***Failed 1.82 sec
```
```
Test command: /data/src/dumux/bin/testing/runtest.py "--script" "fuzzy" "--files" "/data/src/dumux/test/references/test_mpnc_kinetic_box-reference.vtu" "/data/build/dumux/test/porousmediumflow/mpnc/implicit/kinetic/test_mpnc_kinetic_box-00011.vtu" "--command" "/data/build/dumux/test/porousmediumflow/mpnc/implicit/kinetic/test_mpnc_kinetic_box params.input -Problem.Name test_mpnc_kinetic_box"
Test timeout computed to be: 300
Chuck Norris has successfully compiled DuMuX.
Reading parameters from file params.input.
The H2O-N2 fluid system was configured with the following policy:
- use H2O density as liquid mixture density: true
- use ideal gas density: true
- use N2 viscosity as gas mixture viscosity: true
- use N2 heat conductivity as gas mixture heat conductivity: true
- use ideal gas heat capacities: true
-------------------------------------------------------------------------
Initializing tables for the H2O fluid properties (10000 entries).
Temperature -> min: 2.780e+02, max: 4.531e+02, n: 100
Pressure -> min: 7.500e+04, max: 2.250e+05, n: 100
-------------------------------------------------------------------------
Writing output for problem "test_mpnc_kinetic_box". Took 6.039e-01 seconds.
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble/solve/update time: 1.278e+00(4.768e+01%)/1.402e+00(5.231e+01%)/3.405e-04(1.270e-02%)
Writing output for problem "test_mpnc_kinetic_box". Took 4.812e-02 seconds.
[ 1%] Time step 1 done in 2.736643 seconds. Wall clock time: 2.785, time: 0.05000, time step size: 0.05000000
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble/solve/update time: 0.95099143(48.46939834%)/1.01091940(51.52376100%)/0.00013422(0.00684067%)
Writing output for problem "test_mpnc_kinetic_box". Took 0.04853347 seconds.
[ 1%] Time step 2 done in 2.051791 seconds. Wall clock time: 4.837, time: 0.12500, time step size: 0.07500000
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble/solve/update time: 0.94914820(48.59345643%)/1.00394319(51.39879041%)/0.00015144(0.00775316%)
Writing output for problem "test_mpnc_kinetic_box". Took 0.04807224 seconds.
[ 2%] Time step 3 done in 2.043512 seconds. Wall clock time: 6.880, time: 0.24375, time step size: 0.11875000
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble/solve/update time: 0.95115055(48.39749146%)/1.01398697(51.59480356%)/0.00015143(0.00770497%)
Writing output for problem "test_mpnc_kinetic_box". Took 0.04845110 seconds.
[ 4%] Time step 4 done in 2.055532 seconds. Wall clock time: 8.936, time: 0.43177, time step size: 0.18802083
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble/solve/update time: 0.95967631(48.69862390%)/1.01083155(51.29448896%)/0.00013572(0.00688714%)
Writing output for problem "test_mpnc_kinetic_box". Took 0.04814657 seconds.
[ 7%] Time step 5 done in 2.060660 seconds. Wall clock time: 10.996, time: 0.72947, time step size: 0.29769965
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble/solve/update time: 1.26521772(48.89266931%)/1.32233187(51.09977040%)/0.00019564(0.00756029%)
Writing output for problem "test_mpnc_kinetic_box". Took 0.04806234 seconds.
[ 12%] Time step 6 done in 2.691426 seconds. Wall clock time: 13.688, time: 1.20083, time step size: 0.47135778
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble/solve/update time: 1.27771621(48.82405213%)/1.33907903(51.16884610%)/0.00018585(0.00710177%)
Writing output for problem "test_mpnc_kinetic_box". Took 0.04817870 seconds.
[ 19%] Time step 7 done in 2.720932 seconds. Wall clock time: 16.409, time: 1.90786, time step size: 0.70703668
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble/solve/update time: 1.27635560(48.32649627%)/1.36457241(51.66663862%)/0.00018132(0.00686511%)
Writing output for problem "test_mpnc_kinetic_box". Took 0.04837157 seconds.
[ 30%] Time step 8 done in 2.745505 seconds. Wall clock time: 19.154, time: 2.96842, time step size: 1.06055501
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble/solve/update time: 1.28703688(48.70908916%)/1.35506024(51.28349543%)/0.00019594(0.00741542%)
Writing output for problem "test_mpnc_kinetic_box". Took 0.04841897 seconds.
[ 46%] Time step 9 done in 2.747992 seconds. Wall clock time: 21.902, time: 4.55925, time step size: 1.59083252
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble/solve/update time: 1.27011454(48.08293266%)/1.37121299(51.91023325%)/0.00018052(0.00683409%)
Writing output for problem "test_mpnc_kinetic_box". Took 0.04838746 seconds.
[ 69%] Time step 10 done in 2.745616 seconds. Wall clock time: 24.648, time: 6.94550, time step size: 2.38624878
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble: r(x^k) = dS/dt + div F - q; M = grad r
Assemble/solve/update time: 1.27372658(48.61208584%)/1.34627708(51.38099319%)/0.00018134(0.00692096%)
Writing output for problem "test_mpnc_kinetic_box". Took 0.04840041 seconds.
[100%] Time step 11 done in 2.725102 seconds. Wall clock time: 27.373, time: 10.00000, time step size: 3.05449874
Simulation took 27.37315323 seconds on 1 processes.
The cumulative CPU time was 27.37315323 seconds.
# Runtime-specified parameters used:
[ BoundaryConditions ]
TInject = "293"
massFluxInjectedPhase = "0.75"
percentOfEquil = ".1"
[ Component ]
SolidDensity = "2600"
SolidHeatCapacity = "817"
SolidThermalConductivity = "3"
[ FluidSystem ]
nPressure = "100"
nTemperature = "100"
[ Grid ]
Cells0 = "14"
Cells1 = "15 15"
Grading0 = "1.0"
Grading1 = "-0.833333 0.833333"
Positions0 = "0.0 1.0"
Positions1 = "0.0 0.25 0.5"
[ InitialConditions ]
SwFFInitial = "1e-4"
SwPMInitial = "0.8"
TInitial = "293"
pnInitial = "1e5"
pnInjection = "100003"
[ Problem ]
Name = "test_mpnc_kinetic_box"
[ SpatialParams ]
[ SpatialParams.FreeFlow ]
meanPoreSize = "1e-2"
permeability = "1e-6"
porosity = "0.99"
[ SpatialParams.PorousMedium ]
factorEnergyTransfer = "1"
factorMassTransfer = "1"
meanPoreSize = "1e-4"
permeability = "1e-11"
porosity = "0.4"
[ SpatialParams.soil ]
BCPd = "2.290e+03"
BClambda = "2.740e+00"
Snr = "0"
Swr = "0"
aNonWettingSolidA1 = "1.369e+03"
aNonWettingSolidA2 = "-3.782e+00"
aNonWettingSolidA3 = "1.063e-09"
aWettingNonWettingA1 = "-1.603e-01"
aWettingNonWettingA2 = "1.429e-05"
aWettingNonWettingA3 = "1.915e-01"
[ TimeLoop ]
DtInitial = "0.05"
TEnd = "10"
[ Vtk ]
AddVelocity = "1"
# Global default parameters used:
[ Assembly ]
NumericDifferenceMethod = "1"
[ Flux ]
UpwindWeight = "1.0"
[ LinearSolver ]
MaxIterations = "250"
PreconditionerIterations = "1"
PreconditionerRelaxation = "1.0"
ResidualReduction = "1e-13"
Verbosity = "0"
[ Newton ]
EnableAbsoluteResidualCriterion = "false"
EnableChop = "false"
EnablePartialReassembly = "false"
EnableResidualCriterion = "false"
EnableShiftCriterion = "true"
MaxAbsoluteResidual = "1e-5"
MaxRelativeShift = "1e-8"
MaxSteps = "18"
ResidualReduction = "1e-5"
SatisfyResidualAndShiftCriterion = "false"
TargetSteps = "10"
UseLineSearch = "false"
[ Problem ]
EnableGravity = "true"
[ TimeLoop ]
MaxTimeStepSize = "1e300"
[ Vtk ]
AddProcessRank = "true"
# Unused parameters:
SpatialParams.soil.VGAlpha = "3.512e-04"
SpatialParams.soil.VGN = "4.716e+00"
SpatialParams.soil.specificSolidsurface = "4022.994"
SourceSink.heatIntoSolid = "0"
Constants.nRestart = "100"
FluidSystem.hammer = "1e4"
Chuck Norris has compiled DuMuX even two times in a row!
Fuzzy comparison...
Comparing /data/src/dumux/test/references/test_mpnc_kinetic_box-reference.vtu and /data/build/dumux/test/porousmediumflow/mpnc/implicit/kinetic/test_mpnc_kinetic_box-00011.vtu
... with a maximum relative error of 0.01 and a maximum absolute error of 1.5e-07*max_abs_parameter_value.
Data differs in parameter: velocity_gas (m/s)_1
Difference is too large: 8.70% -> between: 1.91128e-06 and 1.74495e-06
Info for velocity_gas (m/s)_1: max_abs_parameter_value=0.564606 and min_abs_parameter_value=1.74495e-06.
Data differs in parameter: velocity_liq (m/s)_0
Difference is too large: 1.00% -> between: 6.90209e-08 and 6.97191e-08
Info for velocity_liq (m/s)_0: max_abs_parameter_value=7.2564e-08 and min_abs_parameter_value=8.35171e-16.
Data differs in parameter: x^H2O_gas
Difference is too large: 55.46% -> between: 0.023197 and 0.0103322
Info for x^H2O_gas: max_abs_parameter_value=0.0232059 and min_abs_parameter_value=0.00232059.
Data differs in parameter: x^N2_gas
Difference is too large: 1.30% -> between: 0.976803 and 0.989668
Info for x^N2_gas: max_abs_parameter_value=0.997679 and min_abs_parameter_value=0.976794.
Test #155: test_mpnc_kinetic_box .............................***Failed 29.19 sec
```
Milestone: 3.1. Assignee: Katharina Heck.

Issue #709: 2p2c water-air and fuelcell/remediation tests fail due to !1615 (2019-05-28, Simon Emmert)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/709

The following tests fail on buildbot due to the changes in the viscosity calculation from !1615
dumux (some have a singular matrix and do not converge at all)
```
119 - test_2p2cni_waterair_box (Failed)
120 - test_2p2cni_waterair_buoyancy_box (Failed)
121 - test_2p2cni_waterair_tpfa (Failed)
```
dumux-lecture (here only the reference solution does not match anymore)
```
11 - fuelcell (Failed)
22 - remediationscenariosexercise (Failed)
```

Issue #716: test adaptive2p2c3d fails (2019-06-17, Simon Emmert)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/716

The test fails on buildbot; we will document and investigate why here.
The fluid state was adapted, and the reference solution changed accordingly. Apparently the test is very sensitive (which makes sense) and gives different solutions on different machines (compilers). I will try to test it on other machines and see if I can find anything systematic.

Milestone: 3.1.

Issue #720: Merge of feature/improve-parameters breaks tests (2019-06-04, Kilian Weishaupt)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/720

After the merge of 5c1c052467bbe86f903dac3f1de4990c41a03320, at least one test (test_ff_stokes_channel_3d) fails
because the output file changed its name from
`test_ff_stokes_channel_3d-00001.vtu` to `test_stokes_channel_3d-00001.vtu`
`CMakeLists.txt` explicitly states
```
-Problem.Name test_ff_stokes_channel_3d
```
but somehow this is ignored and the name given in `params.input` is used.

Milestone: 3.1. Assignee: Timo Koch (timokoch@math.uio.no).

Issue #731: sincos-test fails because numpy is not available on buildbot (2019-06-27, Simon Emmert)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/731
**Bug report**
The `test_ff_navierstokes_sincos` fails (only!) on buildbot
**What happened / Problem description**:
The `convergencetest.py` imports numpy, but numpy is not available on buildbot. We should try to get rid of numpy in this case. I did not look into the files yet, but the other convergence tests do not need numpy and were designed to work without it.
According to @timok installing numpy on buildbot is not wanted because it is quite large.
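Observed convergence rates can be computed with the standard library alone; a numpy-free sketch (a hypothetical helper, not the actual `convergencetest.py` code):

```python
import math

def convergence_rates(h, errors):
    """Observed order p assuming errors e_i ~ C * h_i^p on successive grids."""
    pairs = zip(zip(h, errors), zip(h[1:], errors[1:]))
    return [math.log(e0 / e1) / math.log(h0 / h1)
            for (h0, e0), (h1, e1) in pairs]

# Halving the mesh width with a second-order scheme gives rates close to 2.
rates = convergence_rates([0.1, 0.05, 0.025], [1e-2, 2.5e-3, 6.25e-4])
print(rates)
```

Plain lists and `math.log` are enough here, so the dependency really only buys convenience.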
**Environment**:
- Dune version: master
- DuMux version: master
- Others: gcc, Python version: 2.7.9

Milestone: 3.1.

Issue #732: test_ff_stokes_channel_3d fails with different velocity (2019-07-01, Simon Emmert)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/732
**Bug report**
`test_ff_stokes_channel_3d` fails with different velocity
**What happened / Problem description**:
!1637 changed the handling of inflow/outflow BC and adapted the reference solution, yet the test fails on buildbot with
```
Data differs in parameter: velocity_liq (m/s)_1
Difference is too large: 199.76% -> between: -1.65925e-12 and 1.6632e-12
Info for velocity_liq (m/s)_1: max_abs_parameter_value=6.83831e-12 and min_abs_parameter_value=0.0.
For parameter velocity_liq (m/s)_1 a zero value threshold of 1e-12 was given.
Data differs in parameter: velocity_liq (m/s)_2
Difference is too large: 127.12% -> between: -4.17537e-13 and 1.53955e-12
Info for velocity_liq (m/s)_2: max_abs_parameter_value=1.89138e-12 and min_abs_parameter_value=0.0.
For parameter velocity_liq (m/s)_2 a zero value threshold of 1e-12 was given.
Test #17: test_ff_stokes_channel_3d ...............................***Failed 1.62 sec
```
**How to reproduce it (as minimally and precisely as possible)**:
Dune and DuMuX master, compiling and running the test.
**Anything else we need to know?**:
Maybe @melaniel can state the version & compiler details she used when generating the reference. I will be trying to reproduce a result tomorrow morning, too. Maybe it is compiler/machine dependent.
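For values at this magnitude the reported percentages are expected numerically; a sketch of the (assumed) relative-difference metric, which reproduces the figure from the log:

```python
def relative_difference(a, b):
    # Assumed metric of the fuzzy comparison: |a - b| scaled by the
    # larger magnitude of the two compared values.
    return abs(a - b) / max(abs(a), abs(b))

# The two reported velocity values are at noise level (~1e-12), so the
# relative difference is huge by construction:
percent = 100.0 * relative_difference(-1.65925e-12, 1.6632e-12)
print(round(percent, 2))  # close to the reported 199.76%
```

Both values sit right at the 1e-12 zero-value threshold, so the threshold may simply be marginally too tight for this machine/compiler combination.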
**Environment**:
- Dune version: master
- DuMux version: master
- Others: gcc 5.3 & gcc 7.4.4 (but with vel_1: 196.66% and vel_2: 135.84% difference)

Milestone: 3.1.

Issue #783: [test] test_ff_stokes_channel_neumann_x_dirichlet_y sometimes fails (2019-11-12, Timo Koch)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/783

The test fails with the following error:
```
Fuzzy comparison...
Traceback (most recent call last):
File "/data/src/dumux/bin/testing/runtest.py", line 63, in
result = compare_vtk(args['files'][i*2], args['files'][(i*2)+1], relative=args['relative'], absolute=args['absolute'], zeroValueThreshold=args['zeroThreshold'])
File "/data/src/dumux/bin/testing/fuzzycomparevtu.py", line 52, in compare_vtk
root2 = ET.fromstring(open(vtk2).read())
File "/usr/lib/python2.7/xml/etree/ElementTree.py", line 1312, in XML
return parser.close()
File "/usr/lib/python2.7/xml/etree/ElementTree.py", line 1671, in close
self._raiseerror(v)
File "/usr/lib/python2.7/xml/etree/ElementTree.py", line 1523, in _raiseerror
raise err
xml.etree.ElementTree.ParseError: no element found: line 466, column 16
Test #11: test_ff_stokes_channel_neumann_x_dirichlet_y ............***Failed 5.80 sec
```
There seems to be something wrong with the vtu.

Milestone: 3.1. Assignee: Timo Koch (timokoch@math.uio.no).

Issue #820: Failing tests after change of VanGenuchten Law (2020-03-24, Kilian Weishaupt)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/820

3fff0a215373e66b081ea544b7e37ef277ee082a introduces some changes to the VanGenuchten Law.
Several reference solutions were updated afterwards but on my system,
* test_md_boundary_darcy2p2cni_stokes1p2cni_horizontal
and
* test_2p_incompressible_box_ifsolver
still fail (while passing on buildbot). Could somebody else try those tests?

Milestone: 3.2.

Issue #847: Rename forgotten wettingPhaseIdx() instances to wettingPhase() (2020-03-31, Simon Emmert)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/847

With d23929df, wettingPhaseIdx() was renamed to wettingPhase().
There are still a few tests, e.g. `test_thermalconductivity`, that use the old wettingPhaseIdx().

Milestone: 3.2.

Issue #849: Fixed derivative in VanGenuchten makes tests fail (2020-04-01, Timo Koch)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/849

In 44e10bfc84cc047d5e0278cb2b8886d51eee3a52 the derivative of Van Genuchten `krn_swe` was fixed. I thought it's not used anywhere. I overlooked that it's actually used in the regularization of `krn` for small saturations.

Milestone: 3.2. Assignee: Timo Koch (timokoch@math.uio.no).

Issue #851: Fix `test_2p2cni_waterair_buoyancy_box` (2020-04-06, Timo Koch)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/851

I accidentally "fixed" the reference solution for `test_2p2cni_waterair_buoyancy_box` in !1932.
On BuildBot it fails however.
I know from many people that they have problems with this test. It doesn't seem to be very stable.
We should have a better look at the test, identify what can be done to make it both a
good test for the 2p2cni model and make it tractable for reliable testing.

Milestone: 3.2. Assignee: Timo Koch (timokoch@math.uio.no).

Issue #863: test_2p_fracture_gravity_box fails with dune master (2020-04-28, Timo Koch)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/863

Fails with Dune master and Dune 2.7 on BuildBot. Passes with Dune 2.6.
Needs debugging, maybe bisecting the commit causing the failure. It may also be fixed by using fixed time step sizes or more restrictive convergence criteria, in case this is related to the solver.

Milestone: 3.2. Assignee: Ned Coltman.

Issue #870: test_2pncminni_salinization_tpfa questionable and (sporadically failing) test (2020-04-22, Timo Koch)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/870

The test `test_2pncminni_salinization_tpfa` supposedly simulates an evaporation scenario that should lead to salt precipitation. However, during the runtime of the test nothing actually precipitates, which raises the question: does this actually test anything that is not tested in the 2pnc tests already?
This also currently leads to the test failing with certain configurations (see BuildBot) because no zero threshold is set for the precipitated volume fraction of NaCl. (Zero thresholds are always necessary if a parameter is exact zero everywhere because then there is no way to determine a reliable epsilon for floating point comparisons automatically.) This is also why this is marked for milestone 3.2.
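The threshold logic can be sketched as follows (assumptions: the real `fuzzycomparevtu.py` differs in detail):

```python
def fuzzy_equal(a, b, rel_tol=0.01, abs_tol=0.0, zero_threshold=0.0):
    # Sketch of a fuzzy comparison with a zero-value threshold: values below
    # the threshold are treated as exact zeros, so an all-zero field (like
    # the precipitated NaCl volume fraction here) cannot produce spurious
    # relative differences between two numerically tiny values.
    if abs(a) < zero_threshold and abs(b) < zero_threshold:
        return True
    diff = abs(a - b)
    return diff <= abs_tol or diff <= rel_tol * max(abs(a), abs(b))

print(fuzzy_equal(1e-15, -1e-15))                        # False: 200% relative difference
print(fuzzy_equal(1e-15, -1e-15, zero_threshold=1e-12))  # True
```

Without the threshold there is no sensible scale to compare the two tiny values against, which is exactly the failure mode described above.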
I suggest fixing the test to actually test "salinization" as the name suggests.

Milestone: 3.2. Assignee: Theresa Schollenberger.

Issue #871: [test] test_1pni_conduction_tpfa, test_2p_incompressible_tpfa_oilwet and test_3pni_conduction_tpfa fail with minimal setup (2020-04-29, Timo Koch)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/871

test_1pni_conduction_tpfa, test_2p_incompressible_tpfa_oilwet and test_3pni_conduction_tpfa fail with dune master with a strange (maybe solver-related) problem when dumux is run in minimal configuration (no MPI, no UMFPACK, nothing external, only dune core modules).
This may be related to the coarse grid solver of the AMG, which is usually UMFPACK but is changed to SSOR when UMFPACK is not found. The SSOR has a low hard-coded solver tolerance. This solver tolerance is adjusted with a patch in dumux/patches/dune-istl-2.6.patch. The patch is supposed to address this issue and worked in the past, but it doesn't seem to work now.
__versions__
* Dune 2.7 or master
* Dumux 3.2 or master

Milestone: 3.2.

Issue #889: [freeflow][test] New convergence test fails (2020-05-27, Timo Koch)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/889

New test merged in https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/merge_requests/2128.
Fails on BuildBot setup.
I suspect that the convergence script does not use unique filenames, so the two tests overwrite each other when run in parallel. Possible bug in the Python script.

Milestone: 3.3. Assignee: Kilian Weishaupt.

Issue #894: Problem with OpenMPI 4 in Docker container (2021-02-05, Timo Koch)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/894

There are some security precautions enabled in the Docker container which keep OpenMPI from doing its job, see https://github.com/open-mpi/ompi/issues/4948. There are tons of errors/warnings produced looking like this:
```
Read -1, expected <someNumber>, errno =1
Read -1, expected <someNumber>, errno =1
Read -1, expected <someNumber>, errno =1
...
```
I tried turning off the offending mechanism with `OMPI_MCA_btl_vader_single_copy_mechanism=none` in the Docker container (see https://git.iws.uni-stuttgart.de/timok/dumux-bb-ci/-/commit/3e5545ce7010f10219ebac9074faef207029df11), but the runtime seems to be affected a lot: the parallel tests are much slower than the sequential tests. Another solution mentioned in https://github.com/open-mpi/ompi/issues/4948 requires setting some system permissions on the testing system. @root might be looking into that.
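To make the setting stick for every process in the image (instead of per `mpirun` invocation), the MCA parameter can be baked into the container; a hypothetical Dockerfile fragment:

```dockerfile
# Disable the CMA-based single-copy mechanism that the container's seccomp
# profile blocks (see open-mpi/ompi#4948); this trades shared-memory speed
# for clean output.
ENV OMPI_MCA_btl_vader_single_copy_mechanism=none
```

Running the container with `--cap-add=SYS_PTRACE` instead reportedly keeps the fast path, at the cost of a weaker sandbox.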
The issue started when I upgraded the Docker containers to use Ubuntu 20.04LTS. I did that to get the newer required CMake version as default (see https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/merge_requests/2136). 20.04LTS also comes per default with a newer OpenMPI version (4.0.3, I think). That setup shows the problem described in https://github.com/open-mpi/ompi/issues/4948.

Issue #899: Pointsource tests fail after change on master (2020-06-24, Timo Koch)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/899

test_richardsnc_tpfa, test_richardsnc_box, test_1p_pointsources_timedependent_tpfa fail after !2184.

Assignee: Timo Koch (timokoch@math.uio.no).

Issue #930: Failing tests test_md_embedded1d3d_1p_richards_* (2020-10-27, Timo Koch)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/930

master -> fails
release 3.2 -> fails
Error: Residual=nan in BiCGSTAB first time step, Newton fails to converge.
release 3.1 -> passes

Milestone: 3.3. Assignee: Timo Koch (timokoch@math.uio.no).

Issue #941: test_vtk_staggeredfreeflowpvnames fails on buildbot (2020-10-23, Simon Emmert)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/941

`test_vtk_staggeredfreeflowpvnames` aborts on buildbot. I think (but I did not check anything yet), this is due to !2224.
From looking at `dumux/test/io/vtk/test_vtk_staggeredfreeflowpvnames.cc` it seems like the navierstokes-tests work, and the KEpsilon-test is the first one to abort with the following message:
`updateStaticWallProperties:/data/dumux/dumux/freeflow/rans/zeroeq/problem.hh:91]:
Due to grid/geometric concerns, zero-eq models should only be used for flat channel geometries.`
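For reference, the runtime parameter mentioned in the abort message goes into the test's input file; a sketch (assuming the geometry really is a flat channel, following the usual DuMux INI group syntax):

```
[RANS]
IsFlatWallBounded = true
```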
I guess this is an easy fix for @nedc or @melaniel
Log excerpt from buildbot:
```
150/475 Test #132: test_vtk_staggeredfreeflowpvnames ..........................Child aborted***Exception: 2.18 sec
### # # # #
# # # # ## ## # # #
# # # # # # # # # # #
### ## # # ##
Dune for Multi-{ Phase,
Component,
Scale,
Physics,
...} flow and transport in porous media
The H2O-air fluid system was configured with the following policy:
- use H2O density as liquid mixture density: false
- use ideal gas density: false
- use air viscosity as gas mixture viscosity: false
-------------------------------------------------------------------------
Initializing tables for the H2O fluid properties (20000 entries).
Temperature -> min: 2.731e+02, max: 6.231e+02, n: 100
Pressure -> min: -1.000e+01, max: 2.000e+07, n: 200
-------------------------------------------------------------------------
Writing output for problem "navierstokes". Took 3.373e-01 seconds.
Warning: gasDensity(T=0, p=0) of component 'H2O' is outside tabulation range: (273.15<=T<=623.15), (-10<=p<=2e+07). Forwarded to FluidSystem for direct evaluation of gasDensity.
Writing output for problem "navierstokesni". Took 1.604e-01 seconds.
Writing output for problem "navierstokesnc". Took 1.692e-03 seconds.
Writing output for problem "navierstokesncni". Took 2.352e-03 seconds.
terminate called after throwing an instance of 'Dune::NotImplemented'
what(): Dune::NotImplemented [updateStaticWallProperties:/data/dumux/dumux/freeflow/rans/zeroeq/problem.hh:91]:
Due to grid/geometric concerns, zero-eq models should only be used for flat channel geometries.
If your geometry is a flat channel, please set the runtime parameter RANS.IsFlatWallBounded to true.
```

Milestone: 3.3. Assignee: Ned Coltman.

Issue #948: test_3pwateroil_sagd_box fails (2020-10-31, Kilian Weishaupt)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/948

Fails on hal with g++ (GCC) 10.2.0
```
416: Fuzzy comparison...
416: Comparing /temp/weishaupt/Dumux_testing_master/dumux/test/references/test_3pwateroil_sagd_box-reference.vtu and /temp/weishaupt/Dumux_testing_master/dumux/build-cmake/test/porousmediumflow/3pwateroil/implicit/test_3pwateroil_sagd_box-00001.vtu
416: ... with a maximum relative error of 0.01 and a maximum absolute error of 1.5e-07*max_abs_parameter_value.
416:
416: Data differs in parameter: S_aq
416: Difference is too large: 3.35% -> between: 0.371884 and 0.384759
416: Info for S_aq: max_abs_parameter_value=0.701504 and min_abs_parameter_value=0.371884.
416:
416: Data differs in parameter: S_gas
416: Difference is too large: 3.23% -> between: 0.366778 and 0.35494
416: Info for S_gas: max_abs_parameter_value=0.366778 and min_abs_parameter_value=0.0.
416:
416: Data differs in parameter: mob_aq
416: Difference is too large: 14.05% -> between: 130.979 and 152.385
416: Info for mob_aq: max_abs_parameter_value=467.458 and min_abs_parameter_value=130.979.
416:
416: Data differs in parameter: mob_gas
416: Difference is too large: 5.47% -> between: 14871.0 and 14057.1
416: Info for mob_gas: max_abs_parameter_value=14871.0 and min_abs_parameter_value=0.0.
416:
416: Data differs in parameter: mob_napl
416: Difference is too large: 4.53% -> between: 0.0152723 and 0.0145809
416: Info for mob_napl: max_abs_parameter_value=6.10674 and min_abs_parameter_value=8.86194e-05.
416:
416: Data differs in parameter: mu_gas
416: Difference is too large: 3.13% -> between: 9.50177e-08 and 9.20431e-08
416: Info for mu_gas: max_abs_parameter_value=1.77629e-05 and min_abs_parameter_value=5.87006e-09.
416:
416: Data differs in parameter: mu_napl
416: Difference is too large: 4.65% -> between: 10.7806 and 11.3068
416: Info for mu_napl: max_abs_parameter_value=1747.3 and min_abs_parameter_value=0.00550324.
416:
416: Data differs in parameter: rho_gas
416: Difference is too large: 2.48% -> between: 0.234406 and 0.228589
416: Info for rho_gas: max_abs_parameter_value=23.2395 and min_abs_parameter_value=0.0194941.
416:
416: Data differs in parameter: x^H2O_gas
416: Difference is too large: 2.93% -> between: 0.00838062 and 0.00813482
416: Info for x^H2O_gas: max_abs_parameter_value=0.999998 and min_abs_parameter_value=0.000610972.
416:
416: Data differs in parameter: x^heavyoil_gas
416: Difference is too large: 1.37% -> between: 2.17444e-06 and 2.20465e-06
416: Info for x^heavyoil_gas: max_abs_parameter_value=2.20465e-06 and min_abs_parameter_value=0.0.
416: Fuzzy comparison done (not equal)
1/1 Test #416: test_3pwateroil_sagd_box .........***Failed 9.31 sec
```

Milestone: 3.3.

Issue #950: test_ff_navierstokes_kovasznay fails (2021-04-16, Mathis Kelm)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/950

test_ff_navierstokes_kovasznay fails for me with gcc 9.3.0.
As far as I can tell both UMFPack and dune-subgrid are installed, the test compiles despite the UMFPack CMake guard, and the macro HAVE_DUNE_SUBGRID evaluates to true.
The other two tests in navierstokes/kovasznay pass, but when executing test_ff_navierstokes_kovasznay with params.input I get the following error message.
```
Newton: Caught exception: "NumericalProblem [solveLinearSystem:dumux/dumux/nonlinear/newtonsolver.hh:454]: Linear solver did not converge"
terminate called after throwing an instance of 'Dumux::NumericalProblem'
what(): NumericalProblem [solve:dumux/dumux/nonlinear/newtonsolver.hh:331]: Newton solver didn't converge after 5 iterations.
[40442] *** Process received signal ***
[40442] Signal: Aborted (6)
[40442] Signal code: (-6)
[40442] [ 0] /lib/x86_64-linux-gnu/libpthread.so.0(+0x153c0)[0x7f0de4a233c0]
[40442] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0xcb)[0x7f0de486218b]
[40442] [ 2] /lib/x86_64-linux-gnu/libc.so.6(abort+0x12b)[0x7f0de4841859]
[40442] [ 3] /lib/x86_64-linux-gnu/libstdc++.so.6(+0x9e951)[0x7f0de4aec951]
[40442] [ 4] /lib/x86_64-linux-gnu/libstdc++.so.6(+0xaa47c)[0x7f0de4af847c]
[40442] [ 5] /lib/x86_64-linux-gnu/libstdc++.so.6(+0xaa4e7)[0x7f0de4af84e7]
[40442] [ 6] /lib/x86_64-linux-gnu/libstdc++.so.6(+0xaa799)[0x7f0de4af8799]
[40442] [ 7] ./test_ff_navierstokes_kovasznay(+0x150d38)[0x55ec2723fd38]
[40442] [ 8] ./test_ff_navierstokes_kovasznay(+0x31bcd)[0x55ec27120bcd]
[40442] [ 9] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf3)[0x7f0de48430b3]
[40442] [10] ./test_ff_navierstokes_kovasznay(+0x336be)[0x55ec271226be]
[40442] *** End of error message ***
Aborted (core dumped)
```

Milestone: 3.4. Assignee: Mathis Kelm.

Issue #954: test_md_embedded_1d3d_1p1p_tpfatpfa_convergence fails, no numpy on BuildBot (2020-11-05)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/954

Milestone: 3.3.

Issue #980: Failing parallel tests (2021-01-09, Timo Koch)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/980

Parallel tests fail with
```
NotImplemented [get:/data/dune-istl/dune/istl/solverfactory.hh:190]: The solver factory is currently only implemented for sequential solvers!
```
for Dune 2.7. Did we remove some version checks?

Milestone: 3.4. Assignee: Timo Koch (timokoch@math.uio.no).

Issue #985: Richards benchmark test fails due to assert in spline (2021-02-01, Timo Koch)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/985

Probably something with the regularization of krw/pc-sw. The regularized interval approximated by the spline might be too small? It could also be that the assert is wrong. Needs closer investigation.

Milestone: 3.4. Assignee: Timo Koch (timokoch@math.uio.no).

Issue #1025: InvertCubicPolynomial test fails (2021-05-10, Dennis Gläser)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1025

In our pipeline, `test_math` regularly fails. According to GitLab it failed 4 times in the last 14 days, although I think we have had the CI only for about a week. The `invertCubicPolynomial` test seems to be the cause of the error; here is the output from tonight:
```sh
terminate called after throwing an instance of 'Dune::Exception'
what(): Dune::Exception [operator():/builds/dumux-repositories/dumux/test/common/math/test_math.cc:282]: [invertCubicPolynomial] Root 0 of 3: -7.81946 does not match reference
```
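If the reference comparison uses a fixed absolute epsilon, a root of magnitude ~8 can fail on last-digit noise; a scale-aware check is sketched below (hypothetical Python, not the actual `test_math.cc` logic):

```python
import math

def roots_match(computed, reference, rel_tol=1e-7, abs_tol=1e-12):
    # Compare sorted root lists pairwise; the tolerance scales with the
    # root's magnitude, so -7.81946... only needs ~7 matching digits.
    return len(computed) == len(reference) and all(
        math.isclose(c, r, rel_tol=rel_tol, abs_tol=abs_tol)
        for c, r in zip(sorted(computed), sorted(reference)))

print(roots_match([-7.8194600001, 1.0, 2.5], [-7.81946, 1.0, 2.5]))  # True
```

`abs_tol` still catches roots near zero, where a purely relative tolerance would be too strict.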
The night before, for instance, it seems that it did not fail. @timok, I believe you implemented this? I haven't looked into the test yet, but could it be that the deviation threshold in the test is chosen too small?

Milestone: 3.4. Assignee: Timo Koch (timokoch@math.uio.no).

Issue #1026: Minimal CI Setup failing tests (2021-05-10, Timo Koch)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1026

421 - test_3p3cni_columnxylol_box (Failed)
427 - test_3pwateroil_sagd_box (Failed)
Probably related to the solver tolerance in the AMG without a direct solver backend.

Milestone: 3.4. Assignee: Dennis Gläser.

Issue #1034: Segfault in McWorther test (currently untested) (2021-08-16, Timo Koch)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1034

The test added in !2630 fails. Continuing discussion there.

Milestone: 3.5. Assignee: Hanchuan Wu.

Issue #1061: Disable or fix Python tests in CI for release 3.4 (2021-07-28, Timo Koch)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1061

The new Docker images enable the Python bindings in the full container (previously untested). The Python bindings are tested if Dune >= 2.8.
However, due to a configuration issue the tests currently fail to run/compile. Help is on the way in !2681, but this is not going to be part of 3.4. Therefore we should probably just manually disable the Python tests in the CI config on master + release/3.4. Or, if the problem is solved in !2681 soon, add the fix to the current CI config.

Milestone: 3.4.

Issue #1072: Failure of richards benchmark (2021-08-20, Timo Koch)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1072

There was a spurious failure of the Richards benchmark in the CI. No such failure had been previously observed in many runs. The job log is attached: [job_log](/uploads/de092059c7504c36e9b73073c68aa798/job_log)
Seems to be quite a consistent failure ~~after !2736~~ when using the new runner on sal.

Assignee: Timo Koch (timokoch@math.uio.no).

Issue #1083: [test] Gridmanager test fails with Dune `master` (2021-10-06, Mathis Kelm)
https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1083

Recently the gridmanager test `test_gridmanager_dgf_ug_parallel` has been failing when run with Dune `master` through CI. Discrepancies appear to include element marker assignments, see the output below. Is this a core issue with our test or just some runtime property that should not be tested (in this manner)?
```
Rank 0: Reading parameters from file test_gridmanager_dgf.input.
Rank 1: Reading parameters from file test_gridmanager_dgf.input.
UGGridParameterBlock: Parameter 'closure' not specified, using default: 'GREEN'.
UGGridParameterBlock: Parameter 'copies' not specified, using default: 'NO'.
UGGridParameterBlock: Parameter 'closure' not specified, using default: 'GREEN'.
UGGridParameterBlock: Parameter 'copies' not specified, using default: 'NO'.
UGGridParameterBlock: Parameter 'closure' not specified, using default: 'GREEN'.
UGGridParameterBlock: Parameter 'copies' not specified, using default: 'NO'.
UGGridParameterBlock: Parameter 'closure' not specified, using default: 'GREEN'.
UGGridParameterBlock: Parameter 'copies' not specified, using default: 'NO'.
Fuzzy comparison...
Comparing /builds/dumux-repositories/dumux/test/references/gridmanager-co2-quad-element-reference.vtu and /builds/dumux-repositories/dumux/build-cmake/test/io/gridmanager/s0002-co2_ug_parallel-element-00000.pvtu
... with a maximum relative error of 0.01 and a maximum absolute error of 1.5e-07*max_abs_parameter_value.
Sorting vtu by coordinates...
Data differs in parameter: elementMarker
Difference is too large: 100.00% -> between: 3.0 and 0.0 Info for elementMarker: max_abs_parameter_value=3.0 and min_abs_parameter_value=0.0.
Fuzzy comparison done (not equal)
Fuzzy comparison...
Comparing /builds/dumux-repositories/dumux/test/references/gridmanager-co2-quad-element-reference-refined.vtu and /builds/dumux-repositories/dumux/build-cmake/test/io/gridmanager/s0002-co2_ug_parallel-element-00001.pvtu
... with a maximum relative error of 0.01 and a maximum absolute error of 1.5e-07*max_abs_parameter_value.
Sorting vtu by coordinates...
Data differs in parameter: elementMarker
Difference is too large: 100.00% -> between: 3.0 and 0.0 Info for elementMarker: max_abs_parameter_value=3.0 and min_abs_parameter_value=0.0.
Fuzzy comparison done (not equal)
Fuzzy comparison...
Comparing /builds/dumux-repositories/dumux/test/references/gridmanager-co2-quad-vertex-reference.vtu and /builds/dumux-repositories/dumux/build-cmake/test/io/gridmanager/s0002-co2_ug_parallel-vertex-00000.pvtu
... with a maximum relative error of 0.01 and a maximum absolute error of 1.5e-07*max_abs_parameter_value.
Sorting vtu by coordinates...
Data differs in parameter: vertexData
Difference is too large: 100.00% -> between: 96.0 and 0.0 Info for vertexData: max_abs_parameter_value=3793.0 and min_abs_parameter_value=0.0.
Fuzzy comparison done (not equal)
```

https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1148
Failed test_co2_tpfa/test_co2_tpfani (2022-05-03, Timo Koch)

test_co2_tpfa: /dune/modules/dune-alugrid/dune/alugrid/3d/entityseed.hh:78: static bool Dune::ALU3dGridEntitySeedBase<0, const Dune::ALU3dGrid<2, 2, Dune::hexa, Dune::ALUGridMPIComm> >::Bnd<0, ALUGrid::Gitter::hasFace>::isGhost(Dune::ALU3dGridEntitySeedBase::KeyType *) [codim = 0, GridImp = const Dune::ALU3dGrid<2, 2, Dune::hexa, Dune::ALUGridMPIComm>, cd = 0, Key = ALUGrid::Gitter::hasFace]: Assertion `key' failed.
with dune master + clang.

Milestone: 3.5 | Assignee: Timo Koch <timokoch@math.uio.no>

https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1150
Unexpected interaction of fmt libraries (2022-05-11, Timo Koch)

opm-common also ships fmt as we do. There seems to be some strange interaction such that, when opm-common is present, our unit test fails: https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/jobs/88066. Maybe the opm-shipped library is included and there is a version mismatch of some sort.
```
121/545 Test #119: test_format ................................................Child aborted***Exception: 0.27 sec
terminate called after throwing an instance of 'Dune::Exception'
what(): Dune::Exception [testString:/builds/dumux-repositories/dumux/test/io/format/test_format.cc:15]: Unexpected result: Hubble's H₀ ≅42 miles/sec/mpc., expected Hubble's H₀ ≅ 42 miles/sec/mpc.
```

Milestone: 3.5 | Assignee: Bernd Flemisch

https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1153
Spurious segfault of test_1p_convergence_box (2022-05-27, Timo Koch)

```
315/545 Test #314: test_1p_convergence_box_conforming .........................***Exception: SegFault 0.55 sec
670Chuck Norris has successfully compiled DuMuX.
671Reading parameters from file params.input.
672 --- Solving finest solution (dx = 0.0125) ---
673Computed bounding box tree with 12799 nodes for 6400 grid entites in 0.00150703 seconds.
674Colored 6400 elements with 4 colors in 0.001164546 seconds.
675Assemble: r(x^k) = dS/dt + div F - q; M = grad r -- Using the default temperature of 293.15 in the entire domain. Overload temperatureAtPos() in your spatial params class to define a custom temperature field.Or provide the preferred domain temperature via the SpatialParams.Temperature parameter.
676Update: x^(k+1) = x^k - deltax^kAssemble/solve/update time: 0.0062936(24.8438%)/0.0190219(75.0882%)/1.723e-05(0.0680149%)
677Writing output for problem "test_1p_convergence_box_conforming". Took 0.025 seconds.
678 --- Solving with dx = 10 ---
679Computed bounding box tree with 199 nodes for 100 grid entites in 2.2181e-05 seconds.
680Colored 100 elements with 4 colors in 2.577e-05 seconds.
681Assemble: r(x^k) = dS/dt + div F - q; M = grad r[runner-d5rncghq-project-31-concurrent-0:03562] *** Process received signal ***
682[runner-d5rncghq-project-31-concurrent-0:03562] Signal: Segmentation fault (11)
683[runner-d5rncghq-project-31-concurrent-0:03562] Signal code: Address not mapped (1)
684[runner-d5rncghq-project-31-concurrent-0:03562] Failing at address: 0x8
685[runner-d5rncghq-project-31-concurrent-0:03562] [ 0] /lib/x86_64-linux-gnu/libpthread.so.0(+0x14420)[0x7f3da283b420]
686[runner-d5rncghq-project-31-concurrent-0:03562] [ 1] /lib/x86_64-linux-gnu/libstdc++.so.6(_ZSt18_Rb_tree_decrementPSt18_Rb_tree_node_base+0xe)[0x7f3da220125e]
687
```
Most likely some multithreading issue. `test_1p_convergence_box_nonconforming` has the same spurious failures.
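The "Colored 6400 elements with 4 colors" lines in these logs refer to the multithreaded assembly: elements are grouped into colors such that no two elements of the same color share degrees of freedom, so each color can be assembled in parallel without write conflicts. A minimal greedy sketch of the idea (hypothetical adjacency input, not the DuMux implementation):

```python
def greedy_coloring(adjacency):
    """Assign each element the smallest color not used by its neighbours.

    adjacency: list where adjacency[i] holds the elements that share a
    degree of freedom with element i (assembling those from the same
    color concurrently would race). Returns one color index per element."""
    colors = [None] * len(adjacency)
    for elem, neighbours in enumerate(adjacency):
        used = {colors[n] for n in neighbours if colors[n] is not None}
        color = 0
        while color in used:  # pick the smallest free color
            color += 1
        colors[elem] = color
    return colors

# Four elements in a row, each sharing dofs with its direct neighbours:
print(greedy_coloring([[1], [0, 2], [1, 3], [2]]))  # [0, 1, 0, 1]
```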
Here is another one from master.
```
315/545 Test #315: test_1p_convergence_box_nonconforming ......................***Exception: SegFault 0.58 sec
Let's get the cow off the ice.
Reading parameters from file params.input.
--- Solving finest solution (dx = 0.0125) ---
Computed bounding box tree with 12799 nodes for 6400 grid entites in 0.00205371 seconds.
Colored 6400 elements with 4 colors in 0.001878195 seconds.
Assemble: r(x^k) = dS/dt + div F - q; M = grad r -- Using the default temperature of 293.15 in the entire domain. Overload temperatureAtPos() in your spatial params class to define a custom temperature field.Or provide the preferred domain temperature via the SpatialParams.Temperature parameter.
Update: x^(k+1) = x^k - deltax^kAssemble/solve/update time: 0.00675025(21.4533%)/0.0246765(78.4254%)/3.819e-05(0.121373%)
Writing output for problem "test_1p_convergence_box_conforming". Took 0.049 seconds.
--- Solving with dx = 10 ---
Computed bounding box tree with 199 nodes for 100 grid entites in 2.8831e-05 seconds.
Colored 100 elements with 4 colors in 3.9071e-05 seconds.
Assemble: r(x^k) = dS/dt + div F - q; M = grad r[runner-d5rncghq-project-31-concurrent-0:03605] *** Process received signal ***
[runner-d5rncghq-project-31-concurrent-0:03605] Signal: Segmentation fault (11)
[runner-d5rncghq-project-31-concurrent-0:03605] Signal code: Address not mapped (1)
[runner-d5rncghq-project-31-concurrent-0:03605] Failing at address: 0x8
[runner-d5rncghq-project-31-concurrent-0:03605] [ 0] /lib/x86_64-linux-gnu/libpthread.so.0(+0x14420)[0x7f44a89eb420]
[runner-d5rncghq-project-31-concurrent-0:03605] [ 1] /lib/x86_64-linux-gnu/libstdc++.so.6(_ZSt18_Rb_tree_decrementPSt18_Rb_tree_node_base+0xe)[0x7f44a83b125e]
[runner-d5rncghq-project-31-concurrent-0:03605] [ 2] /builds/dumux-repositories/dumux/build-cmake/test/porousmediumflow/1p/convergence/discretesolution/test_1p_convergence_box(+0xde397)[0x56271ee9a397]
[runner-d5rncghq-project-31-concurrent-0:03605] [ 3] /builds/dumux-repositories/dumux/build-cmake/test/porousmediumflow/1p/convergence/discretesolution/test_1p_convergence_box(+0xdea8f)[0x56271ee9aa8f]
[runner-d5rncghq-project-31-concurrent-0:03605] [ 4] /builds/dumux-repositories/dumux/build-cmake/test/porousmediumflow/1p/convergence/discretesolution/test_1p_convergence_box(+0x103083)[0x56271eebf083]
[runner-d5rncghq-project-31-concurrent-0:03605] [ 5] /builds/dumux-repositories/dumux/build-cmake/test/porousmediumflow/1p/convergence/discretesolution/test_1p_convergence_box(+0x105f82)[0x56271eec1f82]
[runner-d5rncghq-project-31-concurrent-0:03605] [ 6] /lib/x86_64-linux-gnu/libgomp.so.1(+0x1a78e)[0x7f44a896b78e]
[runner-d5rncghq-project-31-concurrent-0:03605] [ 7] /lib/x86_64-linux-gnu/libpthread.so.0(+0x8609)[0x7f44a89df609]
[runner-d5rncghq-project-31-concurrent-0:03605] [ 8] /lib/x86_64-linux-gnu/libc.so.6(clone+0x43)[0x7f44a80b0133]
[runner-d5rncghq-project-31-concurrent-0:03605] *** End of error message ***
```

Milestone: 3.5 | Assignee: Timo Koch <timokoch@math.uio.no>

https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1158
test_1pnc_maxwellstefan_tpfa (Failed) (2022-06-01, Yue Wang)

**Bug report**
The test_1pnc_maxwellstefan_tpfa needs 20 steps on my laptop, but the reference is compared with the 19th output in the makefile. After I changed the compared output in the makefile, there is still a small discrepancy leading to failure.
**Environment**:
- Dune version: 2.8
- DuMux version: release/3.5
- OS Version: macOS 11.6.5
- Compiler Version: gcc 11.3.0
- Others:
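For reference, the fuzzy comparison in the output below only flags a value pair when it exceeds both the relative tolerance (0.01) and the absolute tolerance (1.5e-07 times the field's maximum absolute value). A minimal sketch of such a check (an illustrative helper, not the actual comparison-script code; how the two criteria combine is an assumption):

```python
def values_differ(a, b, max_abs_value, rel_tol=1e-2, abs_tol_factor=1.5e-7):
    """Fuzzy comparison of two scalar values, sketched after the tolerances
    printed in the test output (not the actual comparison-script code).

    Assumption: a pair is flagged only if it exceeds BOTH the relative
    tolerance and the absolute tolerance scaled by the field's maximum
    absolute value."""
    diff = abs(a - b)
    if diff <= abs_tol_factor * max_abs_value:  # negligible vs. field magnitude
        return False
    return diff / max(abs(a), abs(b)) > rel_tol

# The delp pair from the output below differs by about 1.54% > 1%:
print(values_differ(-0.000176918, -0.000174189, max_abs_value=0.0211596))  # True
```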
Output
```
ctest -R test_1pnc_maxwellstefan_tpfa --rerun-failed --output-on-failure
Test project /Users/ouetsu/dumuxday/dumux/build-cmake
Start 383: test_1pnc_maxwellstefan_tpfa
1/1 Test #383: test_1pnc_maxwellstefan_tpfa .....***Failed 1.75 sec
/Users/ouetsu/dumuxday/dumux/build-cmake/test/porousmediumflow/1pnc/1p3c/test_1pnc_maxwellstefan_tpfa-00020.vtu
In the beginning the Universe was created. This has made a lot of people very angry and has been widely regarded as a bad move.!
- Douglas Adams, HGttG
Reading parameters from file params.input.
Computed bounding box tree with 1799 nodes for 900 grid entites in 0.000222 seconds.
problem uses mole fractions
-- Using the default temperature of 293.15 in the entire domain. Overload temperatureAtPos() in your spatial params class to define a custom temperature field.Or provide the preferred domain temperature via the SpatialParams.Temperature parameter.
Writing output for problem "test_1pnc_maxwellstefan_tpfa". Took 0.013 seconds.
Colored 900 elements with 7 colors in 0.000518 seconds.
Newton solver configured with the following options and parameters:
-- Newton.EnableShiftCriterion = true (relative shift convergence criterion)
-- Newton.MaxRelativeShift = 1e-11
-- Newton.MinSteps = 2
-- Newton.MaxSteps = 18
-- Newton.TargetSteps = 10
-- Newton.RetryTimeStepReductionFactor = 0.5
-- Newton.MaxTimeStepDivisions = 10
Newton iteration 1 done, maximum relative shift = 1.2000e-02
Newton iteration 2 done, maximum relative shift = 6.2859e-03
Newton iteration 3 done, maximum relative shift = 3.8102e-06
Newton iteration 4 done, maximum relative shift = 1.2834e-11
Newton iteration 5 done, maximum relative shift = 1.7347e-16
Assemble/solve/update time: 0.034(56.20%)/0.026(43.60%)/0.00012(0.20%)
Writing output for problem "test_1pnc_maxwellstefan_tpfa". Took 0.014 seconds.
[ 0%] Time step 1 done in 0.06 seconds. Wall clock time: 0.07426, time: 1, time step size: 1
Newton iteration 1 done, maximum relative shift = 2.1265e-02
Newton iteration 2 done, maximum relative shift = 8.3938e-05
Newton iteration 3 done, maximum relative shift = 6.9125e-09
Newton iteration 4 done, maximum relative shift = 7.0832e-14
Assemble/solve/update time: 0.015(44.14%)/0.019(55.68%)/6.1e-05(0.18%)
Writing output for problem "test_1pnc_maxwellstefan_tpfa". Took 0.0092 seconds.
[ 0%] Time step 2 done in 0.049 seconds. Wall clock time: 0.1179, time: 2.4167, time step size: 1.4167
Newton iteration 1 done, maximum relative shift = 2.4593e-02
Newton iteration 2 done, maximum relative shift = 1.4577e-04
Newton iteration 3 done, maximum relative shift = 2.1208e-08
Newton iteration 4 done, maximum relative shift = 4.2510e-13
Assemble/solve/update time: 0.014(45.79%)/0.016(54.05%)/4.8e-05(0.16%)
Writing output for problem "test_1pnc_maxwellstefan_tpfa". Took 0.0092 seconds.
[ 0%] Time step 3 done in 0.04 seconds. Wall clock time: 0.1575, time: 4.5417, time step size: 2.125
Newton iteration 1 done, maximum relative shift = 2.6520e-02
Newton iteration 2 done, maximum relative shift = 2.1205e-04
Newton iteration 3 done, maximum relative shift = 4.0424e-08
Newton iteration 4 done, maximum relative shift = 9.3259e-14
Assemble/solve/update time: 0.013(43.73%)/0.016(56.03%)/6.8e-05(0.23%)
Writing output for problem "test_1pnc_maxwellstefan_tpfa". Took 0.0089 seconds.
[ 0%] Time step 4 done in 0.039 seconds. Wall clock time: 0.19582, time: 7.7292, time step size: 3.1875
Newton iteration 1 done, maximum relative shift = 2.6662e-02
Newton iteration 2 done, maximum relative shift = 2.6987e-04
Newton iteration 3 done, maximum relative shift = 3.8616e-08
Newton iteration 4 done, maximum relative shift = 1.3961e-13
Assemble/solve/update time: 0.012(41.74%)/0.017(57.99%)/7.9e-05(0.27%)
Writing output for problem "test_1pnc_maxwellstefan_tpfa". Took 0.0094 seconds.
[ 0%] Time step 5 done in 0.038 seconds. Wall clock time: 0.2344, time: 12.51, time step size: 4.7812
Newton iteration 1 done, maximum relative shift = 2.5129e-02
Newton iteration 2 done, maximum relative shift = 3.2645e-04
Newton iteration 3 done, maximum relative shift = 3.8414e-08
Newton iteration 4 done, maximum relative shift = 2.4670e-12
Assemble/solve/update time: 0.012(44.47%)/0.015(55.35%)/4.9e-05(0.18%)
Writing output for problem "test_1pnc_maxwellstefan_tpfa". Took 0.009 seconds.
[ 1%] Time step 6 done in 0.037 seconds. Wall clock time: 0.27058, time: 19.682, time step size: 7.1719
Newton iteration 1 done, maximum relative shift = 2.2381e-02
Newton iteration 2 done, maximum relative shift = 3.3690e-04
Newton iteration 3 done, maximum relative shift = 7.9237e-08
Newton iteration 4 done, maximum relative shift = 2.3762e-12
Assemble/solve/update time: 0.012(45.95%)/0.014(53.87%)/4.7e-05(0.18%)
Writing output for problem "test_1pnc_maxwellstefan_tpfa". Took 0.0088 seconds.
[ 1%] Time step 7 done in 0.036 seconds. Wall clock time: 0.3059, time: 30.44, time step size: 10.758
Newton iteration 1 done, maximum relative shift = 2.3015e-02
Newton iteration 2 done, maximum relative shift = 3.0702e-04
Newton iteration 3 done, maximum relative shift = 9.6664e-08
Newton iteration 4 done, maximum relative shift = 1.0971e-11
Newton iteration 5 done, maximum relative shift = 1.1657e-15
Assemble/solve/update time: 0.015(41.69%)/0.021(57.99%)/0.00011(0.32%)
Writing output for problem "test_1pnc_maxwellstefan_tpfa". Took 0.0088 seconds.
[ 1%] Time step 8 done in 0.045 seconds. Wall clock time: 0.35098, time: 46.577, time step size: 16.137
Newton iteration 1 done, maximum relative shift = 2.3471e-02
Newton iteration 2 done, maximum relative shift = 2.3345e-04
Newton iteration 3 done, maximum relative shift = 6.3069e-08
Newton iteration 4 done, maximum relative shift = 1.1634e-11
Newton iteration 5 done, maximum relative shift = 9.4369e-16
Assemble/solve/update time: 0.015(43.41%)/0.019(56.41%)/6.1e-05(0.18%)
Writing output for problem "test_1pnc_maxwellstefan_tpfa". Took 0.01 seconds.
[ 2%] Time step 9 done in 0.043 seconds. Wall clock time: 0.39514, time: 69.437, time step size: 22.86
Newton iteration 1 done, maximum relative shift = 2.2485e-02
Newton iteration 2 done, maximum relative shift = 1.8337e-04
Newton iteration 3 done, maximum relative shift = 4.9848e-08
Newton iteration 4 done, maximum relative shift = 7.6620e-12
Assemble/solve/update time: 0.012(40.34%)/0.017(59.44%)/6.4e-05(0.22%)
Writing output for problem "test_1pnc_maxwellstefan_tpfa". Took 0.0097 seconds.
[ 3%] Time step 10 done in 0.039 seconds. Wall clock time: 0.43413, time: 101.82, time step size: 32.385
Newton iteration 1 done, maximum relative shift = 2.2842e-02
Newton iteration 2 done, maximum relative shift = 1.7489e-04
Newton iteration 3 done, maximum relative shift = 5.3192e-08
Newton iteration 4 done, maximum relative shift = 2.1477e-13
Assemble/solve/update time: 0.012(44.81%)/0.015(54.88%)/8.6e-05(0.31%)
Writing output for problem "test_1pnc_maxwellstefan_tpfa". Took 0.0092 seconds.
[ 4%] Time step 11 done in 0.037 seconds. Wall clock time: 0.47081, time: 150.4, time step size: 48.578
Newton iteration 1 done, maximum relative shift = 2.3759e-02
Newton iteration 2 done, maximum relative shift = 1.7493e-04
Newton iteration 3 done, maximum relative shift = 5.4479e-08
Newton iteration 4 done, maximum relative shift = 2.9345e-13
Assemble/solve/update time: 0.012(43.70%)/0.015(55.89%)/0.00011(0.41%)
Writing output for problem "test_1pnc_maxwellstefan_tpfa". Took 0.0089 seconds.
[ 6%] Time step 12 done in 0.037 seconds. Wall clock time: 0.50748, time: 223.27, time step size: 72.867
Newton iteration 1 done, maximum relative shift = 2.3978e-02
Newton iteration 2 done, maximum relative shift = 1.8186e-04
Newton iteration 3 done, maximum relative shift = 5.9748e-08
Newton iteration 4 done, maximum relative shift = 3.3276e-13
Assemble/solve/update time: 0.012(40.62%)/0.018(59.11%)/8.2e-05(0.27%)
Writing output for problem "test_1pnc_maxwellstefan_tpfa". Took 0.0089 seconds.
[ 9%] Time step 13 done in 0.039 seconds. Wall clock time: 0.54644, time: 332.57, time step size: 109.3
Newton iteration 1 done, maximum relative shift = 2.4229e-02
Newton iteration 2 done, maximum relative shift = 1.9660e-04
Newton iteration 3 done, maximum relative shift = 6.2956e-08
Newton iteration 4 done, maximum relative shift = 7.8290e-13
Assemble/solve/update time: 0.012(43.12%)/0.015(56.60%)/7.8e-05(0.29%)
Writing output for problem "test_1pnc_maxwellstefan_tpfa". Took 0.0094 seconds.
[ 14%] Time step 14 done in 0.036 seconds. Wall clock time: 0.58331, time: 496.52, time step size: 163.95
Newton iteration 1 done, maximum relative shift = 2.4639e-02
Newton iteration 2 done, maximum relative shift = 2.2045e-04
Newton iteration 3 done, maximum relative shift = 8.7346e-08
Newton iteration 4 done, maximum relative shift = 1.3128e-12
Assemble/solve/update time: 0.013(39.48%)/0.02(60.35%)/5.6e-05(0.17%)
Writing output for problem "test_1pnc_maxwellstefan_tpfa". Took 0.0091 seconds.
[ 21%] Time step 15 done in 0.042 seconds. Wall clock time: 0.62524, time: 742.45, time step size: 245.93
Newton iteration 1 done, maximum relative shift = 2.4849e-02
Newton iteration 2 done, maximum relative shift = 2.6070e-04
Newton iteration 3 done, maximum relative shift = 1.5927e-07
Newton iteration 4 done, maximum relative shift = 4.2347e-11
Newton iteration 5 done, maximum relative shift = 1.5543e-15
Assemble/solve/update time: 0.017(42.04%)/0.023(57.71%)/0.0001(0.25%)
Writing output for problem "test_1pnc_maxwellstefan_tpfa". Took 0.0096 seconds.
[ 31%] Time step 16 done in 0.049 seconds. Wall clock time: 0.67486, time: 1111.3, time step size: 368.89
Newton iteration 1 done, maximum relative shift = 2.4659e-02
Newton iteration 2 done, maximum relative shift = 3.1316e-04
Newton iteration 3 done, maximum relative shift = 2.1819e-07
Newton iteration 4 done, maximum relative shift = 8.1444e-11
Newton iteration 5 done, maximum relative shift = 2.7649e-15
Assemble/solve/update time: 0.016(36.62%)/0.027(63.20%)/7.8e-05(0.18%)
Writing output for problem "test_1pnc_maxwellstefan_tpfa". Took 0.01 seconds.
[ 45%] Time step 17 done in 0.053 seconds. Wall clock time: 0.72813, time: 1633.9, time step size: 522.6
Newton iteration 1 done, maximum relative shift = 2.6992e-02
Newton iteration 2 done, maximum relative shift = 3.3205e-04
Newton iteration 3 done, maximum relative shift = 4.3039e-07
Newton iteration 4 done, maximum relative shift = 1.6188e-11
Newton iteration 5 done, maximum relative shift = 3.7192e-15
Assemble/solve/update time: 0.016(40.42%)/0.024(59.43%)/6.2e-05(0.15%)
Writing output for problem "test_1pnc_maxwellstefan_tpfa". Took 0.0095 seconds.
[ 66%] Time step 18 done in 0.051 seconds. Wall clock time: 0.77863, time: 2374.3, time step size: 740.34
Newton iteration 1 done, maximum relative shift = 3.2737e-02
Newton iteration 2 done, maximum relative shift = 2.8143e-04
Newton iteration 3 done, maximum relative shift = 2.3822e-07
Newton iteration 4 done, maximum relative shift = 4.4269e-11
Newton iteration 5 done, maximum relative shift = 4.0246e-15
Assemble/solve/update time: 0.016(37.61%)/0.026(62.25%)/5.9e-05(0.14%)
Writing output for problem "test_1pnc_maxwellstefan_tpfa". Took 0.0094 seconds.
[ 95%] Time step 19 done in 0.052 seconds. Wall clock time: 0.83014, time: 3423.1, time step size: 1048.8
Newton iteration 1 done, maximum relative shift = 5.3190e-03
Newton iteration 2 done, maximum relative shift = 2.0091e-06
Newton iteration 3 done, maximum relative shift = 4.3137e-10
Newton iteration 4 done, maximum relative shift = 1.9984e-15
Assemble/solve/update time: 0.013(43.82%)/0.017(56.01%)/5.1e-05(0.17%)
Writing output for problem "test_1pnc_maxwellstefan_tpfa". Took 0.0094 seconds.
[100%] Time step 20 done in 0.04 seconds. Wall clock time: 0.87, time: 3600, time step size: 176.9
Simulation took 0.87 seconds on 1 processes.
The cumulative CPU time was 0.87 seconds.
Forty-two. I checked it very thoroughly, and that quite definitely is the answer. I think the problem, to be quite honest with you, is that you've never actually known what the question is.
- Douglas Adams, HGttG
Fuzzy comparison...
Comparing /Users/ouetsu/dumuxday/dumux/test/references/test_1pnc_maxwellstefan_tpfa-reference.vtu and /Users/ouetsu/dumuxday/dumux/build-cmake/test/porousmediumflow/1pnc/1p3c/test_1pnc_maxwellstefan_tpfa-00020.vtu
... with a maximum relative error of 0.01 and a maximum absolute error of 1.5e-07*max_abs_parameter_value.
Data differs in parameter: delp
Difference is too large: 1.54% -> between: -0.000176918 and -0.000174189 Info for delp: max_abs_parameter_value=0.0211596 and min_abs_parameter_value=0.000174189.
Data differs in parameter: velocity_Gas (m/s)_0
Difference is too large: 1.28% -> between: 6.40399e-06 and 6.32196e-06 Info for velocity_Gas (m/s)_0: max_abs_parameter_value=6.44928e-06 and min_abs_parameter_value=2.75056e-07.
Fuzzy comparison done (not equal)
0% tests passed, 1 tests failed out of 1
Label Time Summary:
1pnc = 1.75 sec*proc (1 test)
porousmediumflow = 1.75 sec*proc (1 test)
Total Test time (real) = 1.96 sec
The following tests FAILED:
383 - test_1pnc_maxwellstefan_tpfa (Failed)
Errors while running CTest
```

Milestone: 3.5 | Assignee: Timo Koch <timokoch@math.uio.no>

https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1180
New Compositional Staggered (2023-10-26, Ned Coltman)

mentioned in #1115
Implementation and ported tests addressed in !2986 .
New Analytical solution and convergence test implemented in !3044 . Also implemented for old staggered in !3561 .
Troubleshooting branch: `feature/misc_navierstokesnc` .
Using this issue to collect all of the ideas and progress we have made (the MRs are not organized for this).

Milestone: 3.8 | Assignee: Ned Coltman

https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1185
TabularizedComponent is not thread-safe (2022-10-04, Timo Koch)

I've `test_1pncni_transientbc_tpfa_caching` failing spuriously/rarely and the error message is usually not conclusive.
See e.g. https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/jobs/120228 where an error is thrown.

Milestone: 3.6 | Assignee: Mathis Kelm

https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1308
test_ff_stokes_channel_3d_nonuniform_diamond fails with timeout (2023-11-13, Timo Koch)

Seems to be hanging sometimes: https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/jobs/223698

Milestone: 3.8
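Regarding the TabularizedComponent report above: a component that lazily fills a lookup table from several assembly threads needs synchronization around the one-time fill step. A minimal Python sketch of that general pattern (illustrative only; the actual DuMux code is C++ and may resolve the race differently):

```python
import threading

class LazyTable:
    """Lazily computed lookup table that is safe to fill from many threads.

    Sketch of the lock-protected one-time-initialization pattern only,
    not the actual DuMux TabularizedComponent code."""
    def __init__(self, compute):
        self._compute = compute   # expensive tabulation function
        self._table = None
        self._lock = threading.Lock()

    def get(self):
        if self._table is None:          # fast path without taking the lock
            with self._lock:
                if self._table is None:  # re-check: another thread may have filled it
                    self._table = self._compute()
        return self._table

calls = []
table = LazyTable(lambda: calls.append(1) or [x * x for x in range(4)])
threads = [threading.Thread(target=table.get) for _ in range(8)]
for t in threads: t.start()
for t in threads: t.join()
print(len(calls), table.get())  # 1 [0, 1, 4, 9]
```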