dumux issues — https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues

---

**Issue #1236: Errors with generate_parameterlist**
(https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1236 · Hamza Oukili · updated 2023-03-25)

We have errors when we run `bin/doc/generate_parameterlist.py`:

WARNING 2 parameter(s) in file /home/hamza/work/repDuMux/DuMuXtest210323dune29/dumux/dumux/dumux/flux/shallowwaterviscousflux.hh could not be retrieved automatically. Please check them...
WARNING -> line 137: static const auto backgroundKinematicViscosity = getParamFromGroup<Scalar>(
WARNING -> error message: Could not correctly process parameter name
WARNING -> line 142: static const auto useMixingLengthTurbulenceModel = getParamFromGroup<bool>(
WARNING -> error message: Could not correctly process parameter name
WARNING 1 parameter(s) in file /home/hamza/work/repDuMux/DuMuXtest210323dune29/dumux/dumux/dumux/assembly/cvfelocalassembler.hh could not be retrieved automatically. Please check them...
WARNING -> line 376: static const bool updateAllVolVars = getParamFromGroup<bool>(
WARNING -> error message: Could not correctly process parameter name
WARNING 1 parameter(s) in file /home/hamza/work/repDuMux/DuMuXtest210323dune29/dumux/dumux/dumux/linear/stokes_solver.hh could not be retrieved automatically. Please check them...
WARNING -> line 114: const auto mode = getParamFromGroup<std::string>(
WARNING -> error message: Could not correctly process parameter name
ERROR Found parameter 'ShallowWater.TurbulentViscosity' in /home/hamza/work/repDuMux/DuMuXtest210323dune29/dumux/dumux/doc/doxygen/extradoc/parameters.json which has not been found in the code --> Set mode to 'manual' in the input file if it is to be kept otherwise delete it!
ERROR Found parameter 'ShallowWater.UseMixingLengthTurbulenceModel' in /home/hamza/work/repDuMux/DuMuXtest210323dune29/dumux/dumux/doc/doxygen/extradoc/parameters.json which has not been found in the code --> Set mode to 'manual' in the input file if it is to be kept otherwise delete it!
ERROR Missing input for parameter 'LinearSolver.DirectSolverForVelocity' in /home/hamza/work/repDuMux/DuMuXtest210323dune29/dumux/dumux/doc/doxygen/extradoc/parameters.json.
ERROR Missing input for parameter 'LinearSolver.Preconditioner.MassMatrixWeight' in /home/hamza/work/repDuMux/DuMuXtest210323dune29/dumux/dumux/doc/doxygen/extradoc/parameters.json.
ERROR Missing input for parameter 'LinearSolver.SymmetrizeDirichlet' in /home/hamza/work/repDuMux/DuMuXtest210323dune29/dumux/dumux/doc/doxygen/extradoc/parameters.json.
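All of the flagged locations break the `getParamFromGroup` call across two lines, so the quoted parameter name is not on the same line as the call. A minimal sketch of the underlying parsing problem (hypothetical patterns, not the actual logic of `generate_parameterlist.py`):

```python
import re

# Hypothetical sketch: the reported calls split the argument list across
# lines, so the quoted parameter name is not on the call line itself.
code = '''
static const auto useMixingLengthTurbulenceModel = getParamFromGroup<bool>(
    paramGroup, "ShallowWater.UseMixingLengthTurbulenceModel", false);
'''

# A line-oriented pattern finds nothing: no single line contains both the
# call and the quoted parameter name.
line_based = re.compile(r'getParamFromGroup<(\w+)>\([^)]*"([\w.]+)"')
per_line_matches = [m for line in code.splitlines() if (m := line_based.search(line))]

# A pattern applied to the whole file content can cross the line break.
multi_line = re.compile(r'getParamFromGroup<(\w+)>\(\s*[^"]*"([\w.]+)"')
match = multi_line.search(code)
```

A whole-file search of this kind (or joining continuation lines before matching) would let the script pick up such parameter names automatically.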
Milestone: 3.7 · Assignee: Yue Wang

---

**Issue #1235: Remove deprecation warnings related to "fvGeometry.geometry(scvf)" before release 3.7**
(https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1235 · Hamza Oukili · updated 2023-03-23)

There are multiple similar warnings when building tests: "is deprecated: Will be removed after 3.7. Use fvGeometry.geometry(scvf)."

/dumux/dumux/test/freeflow/navierstokes/angeli/main.cc:137:28: required from here
/dumux/dumux/test/freeflow/navierstokes/angeli/problem.hh:308:41: warning: ‘Dumux::CCTpfaSubControlVolumeFace<GV, T>::Geometry Dumux::CCTpfaSubControlVolumeFace<GV, T>::geometry() const [with GV = Dune::GridView<Dune::DefaultLeafGridViewTraits<const Dune::YaspGrid<2, Dune::EquidistantOffsetCoordinates<double, 2> > > >; T = Dumux::CCTpfaDefaultScvfGeometryTraits<Dune::GridView<Dune::DefaultLeafGridViewTraits<const Dune::YaspGrid<2, Dune::EquidistantOffsetCoordinates<double, 2> > > > >; Dumux::CCTpfaSubControlVolumeFace<GV, T>::Geometry = Dune::MultiLinearGeometry<double, 1, 2, Dumux::CCTpfaDefaultScvfGeometryTraits<Dune::GridView<Dune::DefaultLeafGridViewTraits<const Dune::YaspGrid<2, Dune::EquidistantOffsetCoordinates<double, 2> > > > >::ScvfMLGTraits<double> >]’ is deprecated: Will be removed after 3.7. Use fvGeometry.geometry(scvf). [-Wdeprecated-declarations]
308 | const auto geo = entity.geometry();
| ~~~~~~~~~~~~~~~^~
In file included from /dumux/dumux/dumux/discretization/cellcentered/tpfa/fvgridgeometry.hh:41,
from /dumux/dumux/dumux/discretization/cctpfa.hh:40,
from /dumux/dumux/test/freeflow/navierstokes/angeli/properties.hh:36,
from /dumux/dumux/test/freeflow/navierstokes/angeli/main.cc:56:
/dumux/dumux/dumux/discretization/cellcentered/tpfa/subcontrolvolumeface.hh:205:14: note: declared here
205 | Geometry geometry() const
| ^~~~~~~~
Milestone: 3.7 · Assignee: Hanchuan Wu

---

**Issue #1234: Cppcheck [assignBoolToPointer]**
(https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1234 · Hamza Oukili · updated 2023-03-16)

When running Cppcheck (static code analyzer) on dumux and including the headers (option -I), it reports 2 errors related to [assignBoolToPointer]. Please see the image below.
Note: It might be a false positive.
![image](/uploads/a4612a9c7cd4269c7901ac2166b00895/image.png)

Assignee: Hamza Oukili

---

**Issue #1233: [doxygen] broken link to modules.html with doxygen 1.9.6**
(https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1233 · Leopold Stadler · updated 2023-03-08)

**Description**
There is a broken link to `modules.html` from `index.xhtml` when creating the doxygen documentation on Arch Linux with doxygen version 1.9.6. The documentation on dumux.org is built with doxygen version 1.9.3.
**What happened / Problem description**:
When the documentation is built with `make doc` on an Arch Linux system, doxygen creates `*.xhtml` instead of `*.html` pages.
The doxygen mainpage is generated from the file `mainpage.txt` where the link to the Modules is set with `<a href="modules.html">Modules</a>`
**Possible solutions**
- Replace the static link in mainpage.txt so that the documentation also works when built with newer versions of doxygen.
- Force doxygen to generate html files instead of xhtml files.
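For the second option, a minimal sketch: doxygen's `HTML_FILE_EXTENSION` setting controls the extension of the generated pages, so pinning it in the project's doxygen configuration (assuming that is where the extension differs between versions) would keep the static link working:

```
# Doxyfile sketch: force .html pages so the static link in mainpage.txt resolves
GENERATE_HTML       = YES
HTML_FILE_EXTENSION = .html
```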
**Environment**:
- Dune version: -
- DuMux version: master
- Others: doxygen --version 1.9.6

---

**Issue #1232: Remove deprecation warnings in tests related to "linear solver's norm" before release 3.7**
(https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1232 · Hamza Oukili · updated 2023-03-09)

There are multiple similar warnings in the tests: "Use the linear solver's norm."

/dumux/test/porousmediumflow/tracer/2ptracer/main.cc:203:30: required from here
/dumux/dumux/nonlinear/newtonsolver.hh:194:40: warning: ‘Dumux::FVAssembler<TypeTag, diffMethod, isImplicit>::Scalar Dumux::FVAssembler<TypeTag, diffMethod, isImplicit>::normOfResidual(Dumux::FVAssembler<TypeTag, diffMethod, isImplicit>::ResidualType&) const [with TypeTag = Dumux::Properties::TTag::TwoPIncompressibleTpfa; Dumux::DiffMethod diffMethod = Dumux::DiffMethod::numeric; bool isImplicit = true; Dumux::FVAssembler<TypeTag, diffMethod, isImplicit>::Scalar = double; Dumux::FVAssembler<TypeTag, diffMethod, isImplicit>::ResidualType = Dune::BlockVector<Dune::FieldVector<double, 2>, std::allocator<Dune::FieldVector<double, 2> > >]’ is deprecated: Use the linear solver's norm. Will be deleted after 3.7 [-Wdeprecated-declarations]
194 | return assembler.normOfResidual(residual);
| ~~~~~~~~~~~~~~~~~~~~~~~~^~~~~~~~~~
In file included from /dumux/test/porousmediumflow/tracer/2ptracer/main.cc:42:
/dumux/dumux/assembly/fvassembler.hh:249:12: note: declared here
249 | Scalar normOfResidual(ResidualType& residual) const
| ^~~~~~~~~~~~~~
Milestone: 3.7

---

**Issue #1231: Remove deprecation warnings related to "NavierStokesMomentumCVFE" before release 3.7**
(https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1231 · Hamza Oukili · updated 2023-03-22)

There are multiple similar warnings: "This file is deprecated and will be removed after 3.7. Use NavierStokesMomentumCVFE type tag."
Building CXX object test/freeflow/navierstokes/donea/CMakeFiles/test_ff_stokes_donea_momentum.dir/main_momentum.cc.o
In file included from /dumux/test/freeflow/navierstokes/donea/properties_momentum.hh:54,
from /dumux/test/freeflow/navierstokes/donea/main_momentum.cc:58:
/dumux/dumux/freeflow/navierstokes/momentum/diamond/model.hh:48:2: warning: #warning "This file is deprecated and will be removed after 3.7. Use NavierStokesMomentumCVFE type tag." [-Wcpp]
48 | #warning "This file is deprecated and will be removed after 3.7. Use NavierStokesMomentumCVFE type tag."
| ^~~~~~~
In file included from /dumux/test/freeflow/navierstokes/donea/properties_momentum.hh:57,
from /dumux/test/freeflow/navierstokes/donea/main_momentum.cc:58:
/dumux/dumux/freeflow/navierstokes/momentum/pq1bubble/model.hh:48:2: warning: #warning "This file is deprecated and will be removed after 3.7. Use NavierStokesMomentumCVFE type tag." [-Wcpp]
48 | #warning "This file is deprecated and will be removed after 3.7. Use NavierStokesMomentumCVFE type tag."
| ^~~~~~~
Milestone: 3.7 · Assignee: Stefanie Kiemle

---

**Issue #1230: Remove deprecation warnings related to "conversion of multitype matrices" before release 3.7**
(https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1230 · Hamza Oukili · updated 2023-03-20)

There are multiple similar warnings: "After 3.7 Newton will no longer support conversion of multitype matrices for solvers that don't support this feature!"

In file included from /dumux/dumux/multidomain/newtonsolver.hh:29,
from /dumux/test/freeflow/navierstokes/angeli/main.cc:48:
/dumux/dumux/nonlinear/newtonsolver.hh: In instantiation of ‘bool Dumux::NewtonSolver<Assembler, LinearSolver, Reassembler, Comm>::solveLinearSystem_(Dumux::NewtonSolver<Assembler, LinearSolver, Reassembler, Comm>::ResidualVector&) [with Assembler = Dumux::MultiDomainFVAssembler<Dumux::MultiDomainTraits<Dumux::Properties::TTag::AngeliTestMomentum, Dumux::Properties::TTag::AngeliTestMass>, Dumux::FCStaggeredFreeFlowCouplingManager<Dumux::MultiDomainTraits<Dumux::Properties::TTag::AngeliTestMomentum, Dumux::Properties::TTag::AngeliTestMass> >, Dumux::DiffMethod::numeric>; LinearSolver = Dumux::UMFPackBackend; Reassembler = Dumux::DefaultPartialReassembler; Comm = Dune::Communication<ompi_communicator_t*>; Dumux::NewtonSolver<Assembler, LinearSolver, Reassembler, Comm>::ResidualVector = Dune::MultiTypeBlockVector<Dune::BlockVector<Dune::FieldVector<double, 1>, std::allocator<Dune::FieldVector<double, 1> > >, Dune::BlockVector<Dune::FieldVector<double, 1>, std::allocator<Dune::FieldVector<double, 1> > > >]’:
/dumux/dumux/nonlinear/newtonsolver.hh:518:25: required from ‘void Dumux::NewtonSolver<Assembler, LinearSolver, Reassembler, Comm>::solveLinearSystem(Dumux::NewtonSolver<Assembler, LinearSolver, Reassembler, Comm>::ResidualVector&) [with Assembler = Dumux::MultiDomainFVAssembler<Dumux::MultiDomainTraits<Dumux::Properties::TTag::AngeliTestMomentum, Dumux::Properties::TTag::AngeliTestMass>, Dumux::FCStaggeredFreeFlowCouplingManager<Dumux::MultiDomainTraits<Dumux::Properties::TTag::AngeliTestMomentum, Dumux::Properties::TTag::AngeliTestMass> >, Dumux::DiffMethod::numeric>; LinearSolver = Dumux::UMFPackBackend; Reassembler = Dumux::DefaultPartialReassembler; Comm = Dune::Communication<ompi_communicator_t*>; Dumux::NewtonSolver<Assembler, LinearSolver, Reassembler, Comm>::ResidualVector = Dune::MultiTypeBlockVector<Dune::BlockVector<Dune::FieldVector<double, 1>, std::allocator<Dune::FieldVector<double, 1> > >, Dune::BlockVector<Dune::FieldVector<double, 1>, std::allocator<Dune::FieldVector<double, 1> > > >]’
/dumux/dumux/nonlinear/newtonsolver.hh:982:17: required from ‘bool Dumux::NewtonSolver<Assembler, LinearSolver, Reassembler, Comm>::solve_(typename Dumux::NewtonSolver<Assembler, LinearSolver, Reassembler, Comm>::ParentType::Variables&) [with Assembler = Dumux::MultiDomainFVAssembler<Dumux::MultiDomainTraits<Dumux::Properties::TTag::AngeliTestMomentum, Dumux::Properties::TTag::AngeliTestMass>, Dumux::FCStaggeredFreeFlowCouplingManager<Dumux::MultiDomainTraits<Dumux::Properties::TTag::AngeliTestMomentum, Dumux::Properties::TTag::AngeliTestMass> >, Dumux::DiffMethod::numeric>; LinearSolver = Dumux::UMFPackBackend; Reassembler = Dumux::DefaultPartialReassembler; Comm = Dune::Communication<ompi_communicator_t*>; typename Dumux::NewtonSolver<Assembler, LinearSolver, Reassembler, Comm>::ParentType::Variables = Dune::MultiTypeBlockVector<Dune::BlockVector<Dune::FieldVector<double, 1>, std::allocator<Dune::FieldVector<double, 1> > >, Dune::BlockVector<Dune::FieldVector<double, 1>, std::allocator<Dune::FieldVector<double, 1> > > >; Dumux::NewtonSolver<Assembler, LinearSolver, Reassembler, Comm>::ParentType = Dumux::PDESolver<Dumux::MultiDomainFVAssembler<Dumux::MultiDomainTraits<Dumux::Properties::TTag::AngeliTestMomentum, Dumux::Properties::TTag::AngeliTestMass>, Dumux::FCStaggeredFreeFlowCouplingManager<Dumux::MultiDomainTraits<Dumux::Properties::TTag::AngeliTestMomentum, Dumux::Properties::TTag::AngeliTestMass> >, Dumux::DiffMethod::numeric>, Dumux::UMFPackBackend>]’
/dumux/dumux/nonlinear/newtonsolver.hh:351:36: required from ‘void Dumux::NewtonSolver<Assembler, LinearSolver, Reassembler, Comm>::solve(typename Dumux::NewtonSolver<Assembler, LinearSolver, Reassembler, Comm>::ParentType::Variables&, Dumux::NewtonSolver<Assembler, LinearSolver, Reassembler, Comm>::TimeLoop&) [with Assembler = Dumux::MultiDomainFVAssembler<Dumux::MultiDomainTraits<Dumux::Properties::TTag::AngeliTestMomentum, Dumux::Properties::TTag::AngeliTestMass>, Dumux::FCStaggeredFreeFlowCouplingManager<Dumux::MultiDomainTraits<Dumux::Properties::TTag::AngeliTestMomentum, Dumux::Properties::TTag::AngeliTestMass> >, Dumux::DiffMethod::numeric>; LinearSolver = Dumux::UMFPackBackend; Reassembler = Dumux::DefaultPartialReassembler; Comm = Dune::Communication<ompi_communicator_t*>; typename Dumux::NewtonSolver<Assembler, LinearSolver, Reassembler, Comm>::ParentType::Variables = Dune::MultiTypeBlockVector<Dune::BlockVector<Dune::FieldVector<double, 1>, std::allocator<Dune::FieldVector<double, 1> > >, Dune::BlockVector<Dune::FieldVector<double, 1>, std::allocator<Dune::FieldVector<double, 1> > > >; Dumux::NewtonSolver<Assembler, LinearSolver, Reassembler, Comm>::ParentType = Dumux::PDESolver<Dumux::MultiDomainFVAssembler<Dumux::MultiDomainTraits<Dumux::Properties::TTag::AngeliTestMomentum, Dumux::Properties::TTag::AngeliTestMass>, Dumux::FCStaggeredFreeFlowCouplingManager<Dumux::MultiDomainTraits<Dumux::Properties::TTag::AngeliTestMomentum, Dumux::Properties::TTag::AngeliTestMass> >, Dumux::DiffMethod::numeric>, Dumux::UMFPackBackend>; Dumux::NewtonSolver<Assembler, LinearSolver, Reassembler, Comm>::TimeLoop = Dumux::TimeLoopBase<double>]’
/dumux/test/freeflow/navierstokes/angeli/main.cc:177:30: required from here
/dumux/dumux/nonlinear/newtonsolver.hh:1122:38: warning: ‘std::enable_if_t<((! Dumux::linearSolverAcceptsMultiTypeMatrix<LS>()) && Dumux::isMultiTypeBlockVector<V>()), bool> Dumux::NewtonSolver<Assembler, LinearSolver, Reassembler, Comm>::solveLinearSystemImpl_(LinearSolver&, Dumux::NewtonSolver<Assembler, LinearSolver, Reassembler, Comm>::JacobianMatrix&, Dumux::NewtonSolver<Assembler, LinearSolver, Reassembler, Comm>::ResidualVector&, Dumux::NewtonSolver<Assembler, LinearSolver, Reassembler, Comm>::ResidualVector&) [with LS = Dumux::UMFPackBackend; V = Dune::MultiTypeBlockVector<Dune::BlockVector<Dune::FieldVector<double, 1>, std::allocator<Dune::FieldVector<double, 1> > >, Dune::BlockVector<Dune::FieldVector<double, 1>, std::allocator<Dune::FieldVector<double, 1> > > >; Assembler = Dumux::MultiDomainFVAssembler<Dumux::MultiDomainTraits<Dumux::Properties::TTag::AngeliTestMomentum, Dumux::Properties::TTag::AngeliTestMass>, Dumux::FCStaggeredFreeFlowCouplingManager<Dumux::MultiDomainTraits<Dumux::Properties::TTag::AngeliTestMomentum, Dumux::Properties::TTag::AngeliTestMass> >, Dumux::DiffMethod::numeric>; LinearSolver = Dumux::UMFPackBackend; Reassembler = Dumux::DefaultPartialReassembler; Comm = Dune::Communication<ompi_communicator_t*>; std::enable_if_t<((! 
Dumux::linearSolverAcceptsMultiTypeMatrix<LS>()) && Dumux::isMultiTypeBlockVector<V>()), bool> = bool; Dumux::NewtonSolver<Assembler, LinearSolver, Reassembler, Comm>::JacobianMatrix = Dune::MultiTypeBlockMatrix<Dune::MultiTypeBlockVector<Dune::BCRSMatrix<Dune::FieldMatrix<double, 1, 1>, std::allocator<Dune::FieldMatrix<double, 1, 1> > >, Dune::BCRSMatrix<Dune::FieldMatrix<double, 1, 1>, std::allocator<Dune::FieldMatrix<double, 1, 1> > > >, Dune::MultiTypeBlockVector<Dune::BCRSMatrix<Dune::FieldMatrix<double, 1, 1>, std::allocator<Dune::FieldMatrix<double, 1, 1> > >, Dune::BCRSMatrix<Dune::FieldMatrix<double, 1, 1>, std::allocator<Dune::FieldMatrix<double, 1, 1> > > > >; Dumux::NewtonSolver<Assembler, LinearSolver, Reassembler, Comm>::ResidualVector = Dune::MultiTypeBlockVector<Dune::BlockVector<Dune::FieldVector<double, 1>, std::allocator<Dune::FieldVector<double, 1> > >, Dune::BlockVector<Dune::FieldVector<double, 1>, std::allocator<Dune::FieldVector<double, 1> > > >]’ is deprecated: After 3.7 Newton will no longer support conversion of multitype matrices for solvers that don't support this feature! [-Wdeprecated-declarations]
1122 | return solveLinearSystemImpl_(this->linearSolver(),
| ~~~~~~~~~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~
1123 | this->assembler().jacobian(),
| ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
1124 | deltaU,
| ~~~~~~~
1125 | this->assembler().residual());
| ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/dumux/dumux/nonlinear/newtonsolver.hh:1183:5: note: declared here
 1183 |     solveLinearSystemImpl_(LinearSolver& ls,

Milestone: 3.7 · Assignee: Timo Koch

---

**Issue #1229: Building Test FreeFlow Navier Stokes Kovasznay has an error**
(https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1229 · Hamza Oukili · updated 2023-02-28)
**Bug report**
**What happened / Problem description**:
The CI was passing: build and test with no errors. When building Test FreeFlow Navier Stokes Kovasznay:

/dumux/test/freeflow/navierstokes/kovasznay/main.cc:159:26: required from here
/dumux/dumux/linear/istlsolvers.hh:205:58: error: no matching function for call to ‘Dumux::ParallelVectorHelper<Dune::GridView<Dune::DefaultLeafGridViewTraits<const Dune::YaspGrid<2, Dune::EquidistantOffsetCoordinates<double, 2> > > >, Dune::MultipleCodimMultipleGeomTypeMapper<Dune::GridView<Dune::DefaultLeafGridViewTraits<const Dune::YaspGrid<2, Dune::EquidistantOffsetCoordinates<double, 2> > > > >, 0>::makeNonOverlappingConsistent(Dune::MultiTypeBlockVector<Dune::BlockVector<Dune::FieldVector<double, 1>, std::allocator<Dune::FieldVector<double, 1> > >, Dune::BlockVector<Dune::FieldVector<double, 1>, std::allocator<Dune::FieldVector<double, 1> > > >&)’
205 | vectorHelper.makeNonOverlappingConsistent(y);
**What you expected to happen**:
No error and build the test
**How to reproduce it (as minimally and precisely as possible)**:
Build Test FreeFlow Navier Stokes Kovasznay with MPI
**Anything else we need to know?**:
When running locally with MPI set to false, the test builds.
**Environment**:
- Dune version: master
- DuMux version: master
- Others:

Milestone: 3.7

---

**Issue #1228: Add unit tests for solvers/preconditioners**
(https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1228 · Timo Koch · updated 2023-02-24)

We have some solvers and preconditioners and it would be nice to have unit tests. Essentially we can just use the dune unit tests and run them for the additional solvers/preconditioners in dumux. (I don't mean the IstlLinearSolvers but solver components, e.g. in `dumux/linear/preconditioners.hh`.) But it also doesn't hurt to add unit tests for all solvers. A good idea could be (as in dune) to solve a simple Laplace problem.

---

**Issue #1227: [fix] MultiDomainNewtonConvergenceWriter needs to be adapted to use residualVector**
(https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1227 · Anna Mareike Kostelecky · updated 2023-03-14)

In merge request !3385 (related commit caba229b), the type `SolutionVector` was distinguished from the type `ResidualVector`.
This change seems to have been forgotten for the `MultiDomainNewtonConvergenceWriter`. This should be fixed.

Milestone: 3.7 · Assignee: Anna Mareike Kostelecky

---

**Issue #1226: [ci] test with dune 2.9**
(https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1226 · Timo Koch · updated 2023-02-23)

The release will only support dune 2.9 and not work with dune 2.8 anymore.
This also allows simplifying some Python stuff (#1225).

Milestone: 3.7 · Assignee: Hamza Oukili

---

**Issue #1225: [python][cmake] Simplify Python CMake code with dune 2.9**
(https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1225 · Timo Koch · updated 2023-03-09)

Also remove documentation for setup with dune 2.8.

Milestone: 3.7 · Assignee: Mathis Kelm

---

**Issue #1224: Warning with Dune2.9 and Dune master**
(https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1224 · Hamza Oukili · updated 2023-02-23)

When building dumux tests that call YaspGrid, I have the warnings below. Does anyone know if they are related to dumux or completely within dune?
```
In file included
from /dumux/dune-grid/dune/grid/yaspgrid/torus.hh:23
from /dumux/dune-grid/dune/grid/yaspgrid.hh:69
from /dumux/dumux/test/porousmediumflow/1p/isothermal/properties.hh:29
from /dumux/dumux/test/porousmediumflow/1p/isothermal/main.cc:35:
/dumux/dune-grid/dune/grid/yaspgrid/partitioning.hh: In instantiation of ‘Dune::YLoadBalance<d>::~YLoadBalance() [with int d = 2]’:
/dumux/dune-grid/dune/grid/yaspgrid/partitioning.hh:195:37: required from ‘Dune::YLoadBalanceForward<d>::~YLoadBalanceForward() [with int d = 2]’
/dumux/dune-grid/dune/grid/yaspgrid/partitioning.hh:232:9: required from ‘void Dumux::GridManager<Dune::YaspGrid<dim, Coordinates> >::init(const GlobalPosition&, const std::array<int, dim>&, const string&, int, std::bitset<dim>) [with Coordinates = Dune::EquidistantCoordinates<double, 2>; int dim = 2; Dumux::GridManager<Dune::YaspGrid<dim, Coordinates> >::GlobalPosition = Dune::FieldVector<double, 2>; std::string = std::__cxx11::basic_string<char>]’
/dumux/dumux/dumux/io/grid/gridmanager_yasp.hh:100:21: required from ‘void Dumux::GridManager<Dune::YaspGrid<dim, Coordinates> >::init(const string&) [with Coordinates = Dune::EquidistantCoordinates<double, 2>; int dim = 2; std::string = std::__cxx11::basic_string<char>]’
/dumux/dumux/test/porousmediumflow/1p/isothermal/main.cc:74:21: required from here
/dumux/dune-grid/dune/grid/yaspgrid/partitioning.hh:180:30: warning: ‘Dune::YLoadBalance<d>::~YLoadBalance() [with int d = 2]’ is deprecated: use the new interface of Yasp::Partitioning [-Wdeprecated-declarations]
180 | virtual ~YLoadBalance() {}
| ^
/dumux/dune-grid/dune/grid/yaspgrid/partitioning.hh:180:13: note: declared here
180 | virtual ~YLoadBalance() {}
| ^
```

Assignee: Hamza Oukili

---

**Issue #1223: Remove solver deprecation warnings**
(https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1223 · Timo Koch · updated 2023-03-20)

* [ ] After merging !3386 there are deprecation warnings concerning the solvers for many tests. The solvers should be updated to use the new IstlSolverBackend.

Milestone: 3.7 · Assignee: Timo Koch

---

**Issue #1222: In DuneVectorType add a check for blockLevel requirements or implement recursively**
(https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1222 · Timo Koch · updated 2023-02-22)

The following discussion from !3385 should be addressed:
- [ ] @DennisGlaeser started a [discussion](https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/merge_requests/3385#note_84501): (+2 comments)
> I think it could be good to statically check that the vector has a depth of 2 here, since this seems to be the case? Otherwise one gets a surprising result for vectors of different depth (although unlikely, and it would probably not compile due to "misuse" of the resulting vector type in other places)...
We could either add a static_assert or try to do the same recursion as for multitypevector to recursively use the template to convert block types.

---

**Issue #1221: Anisotropic permeability Law**
(https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1221 · Johannes Hommel · updated 2024-03-25)

**What does this feature / why does DuMux need it**:
An anisotropic permeability law is needed to account for precipitation having a different impact on the permeability in different directions. This is necessary for modeling a developing anisotropy due to precipitation as observed in the microfluidic experiments of the CRC1313, Project C04 by Felix Weinhardt.
My goal would be to have a permeability law that uses different exponents in a power law for each of the directions:
kxxFactor = (poro/refPoro)^exponentX
kyyFactor = (poro/refPoro)^exponentY

K = | kxxFactor * Kxx_0          0          |
    |         0          kyyFactor * Kyy_0  |

I guess the easiest would be to have a matrix multiplication of the initial permeability K_0 with a "factor matrix" F:

K = K_0 * F

with

K_0 = | Kxx_0    0   |
      |   0    Kyy_0 |

and

F = | kxxFactor     0     |
    |     0     kyyFactor |
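The factor-matrix idea above can be sketched in a few lines; this is purely illustrative Python (function and argument names are made up here, not DuMux API):

```python
# Illustrative sketch of the proposed factor-matrix approach; all names
# (k0_xx, exponent_x, ...) are invented for this example, not DuMux API.
def anisotropic_permeability(k0_xx, k0_yy, poro, ref_poro, exponent_x, exponent_y):
    """K = K_0 * F with direction-wise power-law factors on the diagonal."""
    f_xx = (poro / ref_poro) ** exponent_x
    f_yy = (poro / ref_poro) ** exponent_y
    # Both K_0 and F are diagonal, so the matrix product reduces to
    # scaling the diagonal entries.
    return [[k0_xx * f_xx, 0.0],
            [0.0, k0_yy * f_yy]]

# Porosity dropping from 0.4 to 0.3 reduces kxx and kyy by different amounts.
K = anisotropic_permeability(1e-12, 2e-12, poro=0.3, ref_poro=0.4,
                             exponent_x=3.0, exponent_y=5.0)
```

Since both K_0 and F are diagonal here, K_0 * F and F * K_0 coincide; for full tensors one would have to decide on the multiplication order (or use a symmetric scaling).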
**Which issue does this feature fix (if any)**
This issue/feature does not fix any other open issues.
**Anything else we need to know?**:
I would like to discuss whether there is an elegant, general solution for implementing anisotropic permeability laws recycling the current permeability laws (assuming isotropic permeability change) or whether I should just implement a new permeability law for my specific case.
Also, my question would be how to do this potentially even more generalized for 2D and 3D in one permeability law, if possible.

Milestone: 3.9 · Assignee: Johannes Hommel

---

**Issue #1220: PDESolver reuse matrix doesn't work in parallel**
(https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1220 · Timo Koch · updated 2023-03-20)

Can be tested with !3380 by adding
```
assembler.assembleJacobianAndResidual(sol);
solver.reuseMatrix(true);
```
I suspect that the parallel solver adapts the matrix every time and this operation is probably additive.
In this case, we would need to let the solver know that the matrix is not to be changed because it's already been prepared for a parallel solve.

Milestone: 3.7 · Assignee: Timo Koch

---

**Issue #1218: Code quality report Gitlab+cppcheck**
(https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1218 · Timo Koch · updated 2023-02-10)

Using this tool https://gitlab.com/ahogen/cppcheck-codequality
we might be able to integrate the cppcheck report with Gitlab.

---

**Issue #1217: Parallel computing for Navier-Stokes**
(https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1217 · Mojtaba Barzegari · updated 2023-02-28)

Hi,
I have difficulty finding proper examples of parallel execution of coupled Navier-Stokes problems (like https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/tree/master/test/freeflow/navierstokes/channel/3d_nonuniform). When I apply the typical workflow for enabling parallel run (as described in the handbook, by including `<dumux/linear/amgbackend.hh>` and replacing the sequential solver with `using LinearSolver = AMGBiCGSTABBackend<LinearSolverTraits<GridGeometry>>; auto linearSolver = std::make_shared<LinearSolver>(leafGridView, gridGeometry->dofMapper());`), I face a couple of errors similar to this:
/dune-istl/dune/istl/novlpschwarz.hh:80:42: error: no type named ‘ConstColIterator’ in ‘class Dune::MultiTypeBlockMatrix<Dune::MultiTypeBlockVector<Dune::BCRSMatrix<Dune::FieldMatrix<double, 3, 3>, std::allocator<Dune::FieldMatrix<double, 3, 3> > >, Dune::BCRSMatrix<Dune::FieldMatrix<double, 3, 1>, std::allocator<Dune::FieldMatrix<double, 3, 1> > > >, Dune::MultiTypeBlockVector<Dune::BCRSMatrix<Dune::FieldMatrix<double, 1, 3>, std::allocator<Dune::FieldMatrix<double, 1, 3> > >, Dune::BCRSMatrix<Dune::FieldMatrix<double, 1, 1>, std::allocator<Dune::FieldMatrix<double, 1, 1> > > > >’
I checked it with both `MomentumGridGeometry` and `MassGridGeometry`, both resulting in the same error. I see that there are quite a few examples and tests for `AMGBiCGSTABBackend` for Stokes free flow and flow in porous media, but I couldn't find any relevant thing for NS where mass and momentum equations are coupled. Is parallel AMG backend not supported for such coupled problems?
I'm checking all these things with DuMux 3.6.

---

**Issue #1216: [frictionlaws] Roughnessheight calculation in friction laws**
(https://git.iws.uni-stuttgart.de/dumux-repositories/dumux/-/issues/1216 · Leopold Stadler · updated 2023-03-09)

**Problem**
Currently, the calculated shear stress of friction laws is limited to avoid unphysically high values for small water depths. Therefore a roughness height (`roughnessHeight`) is calculated/defined inside the friction law to estimate the lower and upper height for the limitation. In applications with extremely small water depths this may lead to unwanted effects.
**Solution A**
The user should define the `roughnessHeight` (default = 0.0) and a factor (default = 2.0) used to compute a height beneath which the limiting should be applied. Both values will be optional arguments of the friction law. The user will be able to avoid the limiting by setting `roughnessHeight` to zero. This change will force users to define the limiting for cases with small water depths to ensure good convergence properties. Note that an unphysically high shear stress can change the flow direction.
**Solution B**
Add unlimited versions of Nikuradse and Manning (e.g. unlimitedNikuradse and unlimitedManning), deprecate the existing Manning and Nikuradse, and rename them as limitedManning and limitedNikuradse. Further improve the documentation of the limited versions. The current version, with the estimation of a roughnessHeight and the applied limiting, is plausible and gives good results.

Assignee: Leopold Stadler
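Both solutions hinge on the same limiting mechanism. As a rough sketch of the Solution A interface (hypothetical names, with a simple linear ramp standing in for the actual DuMux limiting formula):

```python
# Illustrative sketch of the Solution A interface; the actual limiting
# formula in the DuMux friction laws is not reproduced here -- a linear
# ramp below factor * roughness_height stands in for it.
def shear_stress_limiter(water_depth, roughness_height=0.0, factor=2.0):
    """Multiplier in [0, 1] applied to the computed shear stress."""
    # roughness_height == 0.0 (the proposed default) disables limiting.
    if roughness_height <= 0.0:
        return 1.0
    upper = factor * roughness_height
    if water_depth >= upper:
        # Deep enough: keep the physical shear stress unchanged.
        return 1.0
    # Shallow: ramp the stress down to avoid unphysically high values,
    # which could otherwise even flip the flow direction.
    return max(water_depth, 0.0) / upper
```

With the proposed default `roughness_height = 0.0` the limiter is a no-op, so existing setups would be unaffected unless they opt in.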