Clean incorporation of new time integration methods
@timok, @bernd, @kweis, @martins and I are currently discussing and developing the incorporation of a generic time integration framework in Dumux. In general, time handling is currently problematic in Dumux and leads to bugs in `MultiDomain` (#792, #619).
The following work plan is currently envisioned to incorporate the features into an `Experimental` namespace while guaranteeing that the current features and tests on master still work:
- introduce new grid variables concept, where grid variables represent the complete state of a simulation: not only secondary but also primary variables and possibly a time level (see !2285 (merged))
- introduce new assembly concept: make `PDESolver` accept both assemblers that assemble around given `SolutionVector`s and assemblers that work on more generic `Variables` (see !2291 (merged))
- add time step methods (see !2296 (merged))
- add a generic version of `FVAssembler`, which assembles around `Variables` and uses the time integration methods (see !2519)
- introduce a solution state class (name to be discussed) that substitutes `elementSolution` during the assembly, that is, as argument to volume variables updates and in spatial parameters interfaces. The concept of this state class is to carry time information in addition to the element solution, which can be used within user interfaces (!2520)
- introduce a context class that wraps the local views after `bind` in order to pass them into the user interfaces. This reduces the number of arguments in a bunch of interfaces. Moreover, we usually have interfaces like `function(element, fvGeometry, elemVolVars, ...)`, but `fvGeometry` makes little sense if not bound to `element` anyway, and it also carries the bound element. By the same argument, `elemVolVars` are basically unusable if you don't have the `scv`s at hand to access the corresponding volume variables. So in all those interfaces it makes sense to group the arguments. This is introduced together with the solution state in !2520
- extend problem/parameter/volvars interfaces to make it possible to inject a container with additionally required data, which in `MultiDomain` could be used to pass the coupling data. This is probably a lot of work and requires some thought regarding compatibility.
- port the above concepts to `MultiDomain` (first goal: make `test_el2p` work, fixing the main bug)
Edit 25.03: We may postpone the introduction of the additional container holding the coupling context for now and first realize multidomain in the new experimental framework, but still with the context stored centrally in `CouplingManager` (an outdated but working draft is in !2448). This way, we can still reuse most interfaces in the non-experimental namespace. Afterwards, we could address the issue of the central context separately, which would probably involve quite a few interface changes. With either approach, the bug of the non-converging Newton solver for poromechanics is addressed. Getting the context out of `CouplingManager` would additionally address thread safety in thread-parallel runs.
Problems that might need to be solved:
- `assemble()` functions in the assembler now receive non-const grid variables (introduced in 3d706804). That was necessary because, in the case of global caching, we actually deflect the variables that come in. We should maybe think of a concept to circumvent this and adapt !2519 accordingly.
Intermediate solutions/developments or related work that should be deleted in case we favour the propositions above:
Things that should be revisited and adapted once this is ready: