dumux-repositories / dumux · Issues · #373
Closed
Created Mar 28, 2017 by Beatrix Becker (@becker), Contributor

Tricky tests

Some of our tests fail depending on, e.g., the machine or the solver used. For these tests, small changes seem to make a large difference. This could be a bug in a test, but it could also be a very sensitive system; in the latter case, more stable tests need to be defined. Below is a list of the tests that failed on my computer (but did not fail on the buildbot). Please add further tests that show such behavior in the comments.

  • test_zeroeq (works with SuperLU, not with Umfpack)
  • test_el1p2c
  • test_boxadaptive2p (works with SuperLU, not with Umfpack)
  • test_2cstokes2p2c
  • test_2cnistokes2p2cni
  • test_2cnistokes2p2cni_boundarylayer
  • test_2czeroeq2p2c
  • test_2cnizeroeq2p2cni
  • test_forchheimer2p
  • test_boxmpnckinetic
  • test_cc2pncmin
  • test_stokes
  • lens2pexercise3
  • co2plumeshapeexercise
  • fuelcell
  • naplinfiltration3p3c
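One way to make such tests less sensitive to the machine or the linear solver (e.g. SuperLU vs. UMFPACK) is to compare solution fields against the reference with a combined relative/absolute tolerance instead of requiring exact agreement. A minimal sketch of such a check in Python follows; the function name, tolerances, and example values are illustrative assumptions, not DuMuX's actual test infrastructure:

```python
# Sketch of a tolerance-based field comparison for regression tests.
# rel_tol/abs_tol values here are illustrative, not DuMuX defaults.

def fields_match(reference, candidate, rel_tol=1e-2, abs_tol=1e-9):
    """Return True if every value pair agrees within a combined
    relative/absolute tolerance (similar in spirit to math.isclose)."""
    if len(reference) != len(candidate):
        return False
    for r, c in zip(reference, candidate):
        # Scale the relative tolerance by the larger magnitude,
        # and fall back to abs_tol for values near zero.
        if abs(r - c) > max(rel_tol * max(abs(r), abs(c)), abs_tol):
            return False
    return True

# Solver-dependent round-off stays within tolerance ...
ref = [1.0, 2.0, 3.0]
ok = [1.0000001, 2.0000002, 3.0]
# ... while a genuine deviation in one field is still caught.
bad = [1.0, 2.5, 3.0]
print(fields_match(ref, ok))   # small perturbation passes
print(fields_match(ref, bad))  # 25% deviation fails
```

With a check like this, a test only fails when the solution actually deviates, rather than when two solvers produce differently rounded but physically equivalent results; choosing the tolerances per test is the remaining judgment call.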