# Crashes
## Why does my MPI-parallelized simulation fail in the first increment?
This can happen when the wrong `mpiexec` or `mpirun` launcher is called.
The symptoms vary: either an MPI error occurs or the simulation does not converge.
To resolve this issue, use the launcher of the MPI library that was used to compile DAMASK.
> **Note:** For a self-compiled PETSc installation, `$PETSC_DIR/$PETSC_ARCH/lib/petsc/bin/petscmpiexec` is the safest option.
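As a minimal sketch, a grid simulation could then be started as follows (the process count and the load case and geometry file names are placeholders, not prescribed values):

```sh
# Launch DAMASK with the mpiexec wrapper of the PETSc installation
# that was used to compile it (file names are assumptions):
$PETSC_DIR/$PETSC_ARCH/lib/petsc/bin/petscmpiexec -n 4 \
  DAMASK_grid --load tensionX.yaml --geom 20grains16x16x16.vti
```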
## Why does my simulation exit with a segmentation fault or memory error during initialization?
This is usually the result of a stack size that is too small, which occurs mostly when DAMASK was compiled with the Intel compiler. To solve this issue, increase the stack size, e.g.
```sh
ulimit -s unlimited
```
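Note that `ulimit` only affects the current shell session, so it must be run in the same shell (or job script) that launches the simulation. A quick check before launching:

```sh
# Print the current stack size limit; it should read "unlimited":
ulimit -s
```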
## Why does my simulation exit with an HDF5 error?
This is usually caused by a file system that cannot perform file locking.
The file locking of HDF5 can be disabled by setting the environment variable `HDF5_USE_FILE_LOCKING` to `FALSE`.
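A minimal sketch for a Bourne-compatible shell:

```sh
# Disable HDF5 file locking for all subsequently launched processes:
export HDF5_USE_FILE_LOCKING=FALSE
```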
## Why does my thermo-mechanical simulation exit with a segmentation fault when using the FEM grid solver?
This is caused by a bug in PETSc that has been fixed in PETSc 3.16.4. To solve this issue, update to PETSc 3.16.4 or a newer release.
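To check which PETSc version is installed, one option (assuming `$PETSC_DIR` points at the installation) is to inspect the version header that PETSc ships:

```sh
# Print the PETSc version macros from the installed headers:
grep "#define PETSC_VERSION_" $PETSC_DIR/include/petscversion.h
```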