Usage#

As with any other simulation software, getting from a question to an answer requires

  1. Pre-Processing

  2. Running the Simulation

  3. Post-Processing

Since the materialpoint model of DAMASK can be combined with a number of different solvers for initial and boundary value problems, a further initial step is necessary:

  0. Solver Selection

0. Solver Selection#

The table below compares the capabilities and specifics of the three available solvers to help select the most suitable one for the problem at hand.

Feature             | Grid Solver           | Mesh Solver       | MSC Marc
--------------------|-----------------------|-------------------|------------------
included in DAMASK  | yes                   | yes               | no
open source         | yes                   | yes               | no
solution method     | spectral (FFT) or FEM | FEM               | FEM
geometry            | regular grid          | unstructured mesh | unstructured mesh
boundary conditions | mixed periodic        | tbd               | complex

Warning

The mesh solver is under development. It lacks features and has convergence issues.

1. Pre-Processing#

DAMASK Materialpoint Model#

The materialpoint model of DAMASK is configured through a YAML file whose format is documented in Materialpoint Configuration. Additional information is given in the video tutorial “Configure a DAMASK Simulation” and the Jupyter notebook “Create Texture for a Dual-Phase Microstructure”.

A set of curated configuration files is available as config.tar.xz.
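
Such a configuration can also be assembled with the DAMASK Python library. The following is a minimal sketch, assuming a phase named Aluminum and a homogenization named SX; both names are placeholders, and their model parameters still need to be defined in the phase and homogenization sections of the resulting file.

import damask

# start from an empty materialpoint configuration
mat = damask.ConfigMaterial()

# add 20 materials of phase 'Aluminum' with random orientations,
# all using the homogenization 'SX'
mat = mat.material_add(O=damask.Rotation.from_random(20),
                       phase='Aluminum',
                       homogenization='SX')

mat.save('material.yaml')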

Geometry and Load#

The procedure for generating the geometry and setting boundary and initial conditions depends on the selected solver.

Grid Solver#

A grid solver simulation is set up with a Geometry and a Load Case file. The video tutorials “Define a Grain Structure on a Regular Grid” and “Boundary Conditions on an Infinite Body” explain the underlying ideas of these file formats. Examples are given in the Jupyter notebooks “Generate a Three-Step Load Case for the Grid Solver”, “Create a Polycrystal with Voronoi Tessellation for the Grid Solver”, and “Rescale a Grid Solver Geometry File”.

A complete simulation setup is available as grid.tar.xz.
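
As a sketch of a scripted setup with the DAMASK Python library, the following generates a Voronoi-tessellated polycrystal on a regular grid and writes a minimal uniaxial tension load case. All file names, dimensions, and counts are placeholders, and the class is named Grid instead of GeomGrid in older DAMASK 3 releases.

import numpy as np
import damask

cells = np.array([32, 32, 32])            # grid resolution
size  = np.array([1.0, 1.0, 1.0]) * 1e-5  # physical edge lengths in m

# 20 random seed points, one per grain
seeds = damask.seeds.from_random(size, 20, cells)

# tessellate and store as VTK image data for DAMASK_grid
grid = damask.GeomGrid.from_Voronoi_tessellation(cells, size, seeds)
grid.save('polycrystal')                  # writes polycrystal.vti

# minimal single-step tension load case, written as plain YAML text;
# 'x' marks components complemented by the conjugate P/dot_F entry
with open('tension.yaml', 'w') as f:
    f.write("""
solver: {mechanical: spectral_basic}
loadstep:
  - boundary_conditions:
      mechanical:
        dot_F: [[1.0e-3, 0, 0], [0, x, 0], [0, 0, x]]
        P: [[x, x, x], [x, 0, x], [x, x, 0]]
    discretization: {t: 10, N: 40}
""")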

Mesh Solver#

A mesh solver simulation is set up with a Geometry and a Load Case file.

A complete simulation setup is available as mesh.tar.xz.

MSC Marc#

An MSC Marc input deck (*.dat) can be written manually in a text editor or generated with Marc Mentat, where *.proc files can be used for automation. DAMASK is interfaced to MSC Marc through a hypela2 user subroutine. The link between the geometry in the input deck and the material ID in material.yaml is established via the State Variable 2 field.

Note

Material IDs in DAMASK are zero-based.

A complete simulation setup is available as Marc.tar.xz.

2. Running the Simulation#

Each solver is an executable and can therefore be invoked directly from the command line. The specifics of how to start a simulation, such as command line arguments or how to parallelize, depend on the selected solver.

Grid Solver#

DAMASK_grid --load {load}.yaml --geom {grid}.vti --material {material}.yaml

Note

A pair of curly brackets is a placeholder for a file name.

Mandatory flags#

Launching the grid solver requires three mandatory flags:

  • --load {load}.yaml specifies the Load Case file.

  • --geom {grid}.vti specifies the Geometry file.

  • --material {material}.yaml specifies the Materialpoint Configuration file.

Optional flags#

To further adjust the simulation, the following optional flags can be used:

  • --numerics {numerics}.yaml (short: -n) configures the algorithms used in DAMASK as outlined in Numerics Configuration.

  • --jobname {jobname} (short: -j) sets the job name, which serves as the name of the Result (DADF5) file. The default jobname is {grid}_{load}_{material}.

  • --workingdirectory {directory} (short: -w) sets the working directory of the simulation. Any relative paths are interpreted as relative to the working directory. The default working directory is the current directory.

  • --restart {jobname}_restart.hdf5 (short: -r) restarts a simulation from a snapshot file. How frequently (if at all) snapshots are written is determined in the Load Case file.

  • --help (short: -h) reports the current version of the grid solver and provides usage instructions.

Parallelization#

To parallelize the grid solver with n_threads OpenMP threads, set the environment variable OMP_NUM_THREADS accordingly.

export OMP_NUM_THREADS={n_threads}
DAMASK_grid --load {load}.yaml --geom {grid}.vti --material {material}.yaml

Parallelization over n_proc MPI processes can be accomplished with

mpiexec -n {n_proc} DAMASK_grid --load {load}.yaml --geom {grid}.vti --material {material}.yaml

and will decompose the simulation domain along the z-axis into n_proc layers of approximately equal height.

Note

MPI and OpenMP parallelization can be used concurrently.

Mesh Solver#

DAMASK_mesh --load {load}.yaml --geom {mesh}.msh --material {material}.yaml

Note

A pair of curly brackets is a placeholder for a file name.

Mandatory flags#

Launching the mesh solver requires three mandatory flags:

  • --load {load}.yaml specifies the Load Case file.

  • --geom {mesh}.msh specifies the Geometry file.

  • --material {material}.yaml specifies the Materialpoint Configuration file.

Optional flags#

To further adjust the simulation, the following optional flags can be used:

  • --numerics {numerics}.yaml (short: -n) configures the algorithms used in DAMASK as outlined in Numerics Configuration.

  • --jobname {jobname} (short: -j) sets the job name, which serves as the name of the Result (DADF5) file. The default jobname is {mesh}_{load}_{material}.

  • --workingdirectory {directory} (short: -w) sets the working directory of the simulation. Any relative paths are interpreted as relative to the working directory. The default working directory is the current directory.

  • --help (short: -h) reports the current version of the mesh solver and provides usage instructions.

Parallelization#

To parallelize the mesh solver with n_threads OpenMP threads, set the environment variable OMP_NUM_THREADS accordingly.

export OMP_NUM_THREADS={n_threads}
DAMASK_mesh --load {load}.yaml --geom {mesh}.msh --material {material}.yaml

Parallelization over n_proc MPI processes can be accomplished with

mpiexec -n {n_proc} DAMASK_mesh --load {load}.yaml --geom {mesh}.msh --material {material}.yaml

and will decompose the simulation domain into partitions containing approximately equal numbers of elements.

Note

MPI and OpenMP parallelization can be used concurrently.

MSC Marc#

DAMASK is integrated into the commercial FEM software MSC Marc as a user subroutine hypela2 via DAMASK_Marc.f90.

Simulations can be started from the JOBDAMASK menu that gets integrated into Marc Mentat during installation.

Alternatively, the DAMASK Python library contains a small wrapper that assembles the execution call and launches the MSC Marc executable.

import damask

# modelname and jobname are placeholders for the name of the
# input deck ({model}.dat) and of the job defined within it
s = damask.solver.Marc()
s.submit_job(model=modelname, job=jobname)

The mandatory material configuration must be available in the current working directory as material.yaml; the optional numerics configuration is read from numerics.yaml in the same directory.

3. Post-Processing#

DAMASK results are stored in an HDF5-based file format. Usage examples are given in the video tutorials “Get a Custom View and Add Derived Quantities”, “Spatially-Resolved Visualization in Paraview”, and “Data Analysis: Using Matplotlib and Pandas”, as well as in the Jupyter notebooks “Add Derived Field Data”, “Density Plot with Pandas”, “Plot Data per Grain with Scatter”, and “Calculate r-Value”.
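
As a minimal sketch of scripted post-processing with the DAMASK Python library: the result file name is a placeholder, and in older DAMASK 3 releases export_VTK is named save_VTK.

import damask

# open the result (DADF5) file written by the solver
r = damask.Result('polycrystal_tension_material.hdf5')

# derive common field quantities from the stored P and F
r.add_stress_Cauchy()                        # Cauchy stress 'sigma'
r.add_strain()                               # strain 'epsilon_V^0.0(F)'
r.add_equivalent_Mises('sigma')              # von Mises equivalent stress
r.add_equivalent_Mises('epsilon_V^0.0(F)')   # von Mises equivalent strain

# export all increments for spatially-resolved visualization in ParaView
r.export_VTK()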