Usage#

As with any other simulation software, getting from a question to an answer requires

  1. Pre-Processing

  2. Running the Simulation

  3. Post-Processing

Since the materialpoint model of DAMASK can be combined with a number of different solvers for initial and boundary value problems, a further preliminary step is necessary:

  0. Solver Selection

0. Solver Selection#

The table below compares the capabilities and specifics of the three available solvers to help select the most suitable one for the problem at hand.

| Feature             | Grid Solver           | Mesh Solver       | MSC Marc          |
|---------------------|-----------------------|-------------------|-------------------|
| included in DAMASK  | yes                   | yes               | no                |
| open source         | yes                   | yes               | no                |
| solution method     | spectral (FFT) or FEM | FEM               | FEM               |
| geometry            | regular grid          | unstructured mesh | unstructured mesh |
| boundary conditions | mixed periodic        | tbd               | complex           |

Warning

The mesh solver is under development. It lacks features and has convergence issues.

1. Pre-Processing#

DAMASK Materialpoint Model#

The materialpoint model of DAMASK is configured through a YAML file; its format is documented in Materialpoint Configuration. Additional information is given in the video tutorial “Configure a DAMASK Simulation” and the Jupyter notebook on how to “Create Texture for a Dual-Phase Microstructure”.

A set of curated configuration files is available as config.tar.xz.
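
Such a configuration can also be assembled programmatically. The following is a minimal sketch using the damask Python library; ‘Aluminum’ and ‘SX’ are placeholder names for a phase and a homogenization scheme whose definitions still need to be added to the resulting file.

import damask

# Assemble 20 randomly oriented grains of a single placeholder phase
# and write the result to material.yaml.
config = damask.ConfigMaterial()
config = config.material_add(O=damask.Rotation.from_random(shape=20),
                             phase='Aluminum',
                             homogenization='SX')
config.save('material.yaml')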

Geometry and Load#

The procedure for generating the geometry and setting boundary and initial conditions depends on the selected solver.

Grid Solver#

A grid solver simulation is set up with a Geometry and a Load Case file. The video tutorials “Define a Grain Structure on a Regular Grid” and “Boundary Conditions on an Infinite Body” explain the underlying ideas of these file formats. Examples are given in the Jupyter notebooks “Generate a Three-Step Load Case for the Grid Solver”, “Create a Polycrystal with Voronoi Tessellation for the Grid Solver”, and “Rescale a Grid Solver Geometry File”.
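
For the geometry, a polycrystal can be generated along the lines of the Voronoi tessellation notebook. The following is a minimal sketch with illustrative numbers; it assumes a recent damask release, where the grid class is named GeomGrid (earlier releases use Grid).

import numpy as np
import damask

# Create a 20-grain periodic polycrystal by Voronoi tessellation
# on a 16 x 16 x 16 grid with an edge length of 1e-5 m.
size  = np.array([1.0e-5, 1.0e-5, 1.0e-5])
cells = np.array([16, 16, 16])
seeds = damask.seeds.from_random(size, 20, cells)
grid  = damask.GeomGrid.from_Voronoi_tessellation(cells, size, seeds)
grid.save('polycrystal')   # writes polycrystal.vti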

A complete simulation setup is available as grid.tar.xz.
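
The accompanying Load Case file is plain YAML and can be written with any tool; the following sketch uses PyYAML to create a single-step uniaxial tension case. All numbers are illustrative; a component marked 'x' in dot_F is controlled through the complementary entry in P, and vice versa.

import yaml

# Uniaxial tension along x: prescribe one deformation-rate component;
# components marked 'x' are controlled via the complementary tensor.
load_case = {
    'solver': {'mechanical': 'spectral_basic'},
    'loadstep': [{
        'boundary_conditions': {'mechanical': {
            'dot_F': [[1.0e-3,   0,   0],
                      ['x',      0,   0],
                      ['x',    'x',   0]],
            'P':     [['x', 'x', 'x'],
                      [  0, 'x', 'x'],
                      [  0,   0, 'x']]}},
        # an optional 'f_restart': N entry here would write restart
        # snapshots every N increments
        'discretization': {'t': 10.0, 'N': 40}}]}

with open('tensionX.yaml', 'w') as f:
    yaml.safe_dump(load_case, f)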

Mesh Solver#

A mesh solver simulation is set up with a Geometry and a Load Case file.

A complete simulation setup is available as mesh.tar.xz.

MSC Marc#

An MSC Marc input deck (*.dat) can be generated manually using a text editor or with MSC Mentat, where *.proc files can be used for automation. DAMASK is interfaced to MSC Marc through a hypela2 user subroutine. The link between the geometry in the input deck and the material ID in material.yaml is provided via the State Variable 2 field.

Note

Material IDs in DAMASK are zero-based.
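
Because of this numbering, it can help to print the implicit IDs from material.yaml before assigning State Variable 2 values in Mentat. A minimal sketch using the damask Python library:

import damask

# Entries in the 'material' list of material.yaml are implicitly
# numbered 0, 1, 2, ...; these are the values State Variable 2 must take.
mat = damask.ConfigMaterial.load('material.yaml')
for ID, m in enumerate(mat['material']):
    print(ID, [c['phase'] for c in m['constituents']])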

Setting up a model in MSC Mentat#
  1. Material definition

    The ‘DAMASK’ material is created via the menu Material Properties → New → Standard.

    • Type must be set to Hypoelastic.

    • Method must be set to User Sub. Hypela2.

    • This Material must be assigned to all elements.

  2. State variables

    State variables are defined via the menu Initial Conditions → New (State Variable) → General - User-Defined State Variable.

    • The State Variable ID must be set to 2.

    • The box Value has to be checked and the Value set to the material ID consistent with the respective entry in material.yaml.

    • Each Initial Condition created this way should be assigned to those elements that contain the material with ID Value.

    Marc uses State Variable 1 for the temperature. Hence, for temperature-aware constitutive models, it must be set as well: Initial Conditions → New (State Variable) → Thermal - Element Temperature.

    • The box Temperature has to be checked and set to the desired absolute temperature (in K).

    • Each Initial Condition created this way should be assigned to those elements that have this temperature.

    Note

    All Initial Conditions have to be activated in Jobs → Properties → Initial Loads → Initial Conditions.

  3. Job Settings

    A new Job is created via the menu Jobs → New → Structural or Thermal / Structural.

    • In the submenu Initial Load, the Initial Conditions defined above can be activated as needed.

    • In the submenu Analysis Options → Advanced Options,

      • the Large Strain procedure has to be set to Updated Lagrange,

      • Allow Switch To Total Lagrange should be disabled,

      • the Plasticity Procedure must be set to Multiplicative Decomposition.

    • In the submenu Run,

      • the path to the file DAMASK_Marc.f90 must be set via Fortran Source File.

      • In the Settings submenu,

        • For parallelization using DDM, the box Use DDM must be checked and Decomposition In Mentat and Multiple Input Files selected. The # Domains must be defined and the domains created accordingly via User Domains.

        • For crystal plasticity, Nonsymmetric is preferred in the Matrix Solver section. However, DDM only works with symmetric solvers.

  4. Thermal / Structural Jobs

    For thermo-mechanically coupled simulations, DAMASK also provides the flux subroutine defining heat generation by plastic dissipation. To activate its use, a volume flux boundary condition needs to be created via Boundary Conditions → New (Thermal) → Volume Flux:

    • The box Flux must be checked.

    • The Method must be set to User Sub. Flux.

    • This Boundary Condition must be assigned to the respective elements.

    Also, in the menu Jobs → Properties → Job Parameters, the option Heat Generation (Plasticity) has to be activated.

    Note

    The Boundary Condition has to be activated in Jobs → Properties → Initial Loads → Boundary Conditions.

    Finally, the thermal material properties have to be set via the menu Material Properties.

    In the properties of the previously defined material(s), Show Properties: Thermal must be selected. The Type must be set to Anisotropic and the thermal properties must be specified identically to the values provided in the thermal section in material.yaml.

    To include thermal expansion, Show Properties must be switched back to Structural and Thermal Expansion selected. In the popup dialog, the box Thermal Expansion must be ticked and the Isotropic Thermal Expansion Coefficient specified as given in material.yaml.

    Note

    DAMASK does not check for consistency with material.yaml, so it is the responsibility of the user to ensure that all thermal properties match; a sketch for extracting these values follows after this list.

  5. Restart

    DAMASK allows restarting a simulation; however, the only supported mode is resuming from the last converged increment.
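
As referenced in step 4, the thermal sections of material.yaml can be printed for manual transfer into Mentat. A minimal sketch using the damask Python library (which keys are present depends on the chosen models):

import damask

# Print the thermal properties of every phase for manual transfer
# into the Mentat material definition.
mat = damask.ConfigMaterial.load('material.yaml')
for name, phase in mat['phase'].items():
    print(name, phase.get('thermal'))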

A complete simulation setup (mechanics only) is available as Marc.tar.xz.

2. Running the Simulation#

In general, each solver is an executable file and can therefore be invoked directly from the command line. The specifics of how to start a simulation, such as command line arguments or how to parallelize, depend on the selected solver.

Grid Solver#

DAMASK_grid --load {load}.yaml --geom {grid}.vti --material {material}.yaml

Note

A pair of curly brackets is a placeholder for a file name.

Mandatory flags#

Launching the grid solver requires three mandatory flags:

  • --load {load}.yaml (short: -l) specifies the Load Case file.

  • --geom {grid}.vti (short: -g) specifies the Geometry file.

  • --material {material}.yaml (short: -m) specifies the Materialpoint Configuration file.

Optional flags#

To further adjust the simulation, the following optional flags can be used:

  • --numerics {numerics}.yaml (short: -n) configures the algorithms used in DAMASK as outlined in Numerics Configuration.

  • --jobname {jobname} (short: -j) sets the job name, which serves as the name of the Result (DADF5) file. The default jobname is {grid}_{load}_{material}.

  • --workingdirectory {directory} (short: -w) sets the working directory of the simulation. Any relative paths are interpreted as relative to the working directory. The default working directory is the current directory.

  • --restart N (short: -r) restarts a simulation at increment N from a snapshot file {jobname}_restart.hdf5, which needs to be present in the working directory. How frequently (if at all) snapshots are written is determined in the Load Case file.

  • --help (short: -h) reports the current version of the grid solver and provides usage instructions.

Parallelization#

To parallelize the grid solver with n_threads OpenMP threads, the environment variable OMP_NUM_THREADS needs to be set accordingly.

export OMP_NUM_THREADS={n_threads}
DAMASK_grid --load {load}.yaml --geom {grid}.vti --material {material}.yaml

Parallelization over n_proc MPI processes can be accomplished with

mpiexec -n {n_proc} DAMASK_grid --load {load}.yaml --geom {grid}.vti --material {material}.yaml

and will decompose the simulation domain along the z-axis into n_proc layers of approximately equal height.

Note

MPI and OpenMP parallelization can be used concurrently.

Mesh Solver#

DAMASK_mesh --load {load}.yaml --geom {mesh}.msh --material {material}.yaml

Note

A pair of curly brackets is a placeholder for a file name.

Mandatory flags#

Launching the mesh solver requires three mandatory flags:

  • --load {load}.yaml (short: -l) specifies the Load Case file.

  • --geom {mesh}.msh (short: -g) specifies the Geometry file.

  • --material {material}.yaml (short: -m) specifies the Materialpoint Configuration file.

Optional flags#

To further adjust the simulation, the following optional flags can be used:

  • --numerics {numerics}.yaml (short: -n) configures the algorithms used in DAMASK as outlined in Numerics Configuration.

  • --jobname {jobname} (short: -j) sets the job name, which serves as the name of the Result (DADF5) file. The default jobname is {mesh}_{load}_{material}.

  • --workingdirectory {directory} (short: -w) sets the working directory of the simulation. Any relative paths are interpreted as relative to the working directory. The default working directory is the current directory.

  • --help (short: -h) reports the current version of the mesh solver and provides usage instructions.

Parallelization#

To parallelize the mesh solver with n_threads OpenMP threads, the environment variable OMP_NUM_THREADS needs to be set accordingly.

export OMP_NUM_THREADS={n_threads}
DAMASK_mesh --load {load}.yaml --geom {mesh}.msh --material {material}.yaml

Parallelization over n_proc MPI processes can be accomplished with

mpiexec -n {n_proc} DAMASK_mesh --load {load}.yaml --geom {mesh}.msh --material {material}.yaml

and will decompose the simulation domain into partitions of approximately equal number of elements.

Note

MPI and OpenMP parallelization can be used concurrently.

MSC Marc#

DAMASK is integrated into the commercial FEM software MSC Marc as a user subroutine hypela2 via DAMASK_Marc.f90.

Simulations can be started from the Jobs → Run Damask menu that gets integrated into MSC Mentat during installation.

The mandatory material configuration needs to be available in the current working directory as material.yaml. The optional numerics configuration needs to be available in the current working directory as numerics.yaml.

Alternatively, the DAMASK Python library contains a small wrapper that assembles the execution call and launches MSC Marc.

import damask

# 'modelname', 'jobname', and 'num_domains' are placeholders for the
# Mentat model name, the job name, and the number of DDM domains.
s = damask.solver.Marc()
s.submit_job(model=modelname, job=jobname, domains=num_domains)

In this case the MSC Marc input file(s) also need to be available in the current working directory.

3. Post-Processing#

DAMASK results are stored in an HDF5-based file format. Usage examples are given in the video tutorials “Get a Custom View and Add Derived Quantities”, “Spatially-Resolved Visualization in Paraview”, and “Data Analysis: Using Matplotlib and Pandas” and Jupyter notebooks “Add Derived Field Data”, “Density Plot with Pandas”, “Plot Data per Grain with Scatter”, and “Calculate r-Value”.