The MAST source code includes a large number of demonstration problems covering conduction, structural, thermoelastic, and fluid analyses, as well as fluid-structure interaction and optimization problems. A detailed listing of these example problems is provided here.

# Running examples

Following compilation of the `example_driver` executable, these examples are run from the command line by specifying a `run_case` argument to the executable. Running the executable without any options lists all the available examples.
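For instance, the list of supported `run_case` names can be printed by invoking the driver with no arguments (the exact output depends on which examples were enabled in your build):

```
$ ./example_driver
```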

## Simple execution

A typical execution is:

```
$ mpirun -np <N> example_driver run_case=plate_bending -ksp_type preonly -pc_type lu
```

Here, `<N>` is replaced with the number of parallel MPI processes to be launched for this run case.

MAST uses the command-line options to initialize itself and libMesh, which in turn passes the options on to SLEPc and PETSc. The eigensolver and the nonlinear and linear solvers can therefore be configured entirely through command-line options. In the example above, a direct LU-decomposition solver is specified for all linear solves. Note that the native direct solver implementation in PETSc only supports single-processor runs; however, through interfaces to packages such as MUMPS and SuperLU_dist, PETSc also provides distributed-memory direct solvers. If PETSc is configured with MUMPS, it will automatically use this solver for `N > 1` in the above example. Additional solver options can be found in the PETSc user manual.
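As a sketch of the automatic selection described above, the distributed-memory direct solver can also be requested explicitly (assuming PETSc was configured with MUMPS support):

```
$ mpirun -np 4 example_driver run_case=plate_bending \
    -ksp_type preonly -pc_type lu \
    -pc_factor_mat_solver_package mumps
```

The flag `-pc_factor_mat_solver_package` matches the option used elsewhere in this document; newer PETSc releases rename it `-pc_factor_mat_solver_type`.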

## Complex solver setup

For multiphysics solutions, the solver configuration can be specified per discipline using the option `--solver_system_names`, which instructs MAST and libMesh to prepend the name of the corresponding `libMesh::System` to each solver object. PETSc and SLEPc then associate a command-line option with a solver only if the solver name is included in the option. For example, the following command specifies different configurations for the structural (`structural_`) and fluid (`fluid_complex_`) solvers:

```
$ mpirun -np 20 example_driver --solver_system_names \
    run_case=beam_fsi_flutter_analysis \
    -structural_st_pc_factor_mat_solver_package mumps \
    -structural_st_ksp_type preonly \
    -fluid_complex_ksp_type gmres -fluid_complex_pc_type asm \
    -fluid_complex_pc_asm_overlap 1 \
    -fluid_complex_sub_pc_type ilu \
    -fluid_complex_sub_pc_factor_levels 2 \
    -fluid_complex_ksp_gmres_restart 100 \
    -fluid_complex_ksp_atol 1.e-4 -fluid_complex_ksp_rtol 1.e-8
```

This will launch the two-dimensional panel flutter example on 20 CPUs with a direct LU-decomposition solver for the structural eigenproblem, and a GMRES solver with a level-1 overlap additive Schwarz (ASM) preconditioner for the complex fluid system of equations, using an ILU(2) preconditioner for the local solves. Absolute and relative convergence tolerances of \(10^{-4}\) and \(10^{-8}\), respectively, are also specified, and a GMRES restart after 100 iterations is requested.
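Since every option carries the solver-name prefix, standard PETSc monitoring flags can be scoped to a single discipline in the same way. A minimal sketch, using the PETSc monitoring options `-ksp_monitor` and `-ksp_converged_reason`:

```
$ mpirun -np 20 example_driver --solver_system_names \
    run_case=beam_fsi_flutter_analysis \
    -fluid_complex_ksp_monitor \
    -fluid_complex_ksp_converged_reason
```

Here `-ksp_monitor` prints the residual norm at each Krylov iteration and `-ksp_converged_reason` reports why each solve terminated, both restricted to the fluid solver by the `fluid_complex_` prefix.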