- Information in the HLRS Wiki is not legally binding and is provided without guarantee -

CRAY XC40 local Modules

From HLRS Platforms

Third party tools


The module gnu-tools collects more recent versions of basic tools, including the GNU build system (autoconf, automake, libtool, m4), as well as bash, cmake, gperf, git, gawk, swig, bison, and gnuplot. The installed versions can be listed using

% module whatis tools/gnu-tools

To use this version of bash with full support of the module environment, simply call

% bash -l myScript.sh

or define the absolute path in the first line of your script

#!/opt/hlrs/tools/gnu-tools/generic/bin/bash -l


GNU Octave is a high-level interpreted language, primarily intended for numerical computations. It provides capabilities for the numerical solution of linear and nonlinear problems, and for performing other numerical experiments. It also provides extensive graphics capabilities for data visualization and manipulation. GNU Octave is normally used through its interactive interface (CLI and GUI), but it can also be used to write non-interactive programs. The GNU Octave language is quite similar to MATLAB, so most programs are easily portable.

Octave is compiled to run on the compute nodes and can be launched e.g. in an interactive session:

% qsub -I [options]
% module load tools/octave 
% aprun -n 1 -N 1 octave octave.script


The modules cray-python/* provide basic Python installations including the packages numpy, scipy, mpi4py, and dask.
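As a quick check that these packages work, a minimal sketch (assuming a cray-python module is loaded, so numpy and scipy are importable):

```python
# Minimal sketch: exercise numpy and scipy from a cray-python installation.
import numpy as np
from scipy import linalg

# solve the small linear system a @ x = b
a = np.array([[2.0, 0.0],
              [0.0, 3.0]])
b = np.array([4.0, 9.0])
x = linalg.solve(a, b)
print(x)  # [2. 3.]
```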

More complete site-specific Python installations are provided by the modules python-site/2.7 and python-site/3.6. These installations build on top of cray-python and additionally provide Python packages such as h5py.

Finally, software in other modules, for instance tools/vtk, may include Python wrappers, which become available when loading the respective modules. Note that in some cases Python wrappers are available for either python2 or python3 only, not both.

At any time, the list of currently usable Python packages can be obtained with the command

pip freeze

Note that this list will change as you load and unload modules.

Also, users may install additional Python packages in their own home directory using pip through an SSH tunnel.


SLEPc (Scalable Library for Eigenvalue Problem Computations) is an extension of PETSc for solving linear eigenvalue problems in either standard or generalized form. Furthermore, SLEPc can compute a partial SVD of a large, sparse, rectangular matrix and solve nonlinear eigenvalue problems (polynomial or general). Additionally, it provides solvers for computing the action of a matrix function on a vector. To use SLEPc, please load the modules:

  module load numlib/hlrs_SLEPc  


  module load cray-petsc-64/

Compile by

  CC -I/opt/hlrs/numlib/slepc/3.7.4/include -L/opt/hlrs/numlib/slepc/3.7.4/lib simple.c++ -lslepc
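To illustrate the kind of problem SLEPc addresses, here is a small-scale sketch of a standard sparse eigenproblem A x = λ x, using scipy rather than SLEPc itself (SLEPc targets the same computation for much larger matrices in parallel):

```python
# Illustration only: a sparse standard eigenproblem solved with scipy,
# conceptually similar to what SLEPc's eigensolvers do at scale.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# 1-D Laplacian: sparse tridiagonal matrix with 2 on the diagonal
# and -1 on the off-diagonals
n = 100
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")

# compute the three eigenvalues closest to zero via shift-invert
vals = spla.eigsh(A, k=3, sigma=0, which="LM", return_eigenvectors=False)
print(np.sort(vals))
```

The analytic eigenvalues of this matrix are 4 sin²(kπ/(2(n+1))), which makes the result easy to verify.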


In order to use Git, load the corresponding module

module load tools/git

Because internet access is restricted within HWW systems, you have to use an SSH tunnel to access remote repositories.

Utilities for processing netcdf files

The module tools/netcdf_utils contains the following tools:

Third party scientific software


CP2K is a freely available (GPL) program to perform atomistic and molecular simulations of solid-state, liquid, molecular, and biological systems. It provides a general framework for different methods, such as density functional theory (DFT) using a mixed Gaussian and plane waves approach (GPW), and classical pair and many-body potentials. It is written in well-structured, standards-conforming Fortran 95 and parallelized with MPI, in parts optionally with hybrid MPI+OpenMP.

CP2K provides state-of-the-art methods for efficient and accurate atomistic simulations; its sources are freely available and actively improved. It has an active international development team, with its unofficial headquarters at the University of Zürich.

The molecular simulation package is installed and optimized for the present architecture, compiled with gfortran using optimized versions of libxc, libint, and libsmm.

 module load chem/cp2k 

provides four binaries with different kinds of parallelization:

 cp2k.ssmp  - only OpenMP
 cp2k.popt  - only MPI 
 cp2k.psmp  - hybrid MPI + OpenMP
 cp2k.pdbg  - only MPI compiled with debug flags

After loading the related module (chem/cp2k), the binary can be directly called in the job submission script, e.g.:

aprun -n 24 -N 24 cp2k.psmp myCp2kInputFile.inp > myOutput.out

Some examples of CP2K input files are provided on the CP2K homepage, where the input reference is also available.
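For orientation, a CP2K input file (such as myCp2kInputFile.inp above) is organized into nested sections. The following is only a structural sketch, not a runnable calculation; the project name and file names are placeholders:

```
&GLOBAL
  PROJECT myProject          ! hypothetical project name
  RUN_TYPE ENERGY            ! single-point energy calculation
  PRINT_LEVEL LOW
&END GLOBAL
&FORCE_EVAL
  METHOD QS                  ! Quickstep: DFT with the GPW approach
  &DFT
    BASIS_SET_FILE_NAME BASIS_MOLOPT
    POTENTIAL_FILE_NAME POTENTIAL
  &END DFT
  &SUBSYS
    ! cell, coordinates, and atomic kinds go here
  &END SUBSYS
&END FORCE_EVAL
```

The full list of sections and keywords is given in the CP2K input reference mentioned above.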


GROMACS (GROningen MAchine for Chemical Simulations) is a molecular dynamics package which can be loaded with

 module load chem/gromacs 


OpenFOAM (Open Field Operation and Manipulation) is an open-source CFD software package. Multiple versions of OpenFOAM are available, compiled with the GNU and Intel compilers. The available versions can be listed using

 module avail cae/openfoam 

OpenFOAM can be used with PrgEnv-gnu and PrgEnv-intel, e.g.

$> module swap PrgEnv-cray PrgEnv-gnu
$> module load cae/openfoam

Furthermore, Foam-extend is available, but only with PrgEnv-gnu:

$> module swap PrgEnv-cray PrgEnv-gnu
$> module load cae/openfoam/3.0-extend

Build your own OpenFOAM solver

You can build your own solver using the provided OpenFOAM environment. Environment variables such as $FOAM_USER_APPBIN provide the infrastructure to compile your solver against OpenFOAM and give fast access to the result. For technical reasons, these variables point to your home directory ($HOME/OpenFoam). Please run your application on the Lustre file system nevertheless.

An example how to build and use an own solver:

$> mkdir -p ~/OpenFoam/test; cd !$                                    # create a directory for the new solver and change into it
$> cp -r $FOAM_SOLVERS/incompressible/pisoFoam testsolver; cd !$      # copy an existing solver directory and change into it
$> mv pisoFoam.C myFoam.C; vi !$                                      # rename the solver and perform your modifications
$> vi Make/files                                                      # change file and target names in this file; installing into the $(FOAM_USER_APPBIN) directory is suggested
$> wclean; wmake                                                      # build the solver; for extensive builds, please use a compute node
$> vi job.pbs                                                         # use the new binary "$(FOAM_USER_APPBIN)/myFoam" in your batch script
$> qsub job.pbs                                                       # submit the job

This way you build the solver in your home directory and reference it via $(FOAM_USER_APPBIN)/... when starting your application in your workspace.
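The Make/files edited in the steps above could, for the renamed solver, look like the following sketch (the source and target names match the myFoam.C example used here):

```
myFoam.C

EXE = $(FOAM_USER_APPBIN)/myFoam
```

The EXE line places the resulting binary into $FOAM_USER_APPBIN, so it can be referenced from the batch script as shown above.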

Starting with version 1706+, you need to build your solvers on a compute node, either via a batch script or in an interactive session, e.g. using:

$> qsub -I -l nodes=1 [-l walltime=03:00:00]
<load the OpenFoam modules>
$> aprun -n 1 wmake

Profile OpenFOAM

It is also possible to use CrayPAT profiling for certain versions of OpenFOAM. For this purpose, specialized modules cae/openfoam/xxx-perftools exist, where xxx is a version number. The related binaries still have to be instrumented using

$> pat_build -S -f -Dlink-instr=-L$FOAM_LIBBIN/cray_mpich,-lPstream $FOAM_APPBIN/icoFoam

As a result, a binary icoFoam+pat is generated in the current directory. Using this binary in the batch script, the profiling will be performed. To analyze the resulting profiling data, pat_report and further tools can be used (Cray Performance Tools). Furthermore, you need to change your batch script: modify the application name and add

export PAT_RT_DSO_MAX=1024
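Putting the pieces together, a modified batch script could look like the following sketch; job name, node count, and walltime are placeholders, and the case is assumed to be decomposed for a parallel run:

```
#!/bin/bash
#PBS -N icoFoam_pat
#PBS -l nodes=1
#PBS -l walltime=00:30:00

cd $PBS_O_WORKDIR

# raise the shared-object limit for CrayPAT, as required above
export PAT_RT_DSO_MAX=1024

# run the instrumented binary instead of the original icoFoam
aprun -n 24 -N 24 ./icoFoam+pat -parallel > myOutput.out 2>&1
```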