- Information contained in the HLRS Wiki is not legally binding and HLRS is not responsible for any damages that might result from its use -

MpiP

From HLRS Platforms

Latest revision as of 10:48, 5 July 2023

The mpiP library is a light-weight profiling library for MPI applications.

To record an MPI profile, simply run the program with the mpiP library preloaded.

Developer: LLNL
Platforms: Hazel Hen
Category: Performance Analyzer
License: BSD
Website: mpiP homepage


Note: this page is outdated.

Introduction

Usage

First load the related software module

module load mpip

which will set the '$HLRS_MPIP_ROOT' variable to the latest mpiP installation path.
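A quick sanity check after loading the module (optional; just confirms the variable is set in your shell):

```shell
module load mpip
# The module sets HLRS_MPIP_ROOT to the mpiP installation path;
# an empty line here means the module did not load as expected.
echo "$HLRS_MPIP_ROOT"
```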

The mpiP tool needs to be _attached_ to your application through the "LD_PRELOAD" mechanism. On Hawk it is in most cases sufficient to use the provided wrapper script:

mpirun $HLRS_MPIP_ROOT/../share/trace-mpiP.sh <your_app and options>

This will create one or two files with the file extension `.mpiP` in the working directory. These files contain a concise performance report (`*.1.mpiP`) and an extensive performance report (`*.2.mpiP`), respectively.
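The steps above can be combined into a batch-script sketch. This is an illustrative fragment, not a verified Hawk job script: the PBS directives, resource values, and the application name `./my_app` are placeholders you must adapt.

```shell
#!/bin/bash
#PBS -l select=2:mpiprocs=128   # placeholder resource request
#PBS -l walltime=00:20:00       # placeholder walltime

cd "$PBS_O_WORKDIR"

# Load mpiP; this sets $HLRS_MPIP_ROOT
module load mpip

# Run the application with mpiP attached via the provided wrapper.
# The *.mpiP report files appear in the working directory afterwards.
mpirun $HLRS_MPIP_ROOT/../share/trace-mpiP.sh ./my_app --my-options
```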

The behaviour of mpiP is controlled by options provided via the `MPIP` environment variable. See the mpiP documentation for a detailed description of all options. The wrapper on Hawk already sets sensible defaults, but you may override them by exporting your own configuration:

export MPIP="CONFIG HERE"; mpirun $HLRS_MPIP_ROOT/../share/trace-mpiP.sh <your_app and options>

or copy the wrapper script and adapt it to your needs if necessary.
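For illustration, a possible `MPIP` configuration might look like the following. The flags `-k`, `-t`, and `-f` are documented mpiP options; the values chosen here are arbitrary examples, and exporting your own `MPIP` replaces the wrapper's defaults entirely:

```shell
# -k 2   record call sites with a stack traceback depth of 2
# -t 5.0 only report call sites using at least 5% of MPI time
# -f .   write the report files to the current directory
export MPIP="-k 2 -t 5.0 -f ."

mpirun $HLRS_MPIP_ROOT/../share/trace-mpiP.sh ./my_app
```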

Hints

Fortran codes that use the MPI constants `MPI_IN_PLACE` and `MPI_BOTTOM` might experience unexpected behavior with mpiP (see https://github.com/LLNL/mpiP/issues/46).

See also

External Links

* mpiP homepage: https://software.llnl.gov/mpiP/