Gromacs Profiling using IPM
IPM profiler
Mellanox HPC-X comes bundled with the IPM profiler.
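Before setting anything up, you can verify that the IPM build shipped with HPC-X is present on your system. The commands below are a minimal check, assuming the HPC-X 2.6.0 layout used throughout this example:
% module load hpcx/2.6.0
% ls $HPCX_MPI_DIR/tests/ipm-2.0.2/lib/libipm.so
% ls $HPCX_MPI_DIR/tests/ipm-2.0.2/bin/ipm_parse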
Running Gromacs with the IPM profiler
module load intel/2019.5.281
module load mkl/2019.5.281
module load gcc/8.4.0
module load cmake/3.13.4
module load hpcx/2.6.0
export IPM_DIR=$HPCX_MPI_DIR/tests/ipm-2.0.2
export IPM_KEYFILE=$IPM_DIR/etc/ipm_key_mpi
export IPM_REPORT=full
export IPM_LOG=full
export IPM_STATS=all
export IPM_LOGWRITER=serial
export LD_PRELOAD=$IPM_DIR/lib/libipm.so
mpirun -np 1024 -x UCX_NET_DEVICES=mlx5_0:1 -bind-to core -report-bindings \
mdrun_mpi -v -s stmv.tpr -nsteps 10000 -noconfout -nb cpu -pin on
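When the job finishes, IPM prints its banner report to stdout and writes an XML log file (<filename>.ipm.xml, normally in the working directory). A quick sanity check that the profiler was actually preloaded, assuming the job's stdout was redirected to gromacs-ipm.out (a hypothetical file name):
% grep '^##IPM' gromacs-ipm.out
% ls -lh *.ipm.xml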
Parsing IPM profile output
Text report
##IPMv2.0.2########################################################
#
# command : /global/scratch/groups/hpcperf/ISC20/gromacs-2020.2/RUN/../install-hpcx-2.6.0/bin/mdrun_mpi -v -s stmv.tpr -nsteps 10000 -noconfout -nb cpu -pin on
# start : Tue Jun 09 09:02:52 2020 host : helios001.hpcadvisoryco
# stop : Tue Jun 09 09:04:01 2020 wallclock : 68.79
# mpi_tasks : 1024 on 32 nodes %comm : 31.03
# mem [GB] : 394.27 gflop/sec : 0.00
#
# : [total] <avg> min max
# wallclock : 63648.83 62.16 60.57 68.79
# MPI : 19751.87 19.29 13.53 32.31
# %wall :
# MPI : 31.05 21.65 48.22
# #calls :
# MPI : 160397554 156638 116285 422201
# mem [GB] : 394.27 0.39 0.38 0.49
#
# [time] [count] <%wall>
# MPI_Sendrecv 6064.13 99335163 9.53
# MPI_Waitall 5891.29 11667712 9.26
# MPI_Bcast 3580.22 1084448 5.62
# MPI_Recv 2387.46 11837651 3.75
# MPI_Alltoall 730.69 5120512 1.15
# MPI_Allreduce 374.87 47616 0.59
# MPI_Reduce 365.71 904704 0.57
# MPI_Comm_split 271.50 134272 0.43
# MPI_Irecv 30.67 9172224 0.05
# MPI_Scatter 20.37 896 0.03
# MPI_Isend 12.81 20709760 0.02
# MPI_Barrier 10.78 2048 0.02
# MPI_Scatterv 9.32 2688 0.01
# MPI_Gather 1.35 49904 0.00
# MPI_Scan 0.34 1024 0.00
# MPI_Comm_free 0.24 1984 0.00
# MPI_Send 0.13 300115 0.00
# MPI_Comm_size 0.00 10368 0.00
# MPI_Comm_rank 0.00 12417 0.00
# MPI_Init 0.00 1024 0.00
# MPI_Finalize 0.00 1024 0.00
#
###################################################################
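The %comm figure in the header is simply aggregate MPI time divided by aggregate wallclock (19751.87 / 63648.83 ≈ 31%). If the report was captured to a file, the per-routine table can also be summarized with standard tools; a minimal sketch, assuming the run's stdout was saved to gromacs-ipm.out (a hypothetical name):
% grep '^# MPI_' gromacs-ipm.out | sort -k3,3 -rn | head -5
This lists the five most time-consuming MPI routines, here MPI_Sendrecv through MPI_Alltoall.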
To create an HTML report:
The IPM profiler writes an XML log file named <filename>.ipm.xml.
Converting it to HTML requires ipm_parse together with ploticus (http://ploticus.sourceforge.net/doc/download.html).
% module load ploticus/2.42
% $HPCX_MPI_DIR/tests/ipm-2.0.2/bin/ipm_parse -html <filename>.ipm.xml
This creates a folder containing index.html.
Open it with a web browser.
Here is a sample HTML report from the 1024-core run.
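If the report is generated on a cluster node without a graphical browser, the folder can be copied to a local machine first, or served over HTTP and viewed remotely. A minimal sketch (folder and host names are placeholders):
% tar czf ipm_report.tar.gz <report_folder>
% scp ipm_report.tar.gz user@workstation:
% cd <report_folder> && python3 -m http.server 8000    # then browse to http://<node>:8000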