POT3D for ISC23 SCC (Onsite)

Overview

Potential Magnetic Field Solution using POT3D

POT3D is a Fortran code that computes potential field solutions to approximate the solar coronal magnetic field using observed photospheric magnetic fields as a boundary condition. It can be used to generate potential field source surface (PFSS), potential field current sheet (PFCS), and open field (OF) models. It has been (and continues to be) used for numerous studies of coronal structure and dynamics. The code is highly parallelized using MPI and is GPU-accelerated using Fortran standard parallelism (do concurrent) and OpenACC, along with an option to use the NVIDIA cuSparse library. The HDF5 file format is used for input/output.
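
For reference, GPU-accelerated builds of codes that rely on do concurrent and OpenACC are typically compiled with NVIDIA's nvfortran compiler. The line below only illustrates the relevant flags and is not taken from the POT3D build scripts; the source file name and the cc80 compute-capability target are placeholders.

# Illustrative nvfortran invocation (file name and cc80 target are placeholders):
#   -stdpar=gpu offloads do concurrent loops, -acc=gpu enables OpenACC directives,
#   -cudalib=cusparse links the optional cuSPARSE path.
nvfortran -O3 -stdpar=gpu -acc=gpu -gpu=cc80 -Minfo=accel -cudalib=cusparse -o pot3d sources.f90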

Downloading and compiling POT3D

git clone https://github.com/predsci/POT3D

There are sample build scripts in the build_examples directory.

We start with build_examples/build_cpu_mpi-only_intel_ubuntu20.04.sh, which is intended for CPU-only runs.

  1. Copy and rename build_examples/build_cpu_mpi-only_intel_ubuntu20.04.sh to the POT3D package root directory:
    cp ./build_examples/build_cpu_mpi-only_intel_ubuntu20.04.sh ./rebuild.sh

  2. Build and/or load the HDF5 library.

  3. Modify the rebuild.sh script to point to the location of the HDF5 library.

  4. Execute the build script:
    ./rebuild.sh
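
Putting the four steps together, a typical session might look like the following; the module name is a site-specific placeholder, so load (or point the script at) whichever HDF5 build matches your compiler.

cd POT3D
cp ./build_examples/build_cpu_mpi-only_intel_ubuntu20.04.sh ./rebuild.sh   # step 1
module load hdf5/1.10.7-impi-intel                                         # step 2 (placeholder module name)
vi rebuild.sh                                                              # step 3: set HDF5_INCLUDE_DIR / HDF5_LIB_DIR
./rebuild.sh                                                               # step 4

If the build succeeds, the pot3d executable appears under bin/, which is where the run example below expects it.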

Sample build script:

The script uses environment modules to load MPI and HDF5.

# MPI=openmpi-4.1.2-intel2021.4.0
MPI=intelmpi-2021.7.0

if [[ "$MPI" =~ ^intel ]]; then
  module load hdf5/1.10.7-impi-intel
  export I_MPI_F90=ifort
else
  module load hdf5/1.10.7-ompi-intel
  export OMPI_MPIF90=ifort
fi

#################################################################
# Location of local hdf5 installed with same compiler being used for POT3D:
HDF5_INCLUDE_DIR="$HDF5_ROOT/include"
HDF5_LIB_DIR="$HDF5_ROOT/lib"

# Fortran HDF5 library flags (these can be version dependent):
HDF5_LIB_FLAGS="-lhdf5_fortran -lhdf5_hl_fortran -lhdf5 -lhdf5_hl"
...
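
After a successful build, a quick sanity check (assuming HDF5 is dynamically linked) is to confirm that the executable resolves the HDF5 libraries you loaded:

ldd bin/pot3d | grep hdf5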

Running an Example

The following commands will run a test case with NP MPI ranks and validate that the code is working.

NP=1
cd testsuite
POT3D_HOME=$PWD/..
TEST="small"

cp ${POT3D_HOME}/testsuite/${TEST}/input/* ${POT3D_HOME}/testsuite/${TEST}/run/
cd ${POT3D_HOME}/testsuite/${TEST}/run

echo "Running POT3D with $NP MPI rank..."
mpirun -np $NP ${POT3D_HOME}/bin/pot3d > pot3d.log
echo "Done!"

# Get runtime:
runtime=($(tail -n 5 timing.out | head -n 1))
echo "Wall clock time: ${runtime[6]} seconds"
echo " "

# Validate run:
${POT3D_HOME}/scripts/pot3d_validation.sh pot3d.out ${POT3D_HOME}/testsuite/${TEST}/validation/pot3d.out
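
Scaling the same test up only requires changing NP before re-running from the run directory; for example, a 4-rank run (process binding and host options are left to your MPI defaults) could look like:

NP=4
mpirun -np $NP ${POT3D_HOME}/bin/pot3d > pot3d.log
${POT3D_HOME}/scripts/pot3d_validation.sh pot3d.out ${POT3D_HOME}/testsuite/${TEST}/validation/pot3d.out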


Task and Submission

  1. Use the input under the testsuite/isc2023 folder.

  2. Run POT3D with isc2023 input.
    Note: using ifprec=2 with cuSparse is allowed.
    Your residual should be 1E-12 or smaller, for example:

    ### The CG solver has converged. Iteration: 25112 Residual: 9.972489313728662E-13
  3. Submission and Presentation:
    - Submit all your build scripts, run scripts, inputs, and output text files (pot3d.dat, pot3d.out, timing.out, etc.)
    - Do not submit the output HDF5 data or the source code.
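
As a rough sketch, and assuming the isc2023 case follows the same input/run directory layout as the small test above (with POT3D_HOME set as before; the rank count is a placeholder to tune for your machine):

TEST="isc2023"
NP=64                                   # placeholder rank count
cp ${POT3D_HOME}/testsuite/${TEST}/input/* ${POT3D_HOME}/testsuite/${TEST}/run/
cd ${POT3D_HOME}/testsuite/${TEST}/run
mpirun -np $NP ${POT3D_HOME}/bin/pot3d > pot3d.log
grep -i "converged" pot3d.log           # confirm the final residual is 1E-12 or smaller

Package rebuild.sh, your run scripts, pot3d.dat, pot3d.out, timing.out, and the log files for submission, and leave out the HDF5 output data and the source tree.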