HPC Monte Carlo – OpenMC

Disclaimer: I am not an expert! What follows worked for me, but use it with care, as it may require customization for your specific purposes.

Below are the instructions for creating a Singularity/Apptainer container to run the Monte Carlo code OpenMC on an HPC cluster.

Create a file 'debian_openmc_mpich-sl.def' containing the following:

BootStrap: docker
From: debian:12
%environment
    # Point to MPI binaries, libraries, man pages
    export MPI_DIR=/opt/mpi
    export MPI_ROOT=/opt/mpi
    export PATH="$MPI_DIR/bin:$PATH"
    export LD_LIBRARY_PATH="$MPI_DIR/lib:$LD_LIBRARY_PATH"
    export MANPATH="$MPI_DIR/share/man:$MANPATH"
    export OPENMC_CROSS_SECTIONS=/your-data-library-path/cross_sections.xml
%post
    # %environment is only sourced at runtime, so set this here for apt
    export DEBIAN_FRONTEND=noninteractive
    apt-get update -y
    apt-get install wget git bash gcc gfortran g++ make file bzip2 -y
    apt-get install dpkg-dev cmake libc6 binutils libx11-dev libxpm-dev \
        libxft-dev libxext-dev python3 python3-pip python3-dev python3-numpy \
        libssl-dev libgsl-dev libtiff-dev libhdf5-dev -y
    echo "Installing MPI"
    export MPI_DIR=/opt/mpi
    export MPI_ROOT=/opt/mpi
    export MPI_VERSION=3.3.1
    export MPI_URL="https://www.mpich.org/static/downloads/$MPI_VERSION/mpich-$MPI_VERSION.tar.gz"
    mkdir -p /tmp/mpi
    mkdir -p /opt
    # Download
    cd /tmp/mpi && wget -O mpich-$MPI_VERSION.tar.gz $MPI_URL && tar -xzf mpich-$MPI_VERSION.tar.gz
    # Compile and install
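    # gfortran 10+ treats the Fortran argument mismatches in MPICH 3.3.x
    # as errors; -fallow-argument-mismatch downgrades them to warnings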
    export FFLAGS="-w -fallow-argument-mismatch -O2"
    cd /tmp/mpi/mpich-$MPI_VERSION && ./configure --prefix=$MPI_DIR --enable-shared && make -j$(nproc) install
    # Set env variables so we can compile our application
    export PATH=$MPI_DIR/bin:$PATH
    export LD_LIBRARY_PATH=$MPI_DIR/lib:$LD_LIBRARY_PATH
    cd
    echo "Installing OpenMC"
    git clone --recurse-submodules https://github.com/openmc-dev/openmc.git
    cd openmc
    git checkout master
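    # master may require a newer Python than Debian 12 ships (see Remarks below);
    # if the build fails, pin a release tag here instead, e.g. git checkout v0.14.0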
    mkdir build && cd build
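    # OPENMC_USE_MPI turns on the MPI build; HDF5_PREFER_PARALLEL makes CMake
    # pick a parallel HDF5 if its find module detects one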
    cmake -DHDF5_PREFER_PARALLEL=on -DOPENMC_USE_MPI=on ..
    make
    make install
    cd ..
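    # Debian 12 marks the system Python as externally managed (PEP 668),
    # hence the --break-system-packages flag below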
    python3 -m pip install mpi4py --break-system-packages
    python3 -m pip install . --break-system-packages
%labels
    Author Valerio Giusti
    Version v0.1.1
%help
    Debian 12 container to run OpenMC on HPC.

Shell command to build the container (root privileges or the --fakeroot option are typically required):

singularity build debian_openmc_mpich-sl.sif debian_openmc_mpich-sl.def
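
Before moving to the cluster, a quick smoke test of the image is worthwhile. The two commands below (a minimal check of my own, using the .sif file just built) confirm that the OpenMC executable and the Python bindings, including mpi4py, are visible inside the container:

singularity exec debian_openmc_mpich-sl.sif openmc --version
singularity exec debian_openmc_mpich-sl.sif python3 -c "import openmc, mpi4py"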

Example PBS script to run criticality or burnup calculations (uncomment the relevant command):

#!/usr/bin/env bash
#PBS -N output 
#PBS -j oe
#PBS -l select=4:ncpus=40:mpiprocs=1
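# 4 nodes, one MPI rank per node; OpenMC parallelizes over the remaining
# cores of each node with OpenMP threads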

module load gnu8/8.3.0 mpich/3.3.1 singularity

# Uncomment next command for criticality calculations
# mpirun singularity exec --bind /your-data-library-path/ \
#        debian_openmc_mpich-sl.sif openmc

# Uncomment next command for burnup calculations
# mpirun singularity exec --bind /your-data-library-path/ \
#        debian_openmc_mpich-sl.sif python3 -m mpi4py python-input-script.py
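
Assuming the script above is saved as, say, job.pbs (the file name is arbitrary), submit it in the usual way:

qsub job.pbs

Note that the mpich/3.3.1 module loaded on the host matches the MPICH version built inside the container; this ABI match is what lets the host mpirun drive ranks inside the image. A minimal sanity test of that hybrid launch (the rank count here is illustrative):

mpirun -np 4 singularity exec debian_openmc_mpich-sl.sif \
       python3 -c "from mpi4py import MPI; print(MPI.COMM_WORLD.Get_rank())"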

Remarks

  • Versions of OpenMC released after the third quarter of 2025 will require a Python version newer than 3.11. Since Debian 12 ships Python 3.11, a Debian release newer than 12 (e.g. Debian 13) will then be necessary.
  • mpi4py: “On Ubuntu/Debian systems, the mpi4py package uses Open MPI. To use MPICH, install the libmpich-dev and python3-dev packages (and any other required development tools). Afterward, install mpi4py from sources using pip.” It is not necessary to install libmpich-dev if, as in our case, MPICH is compiled from source (a quick verification is shown below).
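
To check which MPI library mpi4py was actually built against, mpi4py.get_config() reports the compiler wrappers recorded at build time; with the container above it should point at the MPICH wrappers under /opt/mpi:

singularity exec debian_openmc_mpich-sl.sif python3 -c "import mpi4py; print(mpi4py.get_config())"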