PETSc

Installation

PETSc (v3.4, v3.5, v3.6, v3.7, v3.10) is installed on Thor. v3.4 has two current installations, one configured with Intel MPI and one with SGI MPT; v3.5, v3.6 and v3.7 are compiled with the MPT configuration.
Each installation provides all of the PETSc components (linear solvers, nonlinear solvers, ODE integration, ...) and adds different external packages for linear systems and I/O. The installation directory is /sw/lib/petsc/3.4/opt/ (and /sw/lib/petsc/<version>/ for the other versions), and a module is configured to set the environment correctly depending on the current MPI module, as illustrated below:

[log@thor petsc-3.7.2]$ module av lib/petsc

------------------------------------- /sw/Modules/modulefiles -------------------------------------
lib/petsc/3.4.3-opt lib/petsc/3.5.2-opt lib/petsc/3.6.3-opt lib/petsc/3.7.2-opt
 [log@thor petsc-3.7.2]$ module load lib/petsc/3.6.3-opt
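
Once a matching MPI module and the PETSc module are loaded, the environment used by the PETSc makefiles should be in place. A minimal sketch, assuming the modulefile exports PETSC_DIR (and PETSC_ARCH for builds used from their source tree), which is the usual convention for PETSc modulefiles:

    module load mpt/2.10
    module load lib/petsc/3.6.3-opt
    # directories picked up by the PETSc makefiles and by user makefiles
    echo "PETSC_DIR=$PETSC_DIR  PETSC_ARCH=$PETSC_ARCH"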

version 3.10

Available builds: debug and optimized with Intel MPI 18, and optimized with MPT 2.10.

opt with mpt/2.10

Compilers:
  C Compiler:         /opt/sgi/mpt/mpt-2.10/bin/mpicc  -fPIC  -wd1572
  C++ Compiler:       /opt/sgi/mpt/mpt-2.10/bin/mpicxx  -wd1572   -fPIC
  Fortran Compiler:   /opt/sgi/mpt/mpt-2.10/bin/mpif90  -fPIC
Linkers:
  Shared linker:   /opt/sgi/mpt/mpt-2.10/bin/mpicxx  -shared
  Dynamic linker:   /opt/sgi/mpt/mpt-2.10/bin/mpicxx  -shared
make:
BLAS/LAPACK: -Wl,-rpath,/opt/intel/2018/compilers_and_libraries_2018.2.199/linux/mkl/lib/intel64 -L/opt/intel/2018/compilers_and_libraries_2018.2.199/linux/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread
MPI:
cmake:
zlib:
  Includes: -I/sw/lib/petsc/3.10/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/opt/mpt/lib -L/sw/lib/petsc/3.10/opt/mpt/lib -lz
hdf5:
  Includes: -I/sw/lib/petsc/3.10/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/opt/mpt/lib -L/sw/lib/petsc/3.10/opt/mpt/lib -lhdf5_hl -lhdf5
netcdf:
  Includes: -I/sw/lib/petsc/3.10/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/opt/mpt/lib -L/sw/lib/petsc/3.10/opt/mpt/lib -lnetcdf
Chaco:
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/opt/mpt/lib -L/sw/lib/petsc/3.10/opt/mpt/lib -lchaco
metis:
  Includes: -I/sw/lib/petsc/3.10/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/opt/mpt/lib -L/sw/lib/petsc/3.10/opt/mpt/lib -lmetis
parmetis:
  Includes: -I/sw/lib/petsc/3.10/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/opt/mpt/lib -L/sw/lib/petsc/3.10/opt/mpt/lib -lparmetis
PTScotch:
  Includes: -I/sw/lib/petsc/3.10/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/opt/mpt/lib -L/sw/lib/petsc/3.10/opt/mpt/lib -lptesmumps -lptscotchparmetis -lptscotch -lptscotcherr -lesmumps -lscotch -lscotcherr
MUMPS:
  Includes: -I/sw/lib/petsc/3.10/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/opt/mpt/lib -L/sw/lib/petsc/3.10/opt/mpt/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord
scalapack:
  Library:  -Wl,-rpath,/opt/intel/2018/compilers_and_libraries_2018.2.199/linux/mkl/lib/intel64 -L/opt/intel/2018/compilers_and_libraries_2018.2.199/linux/mkl/lib/intel64 -Wl,--start-group -Wl,--end-group -lmkl_scalapack_lp64 -lmkl_intel_lp64 -lmkl_core -lmkl_sequential -lmkl_blacs_sgimpt_lp64 -lpthread -lm
SuiteSparse:
  Includes: -I/sw/lib/petsc/3.10/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/opt/mpt/lib -L/sw/lib/petsc/3.10/opt/mpt/lib -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig
X:
  Library:  -lX11
hypre:
  Includes: -I/sw/lib/petsc/3.10/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/opt/mpt/lib -L/sw/lib/petsc/3.10/opt/mpt/lib -lHYPRE
pthread:
ml:
  Includes: -I/sw/lib/petsc/3.10/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/opt/mpt/lib -L/sw/lib/petsc/3.10/opt/mpt/lib -lml
boost:
  Includes: -I/sw/lib/petsc/3.10/opt/mpt/include
  Arch:
fftw:
  Includes: -I/sw/lib/petsc/3.10/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/opt/mpt/lib -L/sw/lib/petsc/3.10/opt/mpt/lib -lfftw3_mpi -lfftw3
mkl_sparse:
mkl_sparse_optimize:
sundials:
  Includes: -I/sw/lib/petsc/3.10/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/opt/mpt/lib -L/sw/lib/petsc/3.10/opt/mpt/lib -lsundials_cvode -lsundials_nvecserial -lsundials_nvecparallel
tetgen:
  Includes: -I/sw/lib/petsc/3.10/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/opt/mpt/lib -L/sw/lib/petsc/3.10/opt/mpt/lib -ltet
PETSc:
  PETSC_ARCH: opt_mpt
  PETSC_DIR: /home/gueguenm/codes/petsc-3.10.5
  Scalar type: real
  Precision: double
  Clanguage: Cxx
  Integer size: 32
  shared libraries: enabled
  Memory alignment: 16
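
To link a user code against this optimized MPT build, the include and library paths listed above can be reused directly. A minimal sketch, assuming the PETSc library itself is installed under the same /sw/lib/petsc/3.10/opt/mpt prefix as the external packages, and my_solver.c is a hypothetical user source file:

    module load mpt/2.10
    # compile and link against the PETSc 3.10 opt/mpt installation
    mpicc -o my_solver my_solver.c \
          -I/sw/lib/petsc/3.10/opt/mpt/include \
          -L/sw/lib/petsc/3.10/opt/mpt/lib -Wl,-rpath,/sw/lib/petsc/3.10/opt/mpt/lib \
          -lpetsc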

debug intelmpi18

[gueguenm@thor petsc-3.10.5]$ module li
Currently Loaded Modulefiles:
  1) intel-cc-18/18.2.199      4) intel-tools-18/18.2.199   7) utils/cmake-3.16.0
  2) intel-fc-18/18.2.199      5) intel-mpi-18/18.2.199
  3) intel-cmkl-18/18.2.199    6) utils/conda
BOOST:
  Download           : Downloaded BOOST into /home/log/codes/petsc-3.10.5/dbg_intel/externalpackages/boost_1_61_0
  Install            : Installed BOOST into /sw/lib/petsc/3.10/dbg/impi
ZLIB:
  Download           : Downloaded ZLIB into /home/log/codes/petsc-3.10.5/dbg_intel/externalpackages/zlib-1.2.11
  Install            : Installed ZLIB into /sw/lib/petsc/3.10/dbg/impi
PTSCOTCH:
  Download           : Downloaded PTSCOTCH into /home/log/codes/petsc-3.10.5/dbg_intel/externalpackages/git.ptscotch
  Install            : Installed PTSCOTCH into /sw/lib/petsc/3.10/dbg/impi
METIS:
  Download           : Downloaded METIS into /home/log/codes/petsc-3.10.5/dbg_intel/externalpackages/git.metis
  Install            : Installed METIS into /sw/lib/petsc/3.10/dbg/impi
PARMETIS:
  Download           : Downloaded PARMETIS into /home/log/codes/petsc-3.10.5/dbg_intel/externalpackages/git.parmetis
  Install            : Installed PARMETIS into /sw/lib/petsc/3.10/dbg/impi
CHACO:
  Download           : Downloaded CHACO into /home/log/codes/petsc-3.10.5/dbg_intel/externalpackages/Chaco-2.2-p2
  Install            : Installed CHACO into /sw/lib/petsc/3.10/dbg/impi
HDF5:
  Download           : Downloaded HDF5 into /home/log/codes/petsc-3.10.5/dbg_intel/externalpackages/hdf5-1.8.18
  Install            : Installed HDF5 into /sw/lib/petsc/3.10/dbg/impi
NETCDF:
  Download           : Downloaded NETCDF into /home/log/codes/petsc-3.10.5/dbg_intel/externalpackages/netcdf-4.5.0
  Install            : Installed NETCDF into /sw/lib/petsc/3.10/dbg/impi
FFTW:
  Download           : Downloaded FFTW into /home/log/codes/petsc-3.10.5/dbg_intel/externalpackages/fftw-3.3.7
  Install            : Installed FFTW into /sw/lib/petsc/3.10/dbg/impi
MUMPS:
  Download           : Downloaded MUMPS into /home/log/codes/petsc-3.10.5/dbg_intel/externalpackages/git.mumps
  Install            : Installed MUMPS into /sw/lib/petsc/3.10/dbg/impi
PETSc:
  Build              : Set default architecture to dbg_intel in lib/petsc/conf/petscvariables
  File creation      : Created dbg_intel/lib/petsc/conf/reconfigure-dbg_intel.py for automatic reconfiguration
    Pushing language C
    Popping language C
    Pushing language Cxx
    Popping language Cxx
    Pushing language FC
    Popping language FC
Compilers:
  C Compiler:         /opt/intel/2018/impi/2018.2.199/bin64/mpiicc  -fPIC  -wd1572 "-xavx" 
  C++ Compiler:       /opt/intel/2018/impi/2018.2.199/bin64/mpiicpc  -wd1572 "-xavx"   -fPIC
  Fortran Compiler:   /opt/intel/2018/impi/2018.2.199/bin64/mpiifort  -fPIC "-xavx" 
Linkers:
  Shared linker:   /opt/intel/2018/impi/2018.2.199/bin64/mpiicc  -shared  -fPIC  -wd1572 "-xavx" 
  Dynamic linker:   /opt/intel/2018/impi/2018.2.199/bin64/mpiicc  -shared  -fPIC  -wd1572 "-xavx" 
make:
BLAS/LAPACK: -Wl,-rpath,/opt/intel/2018/compilers_and_libraries_2018.2.199/linux/mkl/lib/intel64 -L/opt/intel/2018/compilers_and_libraries_2018.2.199/linux/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread
MPI:
cmake:
zlib:
  Includes: -I/sw/lib/petsc/3.10/dbg/impi/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/dbg/impi/lib -L/sw/lib/petsc/3.10/dbg/impi/lib -lz
hdf5:
  Includes: -I/sw/lib/petsc/3.10/dbg/impi/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/dbg/impi/lib -L/sw/lib/petsc/3.10/dbg/impi/lib -lhdf5hl_fortran -lhdf5_fortran -lhdf5_hl -lhdf5
netcdf:
  Includes: -I/sw/lib/petsc/3.10/dbg/impi/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/dbg/impi/lib -L/sw/lib/petsc/3.10/dbg/impi/lib -lnetcdf
Chaco:
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/dbg/impi/lib -L/sw/lib/petsc/3.10/dbg/impi/lib -lchaco
metis:
  Includes: -I/sw/lib/petsc/3.10/dbg/impi/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/dbg/impi/lib -L/sw/lib/petsc/3.10/dbg/impi/lib -lmetis
parmetis:
  Includes: -I/sw/lib/petsc/3.10/dbg/impi/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/dbg/impi/lib -L/sw/lib/petsc/3.10/dbg/impi/lib -lparmetis
PTScotch:
  Includes: -I/sw/lib/petsc/3.10/dbg/impi/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/dbg/impi/lib -L/sw/lib/petsc/3.10/dbg/impi/lib -lptesmumps -lptscotchparmetis -lptscotch -lptscotcherr -lesmumps -lscotch -lscotcherr
MUMPS:
  Includes: -I/sw/lib/petsc/3.10/dbg/impi/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/dbg/impi/lib -L/sw/lib/petsc/3.10/dbg/impi/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord
scalapack:
  Library:  -Wl,-rpath,/opt/intel/2018/compilers_and_libraries_2018.2.199/linux/mkl/lib/intel64 -L/opt/intel/2018/compilers_and_libraries_2018.2.199/linux/mkl/lib/intel64 -Wl,--start-group -Wl,--end-group -lmkl_scalapack_lp64 -lmkl_intel_lp64 -lmkl_core -lmkl_avx2 -lmkl_blacs_intelmpi_lp64 -lpthread -lm
X:
  Library:  -lX11
pthread:
boost:
  Includes: -I/sw/lib/petsc/3.10/dbg/impi/include
  Arch:
fftw:
  Includes: -I/sw/lib/petsc/3.10/dbg/impi/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.10/dbg/impi/lib -L/sw/lib/petsc/3.10/dbg/impi/lib -lfftw3_mpi -lfftw3
mkl_sparse:
mkl_sparse_optimize:
PETSc:
  PETSC_ARCH: dbg_intel
  PETSC_DIR: /home/log/codes/petsc-3.10.5
  Scalar type: real
  Precision: double
  Clanguage: C
  Integer size: 32
  shared libraries: enabled
  Memory alignment: 16
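
The debug build is the natural choice during development, since extra argument checking is enabled and error messages carry full stack traces. A minimal sketch, assuming the library is installed under the /sw/lib/petsc/3.10/dbg/impi prefix used for the external packages above, and my_solver.c is a hypothetical user source file:

    module load intel-mpi-18/18.2.199
    mpiicc -o my_solver_dbg my_solver.c \
           -I/sw/lib/petsc/3.10/dbg/impi/include \
           -L/sw/lib/petsc/3.10/dbg/impi/lib -Wl,-rpath,/sw/lib/petsc/3.10/dbg/impi/lib \
           -lpetsc
    # -malloc_dump reports PETSc objects still allocated at PetscFinalize()
    mpirun -np 4 ./my_solver_dbg -malloc_dump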

version 3.7

Installation of PETSc version 3.7.2 with many additional components (icc/icpc/ifort):

Compilers:
  C Compiler:         /opt/sgi/mpt/mpt-2.10/bin/mpicc  -fPIC  -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden
  C++ Compiler:       /opt/sgi/mpt/mpt-2.10/bin/mpicxx  -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden   -fPIC
  Fortran Compiler:   /opt/sgi/mpt/mpt-2.10/bin/mpif90  -fPIC
Linkers:
  Shared linker:   /opt/sgi/mpt/mpt-2.10/bin/mpicxx  -shared
  Dynamic linker:   /opt/sgi/mpt/mpt-2.10/bin/mpicxx  -shared
MPI:
make:
BLAS/LAPACK: -Wl,-rpath,/opt/intel/composer_xe_2013_sp1.2.144/mkl/lib/intel64 -L/opt/intel/composer_xe_2013_sp1.2.144/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm
cmake:
  Arch:
metis:
  Includes: -I/sw/lib/petsc/3.7/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.7/opt/mpt/lib -L/sw/lib/petsc/3.7/opt/mpt/lib -lmetis
MOAB:
  Includes: -I/sw/lib/petsc/3.7/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.7/opt/mpt/lib -L/sw/lib/petsc/3.7/opt/mpt/lib -liMesh -lMOAB
hdf5:
  Includes: -I/sw/lib/petsc/3.7/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.7/opt/mpt/lib -L/sw/lib/petsc/3.7/opt/mpt/lib -lhdf5_hl -lhdf5
netcdf:
  Includes: -I/sw/lib/petsc/3.7/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.7/opt/mpt/lib -L/sw/lib/petsc/3.7/opt/mpt/lib -lnetcdf
PTScotch:
  Includes: -I/sw/lib/petsc/3.7/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.7/opt/mpt/lib -L/sw/lib/petsc/3.7/opt/mpt/lib -lptesmumps -lptscotch -lptscotcherr -lscotch -lscotcherr
SuiteSparse:
  Includes: -I/sw/lib/petsc/3.7/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.7/opt/mpt/lib -L/sw/lib/petsc/3.7/opt/mpt/lib -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig
X:
  Library:  -lX11
MUMPS:
  Includes: -I/sw/lib/petsc/3.7/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.7/opt/mpt/lib -L/sw/lib/petsc/3.7/opt/mpt/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord
parmetis:
  Includes: -I/sw/lib/petsc/3.7/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.7/opt/mpt/lib -L/sw/lib/petsc/3.7/opt/mpt/lib -lparmetis
scalapack:
  Library:  -Wl,-rpath,/opt/intel/composer_xe_2013_sp1.2.144/mkl/lib/intel64 -L/opt/intel/composer_xe_2013_sp1.2.144/mkl/lib/intel64 -lmkl_scalapack_lp64 -Wl,--start-group -lmkl_intel_lp64 -lmkl_core -lmkl_sequential -Wl,--end-group -lmkl_blacs_sgimpt_lp64 -lpthread -lm
SuperLU:
  Includes: -I/sw/lib/petsc/3.7/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.7/opt/mpt/lib -L/sw/lib/petsc/3.7/opt/mpt/lib -lsuperlu
SuperLU_DIST:
  Includes: -I/sw/lib/petsc/3.7/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.7/opt/mpt/lib -L/sw/lib/petsc/3.7/opt/mpt/lib -lsuperlu_dist
hypre:
  Includes: -I/sw/lib/petsc/3.7/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.7/opt/mpt/lib -L/sw/lib/petsc/3.7/opt/mpt/lib -lHYPRE
boost:
  Includes: -I/sw/lib/petsc/3.7/opt/mpt/include
pthread:
ssl:
  Library:  -lssl -lcrypto
tetgen:
  Includes: -I/sw/lib/petsc/3.7/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.7/opt/mpt/lib -L/sw/lib/petsc/3.7/opt/mpt/lib -ltet
sundials:
  Includes: -I/sw/lib/petsc/3.7/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.7/opt/mpt/lib -L/sw/lib/petsc/3.7/opt/mpt/lib -lsundials_cvode -lsundials_nvecserial -lsundials_nvecparallel
Chaco:
  Library:  -Wl,-rpath,/sw/lib/petsc/3.7/opt/mpt/lib -L/sw/lib/petsc/3.7/opt/mpt/lib -lchaco
fftw:
  Includes: -I/sw/lib/petsc/3.7/opt/mpt/include
  Library:  -Wl,-rpath,/sw/lib/petsc/3.7/opt/mpt/lib -L/sw/lib/petsc/3.7/opt/mpt/lib -lfftw3_mpi -lfftw3
PETSc:
  PETSC_ARCH: opt_mpt
  PETSC_DIR: /home/log/codes/petsc-3.7.2
  Scalar type: real
  Precision: double
  Clanguage: Cxx
  shared libraries: enabled
  Integer size: 32
  Memory alignment: 16

PETSC_VERSION_RELEASE    1
PETSC_VERSION_MAJOR      3
PETSC_VERSION_MINOR      7
PETSC_VERSION_SUBMINOR   2
PETSC_VERSION_PATCH      0
PETSC_VERSION_DATE       "Jun, 05, 2016" 
PETSC_VERSION_GIT        "v3.7.2" 
PETSC_VERSION_DATE_GIT   "2016-06-05 12:07:54 -0500" 
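
These macros come from petscversion.h, so the version provided by an installation can be checked directly from the header; a minimal sketch, assuming the header sits under the 3.7 opt/mpt include directory listed above:

    grep 'define PETSC_VERSION_' /sw/lib/petsc/3.7/opt/mpt/include/petscversion.h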

Test

solving a linear system

  • iterative solver example, in the PETSc source tree:
                cd $PETSC_DIR/src/ksp/ksp/examples/tutorials
                make ex2f #Description: Solves a linear system in parallel with KSP (Fortran code).
Here is the help text from the code:
                /* Program usage:  mpiexec -n <procs> ex2 [-help] [all PETSc options]*/  
                static char help[] = "Solves a linear system in parallel with KSP.\n\
                Input parameters include:\n\
                  -random_exact_sol : use a random exact solution vector\n\
                  -view_exact_sol   : write exact solution vector to stdout\n\
                  -m <mesh_x>       : number of mesh points in x-direction\n\
                  -n <mesh_n>       : number of mesh points in y-direction\n\n";

This example computes the matrix and right-hand-side vector that define the linear system Ax = b, and creates the parallel matrix by specifying only its global dimensions. When using MatCreate(), the matrix format can be specified at runtime, and the parallel partitioning of the matrix is also determined by PETSc at runtime.
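
Because the matrix type and the solver are chosen at runtime, the same executable can be exercised with different PETSc options without recompiling. A minimal interactive sketch with the MPT module loaded (-mat_type, -ksp_type, -pc_type and -ksp_monitor are standard PETSc runtime options; the process count and problem size are only illustrative):

    # MPI AIJ matrix, GMRES with Jacobi preconditioning, residual printed at each iteration
    mpiexec_mpt -n 4 ./ex2f -m 100 -n 100 -mat_type mpiaij -ksp_type gmres -pc_type jacobi -ksp_monitor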
  • PBS script example for PETSc/mpt:
    #!/bin/bash
    #PBS -N petsc_test 
    #PBS -l select=20:ncpus=20:mpiprocs=20
    #PBS -l place=scatter:excl
    #PBS -l walltime=04:00:00
    #PBS -j oe
    #PBS -m abe -M homer.simpson@springfield.fr
    
    module purge
    module load intel-tools-14/14.0.2.144
    module load mpt/2.10
    module load lib/petsc/3.4.3
    
    cd ${PBS_O_WORKDIR}
    # Print some job information in the output file
    NCPU=`wc -l < $PBS_NODEFILE`
    
    CODE=/home/homer/petsc-3.4.3/src/ksp/ksp/examples/tutorials/ex2f_mpt 
    #Run 
    echo "------------------" 
    date
    echo "MPI in use:"
    which mpiexec_mpt
    echo ------------------------------------------------------
    
    M=5000
    N=5000
    
    mpiexec_mpt -n 400 $CODE -m $M -n $N -log_summary log_mpt_n400_$PBS_JOBID.txt  > ex2f_mpt_n400.out.$PBS_JOBID 2>&1
    
  • PBS script example for PETSc/Intel MPI:
    #!/bin/bash
    #PBS -N petsc_test 
    #PBS -l select=20:ncpus=20:mpiprocs=20
    #PBS -l place=scatter:excl
    #PBS -l walltime=02:00:00
    #PBS -j oe
    
    module purge
    module load intel-tools-14/14.0.2.144
    module load intel-mpi-4/4.1.3.048
    module load lib/petsc/3.4.3
    source mpivars.sh
    
    cd ${PBS_O_WORKDIR}
    # Print some job information in the output file
    NCPU=`wc -l < $PBS_NODEFILE`
    
    CODE=/home/homer/petsc-3.4.3/src/ksp/ksp/examples/tutorials/ex2f_impi 
    M=5000
    N=5000
    
    mpirun -np ${NCPU} $CODE -m $M -n $N -log_summary log_impi_n400_$PBS_JOBID.txt  > ex2f_impi_n400.out.$PBS_JOBID 2>&1    
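
Either script is then submitted to the scheduler in the usual way; a minimal sketch, with run_petsc.pbs standing in for whichever of the two scripts above is saved to a file:

    qsub run_petsc.pbs       # submit the job
    qstat -u $USER           # check its state in the queue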
    
  • speed-up of the code as a function of the number of processes, for M=N=5000

Attached figure: petsc_comp_impl.png (speed-up as a function of the number of processes)