[CosmoMC] Version of MPI required

Use of Cobaya, CAMB, CLASS, CosmoMC, compilers, etc.
Mason Ng
Posts: 26
Joined: May 17 2017
Affiliation: The University of Auckland

[CosmoMC] Version of MPI required

Post by Mason Ng » July 29 2017

As written in a previous thread, I am trying to get the combination CosmoMC + CosmoChord to compile on the cluster (full disclosure: the CosmoChord files were unpacked into the CosmoMC folder, and I created the symbolic link for the Planck likelihood, before building CosmoMC).
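Roughly, the setup was along these lines (a sketch only, with placeholder paths and archive names; the actual link name and target are whatever the CosmoMC / Planck likelihood ReadMe specifies):

Code: Select all

        cd /path/to/cosmomc              # existing CosmoMC source tree (placeholder path)
        tar xvf cosmochord.tar.gz        # CosmoChord files unpacked over the CosmoMC folder
        ln -s /path/to/plc ./data/clik   # symbolic link pointing at the Planck likelihood installation (illustrative)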

I have been working actively with the cluster support staff, so far to no avail. One question they would like answered is which version of MPI is required. We have tried OpenMPI 2.0.2, 2.0.1 and 1.8.4 without success.

We have looked at the CosmoMC ReadMe; the cluster staff's comment is that "the information on which version of MPI to use is missing, and the different MPI versions present on the cluster do not seem to work". The available modules are:

Intel MPI:

Code: Select all

        impi/4.1.0.027
        impi/4.1.2.040
        impi/5.0.2.044-iccifort-2015.1.133-GCC-4.9.2
        impi/5.0.3.048-iccifort-2015.2.164-GCC-4.9.2
        impi/2017.1.132-GCC-5.4.0
        impi/2017.1.132-GCC-5.4.0-2.26
        impi/2017.1.132-iccifort-2017.1.132-GCC-5.4.0
OpenMPI:

Code: Select all

        OpenMPI/1.4.3-GCC-4.7.3
        OpenMPI/1.6.4-GCC-4.6.4
        OpenMPI/1.6.4-GCC-4.7.2
        OpenMPI/1.6.5-GCC-4.7.2
        OpenMPI/1.6.5-GCC-4.8.2
        OpenMPI/1.6.5-iccifort-2011.13.367
        OpenMPI/1.6.5-iccifort-2013.4.183
        OpenMPI/1.8.4-GCC-4.9.2
        OpenMPI/1.8.4-iccifort-2015.1.133-GCC-4.9.2-i8
        OpenMPI/2.0.1-GCC-5.4.0
        OpenMPI/2.0.2-GCC-6.3.0
There is also iimpi:

Code: Select all

        iimpi/7.2.3-GCC-4.9.2
        iimpi/7.2.5-GCC-4.9.2
        iimpi/2017a
and gimpi (just gimpi/2017a).
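For reference, this is roughly how we have been selecting one consistent compiler + MPI pairing from the lists above (a sketch using module names from the list; the exact set we load varies between attempts):

Code: Select all

        module purge                 # drop whatever the build node loads by default
        module load iimpi/2017a      # one consistent Intel compiler + Intel MPI toolchain from the list above
        module list                  # confirm nothing from another toolchain is still loaded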

This is what we get when compiling:

Code: Select all

[wng373@build-wm cosmomc]$ make
cd ./source && make cosmomc BUILD=MPI
make[1]: Entering directory `/gpfs1m/projects/uoa00518/Work/cosmomc/source'
mkdir -p ReleaseMPI
cd ../polychord; make all BUILD=MPI
make[2]: Entering directory `/gpfs1m/projects/uoa00518/Work/cosmomc/polychord'
mpif90 -mkl -openmp -O3 -no-prec-div -fpp -xHost -DMPI -DCLIK -assume noold_maxminloc -c utils.f90
ifort: command line remark #10411: option '-openmp' is deprecated and will be removed in a future release. Please use the replacement option '-qopenmp'
utils.f90(1025): error #7013: This module file was not generated by any release of this compiler.   [MPI]
        use mpi,only: MPI_Wtime
------------^
utils.f90(1031): error #6406: Conflicting attributes or multiple declaration of name.   [MPI_WTIME]
        time = MPI_Wtime()
---------------^
utils.f90(1025): error #6580: Name in only-list does not exist.   [MPI_WTIME]
        use mpi,only: MPI_Wtime
----------------------^
compilation aborted for utils.f90 (code 1)
make[2]: *** [utils.o] Error 1
make[2]: Leaving directory `/gpfs1m/projects/uoa00518/Work/cosmomc/polychord'
make[1]: *** [polychord] Error 2
make[1]: Leaving directory `/gpfs1m/projects/uoa00518/Work/cosmomc/source'
make: *** [cosmomc] Error 2
using the following modules:

Code: Select all

1) OpenBLAS/0.2.13-GCC-4.9.2-LAPACK-3.5.0  11) libpng/1.6.28-gimkl-2017a                  21) libxml2/2.9.4-gimkl-2017a      31) icc/2017.1.132-GCC-5.4.0 
  2) GCCcore/5.4.0                           12) freetype/2.7.1-gimkl-2017a                 22) libxslt/1.1.29-gimkl-2017a     32) ifort/2017.1.132-GCC-5.4.0 
  3) binutils/2.26-GCCcore-5.4.0             13) GEOS/3.6.1-gimkl-2017a                     23) LLVM/3.8.1-gimkl-2017a         33) iccifort/2017.1.132-GCC-5.4.0 
  4) GCC/5.4.0-2.26                          14) Szip/2.1-gimkl-2017a                       24) cURL/7.52.1-gimkl-2017a        34) impi/2017.1.132-iccifort-2017.1.132-GCC-5.4.0 
  5) gimpi/2017a                             15) HDF5/1.8.18-gimkl-2017a                    25) netCDF/4.4.1-gimkl-2017a       35) iimpi/2017a 
  6) gimkl/2017a                             16) libgpuarray/0.6.2-gimkl-2017a-CUDA-8.0.61  26) Tcl/8.6.6-gimkl-2017a          36) imkl/2017.1.132-iimpi-2017a 
  7) bzip2/1.0.6-gimkl-2017a                 17) libjpeg-turbo/1.5.1-gimkl-2017a            27) SQLite/3.16.2-gimkl-2017a      37) intel/2017a 
  8) CUDA/8.0.61                             18) ncurses/6.0-gimkl-2017a                    28) METIS/5.1.0-gimkl-2017a 
  9) FFTW/3.3.5-gimkl-2017a                  19) libreadline/6.3-gimkl-2017a                29) SuiteSparse/4.5.4-gimkl-2017a 
 10) zlib/1.2.11-gimkl-2017a                20) XZ/5.2.3-gimkl-2017a                       30) Python/2.7.13-gimkl-2017a
I think Intel MPI is used initially. We did try a few versions of OpenMPI later on, but with no resolution.
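For what it's worth, error #7013 above suggests that the mpi module file mpif90 picks up was generated by a different Fortran compiler from the one doing the compiling. This is how we have been checking which compiler the wrapper actually invokes (the -show flag is Intel MPI's convention; OpenMPI's wrappers use --showme, as far as I know):

Code: Select all

        which mpif90        # which MPI wrapper is first on PATH with the modules above loaded
        mpif90 -show        # Intel MPI: print the underlying compiler plus the include/lib paths it adds
        mpif90 --version    # version banner of the Fortran compiler the wrapper forwards to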

Now, if one goes into the Makefile in the cosmomc folder and edits the line from

Code: Select all

 cosmomc: BUILD ?= MPI 
to

Code: Select all

 cosmomc: BUILD ?= 


CosmoMC (plus CosmoChord) seems to build, although I get the warning

Code: Select all

 ifort: command line remark #10411: option '-openmp' is deprecated and will be removed in a future release. Please use the replacement option '-qopenmp' 
We suspect it is not a good idea to change that line in this way (though we are also not sure exactly what it does). Any suggestions, please?
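As far as we can tell, clearing BUILD there just produces a serial (non-MPI) build, so the edit sidesteps MPI rather than fixing it. The same effect can presumably be had without editing the file by overriding the variable on the make command line (untested sketch):

Code: Select all

        make cosmomc BUILD=        # serial build, no -DMPI -- what the edit above effectively does
        make cosmomc BUILD=MPI     # the default MPI build we are actually trying to get working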

Also, I tried a test run with test.ini, and I get this output:

Code: Select all

[wng373@build-wm cosmomc]$ mpirun -np 1 ./cosmomc test.ini
 file_root:test
 NOTE: num_massive_neutrinos ignored, using specified hierarchy
 Random seeds: 30685, 11330 rand_inst:   0
 Doing non-linear Pk: F
 Doing CMB lensing: T
 Doing non-linear lensing: F
 TT lmax =  2500
 EE lmax =  2500
 ET lmax =  2500
 BB lmax =  2000
 lmax_computed_cl  =  2500
 Computing tensors: F
 max_eta_k         =    6625.000    
 transfer kmax     =    1.000000    
 adding parameters for: BKPlanck_detset_comb_dust
 Fast divided into            1  blocks
  8 parameters ( 6 slow ( 2 semi-slow),  2 fast ( 0 semi-fast))
 skipped unused params: calPlanck acib217 xi asz143 aps100 aps143 aps143217 aps217 aksz kgal100 kgal143 kgal143217 kgal217 cal0 cal2
 starting nested sampling

PolyChord: Next Generation Nested Sampling
copyright: Will Handley, Mike Hobson & Anthony Lasenby
  version: 1.9
  release: 11th April 2016
    email: wh260@mrao.cam.ac.uk

Run Settings
nlive    :     500
nDims    :       8
nDerived :      44
Doing Clustering
Generating equally weighted posteriors
Generating weighted posteriors
Clustering on posteriors
Writing a resume file tochains//test.resume
Sub clustering on    0 dimensions
 

generating live points


all live points generated

Speed  1 =  0.409E+01 seconds
Speed  2 =  0.106E+00 seconds     
Speed  3 =  0.102E-02 seconds     
number of repeats:            8          82        2142
started sampling


===================================================================================
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
=   PID 3168 RUNNING AT build-wm
=   EXIT CODE: 9
=   CLEANING UP REMAINING PROCESSES
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================

===================================================================================
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
=   PID 3168 RUNNING AT build-wm
=   EXIT CODE: 9
=   CLEANING UP REMAINING PROCESSES
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
   Intel(R) MPI Library troubleshooting guide:
      https://software.intel.com/node/561764
===================================================================================
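In case it is relevant, this was launched directly on the build node as shown above; Intel MPI's I_MPI_DEBUG environment variable can reportedly be set to get more startup detail (library version, fabric, process pinning), e.g.:

Code: Select all

        export I_MPI_DEBUG=5
        mpirun -np 1 ./cosmomc test.ini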
