Here are the results of the test run for the code that was compiled earlier, followed by the results for the exact same version of the code, with the same param file, compiled later:

egould@compute:~/CosmoMC/cosmomc$ mpirun -np 1 ./cosmomcCDM test_planck.ini

--------------------------------------------------------------------------
[[42207,1],0]: A high-performance Open MPI point-to-point messaging module
was unable to find any relevant network interfaces:

Module: OpenFabrics (openib)
  Host: compute

Another transport will be used instead, although this may result in
lower performance.
--------------------------------------------------------------------------
Number of MPI processes: 1
 file_root:test
 Random seeds: 13894, 17063 rand_inst: 1
Using clik with likelihood file
./data/clik/hi_l/plik/plik_dx11dr2_HM_v18_TT.clik
----
clik version 6dc2a8cf3965
  smica
Checking likelihood './data/clik/hi_l/plik/plik_dx11dr2_HM_v18_TT.clik' on test data. got -380.979 expected -380.979 (diff -8.68545e-09)
----
TT from l=0 to l= 2508
Clik will run with the following nuisance parameters:
A_cib_217
cib_index
xi_sz_cib
A_sz
ps_A_100_100
ps_A_143_143
ps_A_143_217
ps_A_217_217
ksz_norm
gal545_A_100
gal545_A_143
gal545_A_143_217
gal545_A_217
calib_100T
calib_217T
A_planck
Using clik with likelihood file
./data/clik/low_l/bflike/lowl_SMW_70_dx11d_2014_10_03_v5c_Ap.clik
BFLike Ntemp = 2876
BFLike Nq = 1407
BFLike Nu = 1407
BFLike Nside = 16
BFLike Nwrite = 32393560
cls file appears to have 5+ columns
assuming it is a CAMB file with l, TT, EE, BB, TE
info = 0
----
clik version 6dc2a8cf3965
bflike_smw
Checking likelihood './data/clik/low_l/bflike/lowl_SMW_70_dx11d_2014_10_03_v5c_Ap.clik' on test data. got -5247.87 expected -5247.87 (diff 3.93804e-07)
----
TT from l=0 to l= 29
EE from l=0 to l= 29
BB from l=0 to l= 29
TE from l=0 to l= 29
Clik will run with the following nuisance parameters:
A_planck
Doing non-linear Pk: F
Doing CMB lensing: T
Doing non-linear lensing: T
TT lmax = 2508
EE lmax = 2500
ET lmax = 2500
BB lmax = 2500
PP lmax = 2500
lmax_computed_cl = 2508
Computing tensors: F
max_eta_k = 14000.00
transfer kmax = 5.000000
adding parameters for: lowl_SMW_70_dx11d_2014_10_03_v5c_Ap
adding parameters for: smica_g30_ftl_full_pp
adding parameters for: BKPlanck_detset_comb_dust
adding parameters for: plik_dx11dr2_HM_v18_TT
Fast divided into 1 blocks
23 parameters ( 9 slow ( 0 semi-slow), 14 fast ( 0 semi-fast))
Time for theory: 2.68312
Time for lowl_SMW_70_dx11d_2014_10_03_v5c_Ap: 1.16145515441895
Time for smica_g30_ftl_full_pp: 8.458137512207031E-003
Time for BKPlanck_detset_comb_dust: 1.250982284545898E-003
Time for plik_dx11dr2_HM_v18_TT: 2.827286720275879E-002
   loglike    chi-sq
    22.258    44.516  CMB: BKPLANCK = BKPlanck_detset_comb_dust
     6.079    12.157  CMB: lensing = smica_g30_ftl_full_pp
   581.392  1162.783  CMB: plik = plik_dx11dr2_HM_v18_TT
  5249.288 10498.575  CMB: lowTEB = lowl_SMW_70_dx11d_2014_10_03_v5c_Ap
Test likelihoods done, total logLike, chi-eq = 5859.141 11718.281
Expected likelihoods, total logLike, chi-eq = 5859.141 11718.282
...OK, delta = -3.248257817176636E-004
Likelihood calculation time (seconds)= 4.1661
Total time: 70 ( 0.01953 hours )

egould@compute:~/CosmoMC/cosmomc$ mpirun -np 1 ./cosmomc test_planck.ini

--------------------------------------------------------------------------
[[44843,1],0]: A high-performance Open MPI point-to-point messaging module
was unable to find any relevant network interfaces:

Module: OpenFabrics (openib)
  Host: compute

Another transport will be used instead, although this may result in
lower performance.
--------------------------------------------------------------------------
Number of MPI processes: 1
 file_root:test
 Random seeds: 2781, 16844 rand_inst: 1
Using clik with likelihood file
./data/clik/hi_l/plik/plik_dx11dr2_HM_v18_TT.clik
----
clik version 6dc2a8cf3965
  smica
Checking likelihood './data/clik/hi_l/plik/plik_dx11dr2_HM_v18_TT.clik' on test data. got -380.979 expected -380.979 (diff -8.68545e-09)
----
TT from l=0 to l= 2508
Clik will run with the following nuisance parameters:
A_cib_217
cib_index
xi_sz_cib
A_sz
ps_A_100_100
ps_A_143_143
ps_A_143_217
ps_A_217_217
ksz_norm
gal545_A_100
gal545_A_143
gal545_A_143_217
gal545_A_217
calib_100T
calib_217T
A_planck
Using clik with likelihood file
./data/clik/low_l/bflike/lowl_SMW_70_dx11d_2014_10_03_v5c_Ap.clik
forrtl: No such file or directory
forrtl: severe (29): file not found, unit 42, file /xfs1/applications/planck-2015-data/plc_2.0/low_l/bflike/lowl_SMW_70_dx11d_2014_10_03_v5c_Ap.clik/clik/lkl_0/_external/fort.42
Image              PC                Routine  Line     Source
cosmomc            0000000000661263  Unknown  Unknown  Unknown
libifcoremt.so.5   00007FBA4963365C  Unknown  Unknown  Unknown
libclik.so         00007FBA49C2963F  Unknown  Unknown  Unknown
libclik.so         00007FBA49BE6573  Unknown  Unknown  Unknown
libclik.so         00007FBA49BC2D86  Unknown  Unknown  Unknown
libclik.so         00007FBA49B8AAEF  Unknown  Unknown  Unknown
libclik.so         00007FBA49B8173F  Unknown  Unknown  Unknown
libclik_f90.so     00007FBA4EE8740F  Unknown  Unknown  Unknown
libclik_f90.so     00007FBA4EE8BBD4  Unknown  Unknown  Unknown
cosmomc            0000000000502650  Unknown  Unknown  Unknown
cosmomc            00000000004FF527  Unknown  Unknown  Unknown
cosmomc            0000000000552CD5  Unknown  Unknown  Unknown
cosmomc            000000000058CEBD  Unknown  Unknown  Unknown
cosmomc            000000000059613C  Unknown  Unknown  Unknown
cosmomc            000000000040ECDE  Unknown  Unknown  Unknown
libc.so.6          00007FBA4A108EC5  Unknown  Unknown  Unknown
cosmomc            000000000040EBD9  Unknown  Unknown  Unknown
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 3334 on
node compute exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
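For anyone hitting the same crash: the second run dies inside a Fortran open on unit 42, i.e. clik's BFLike likelihood tried to read the auxiliary file `fort.42` under the likelihood's `_external` directory and could not find it. A first sanity check, using the exact path from the `forrtl` message above (the `/xfs1/...` install root is specific to this machine, so anyone else should substitute their own), is simply to test whether that file exists and is readable:

```shell
#!/bin/sh
# Path taken from the forrtl error message above; the install root is an
# assumption specific to the machine in these logs -- adjust to your setup.
CLIK_ROOT=/xfs1/applications/planck-2015-data/plc_2.0
F=$CLIK_ROOT/low_l/bflike/lowl_SMW_70_dx11d_2014_10_03_v5c_Ap.clik/clik/lkl_0/_external/fort.42

# Report whether the file the failing binary looked for is actually there.
if [ -r "$F" ]; then
    echo "readable: $F"
else
    echo "missing or unreadable: $F"
fi
```

If `fort.42` turns out to be readable through the relative `./data/clik` symlink named in the param file but not through the absolute path in the error, the two binaries may simply be resolving the likelihood directory differently, which would be worth ruling out before suspecting the compile itself.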