getdist error: importance sampling Planck chains with BICEP2

Use of Cobaya, CAMB, CLASS, CosmoMC, compilers, etc.
Post Reply
Akhilesh Nautiyal(akhi)
Posts: 72
Joined: June 13 2007
Affiliation: Malaviya National Institute of Technology Jaipur

getdist error: importance sampling Planck chains with BICEP2

Post by Akhilesh Nautiyal(akhi) » September 01 2014

Hi,

I am trying to importance-sample the Planck chains with BICEP2 data using CosmoMC.
I am able to generate the new chains, but when I run getdist on them I get the following error.

skipped unused params: omegak mnu nnu yhe Alens
reading chains/base_nrun_r/planck_lowl_lowLike_highL/base_nrun_r_planck_lowl_lowLike_highL_post_BICEP2_1.txt
error reading line 0 - skipping to next row
error reading line 0 - skipping to next row
error reading line 0 - skipping to next row
error reading line 0 - skipping to next row
error reading line 0 - skipping to next row
error reading line 0 - skipping to next row
error reading line 0 - skipping to next row
error reading line 0 - skipping to next row
error reading line 0 - skipping to next row
error reading line 0 - skipping to next row

I am using the following options with action = 1.
redo_likelihoods = F
redo_theory = F
redo_cls = F
redo_pk = F
redo_skip = 0
redo_outroot =
redo_thin = 1
redo_add = F
redo_from_text = T
#If large difference in log likelihoods may need to offset to give sensible weights
#for exp(difference in likelihoods)
redo_likeoffset = 0

When running CosmoMC with action = 1, I get the following output.
TT from l=0 to l= 32
EE from l=0 to l= 32
BB from l=0 to l= 32
TE from l=0 to l= 32
**You probably want to set redo_theory**
reading from: /home/mohanty/cosmomc_july14/cosmomc/chains/base_nrun_r/planck_lowl_lowLike_highL/base_nrun_r_planck_lowl_lowLike_highL_8.txt
writing to: /home/mohanty/cosmomc_july14/cosmomc/chains/base_nrun_r/planck_lowl_lowLike_highL/base_nrun_r_planck_lowl_lowLike_highL_post_BICEP2_8.*
Using temperature: 1.00000000000000
starting post processing
**You probably want to set redo_theory**
reading from: /home/mohanty/cosmomc_july14/cosmomc/chains/base_nrun_r/planck_lowl_lowLike_highL/base_nrun_r_planck_lowl_lowLike_highL_1.txt
writing to: /home/mohanty/cosmomc_july14/cosmomc/chains/base_nrun_r/planck_lowl_lowLike_highL/base_nrun_r_planck_lowl_lowLike_highL_post_BICEP2_1.*
Using temperature: 1.00000000000000
Initialising BBN Helium data...
Done. Interpolation table is 48 by 13
Initialising BBN Helium data...
Done. Interpolation table is 48 by 13
Initialising BBN Helium data...
Done. Interpolation table is 48 by 13
Initialising BBN Helium data...
Done. Interpolation table is 48 by 13
Initialising BBN Helium data...
Done. Interpolation table is 48 by 13
Initialising BBN Helium data...
Done. Interpolation table is 48 by 13
Initialising BBN Helium data...
Done. Interpolation table is 48 by 13
Initialising BBN Helium data...
Done. Interpolation table is 48 by 13
finished. Processed 13850 models
max weight= 1.00000000000000 min weight = 1.00000000000000
mean mult = 2.16404332129964
mean importance weight (approx evidence ratio) = 1.00000000000000
effective number of samples = 1248.83333333333
Best redo_likeoffset = 0.000000000000000E+000
finished. Processed 13785 models
max weight= 1.00000000000000 min weight = 1.00000000000000
mean mult = 2.17584330794342
mean importance weight (approx evidence ratio) = 1.00000000000000
effective number of samples = 1304.08695652174
Best redo_likeoffset = 0.000000000000000E+000
finished. Processed 13844 models
max weight= 1.00000000000000 min weight = 1.00000000000000
mean mult = 2.16411441779832
mean importance weight (approx evidence ratio) = 1.00000000000000
effective number of samples = 907.878787878788
Best redo_likeoffset = 0.000000000000000E+000
finished. Processed 13742 models
max weight= 1.00000000000000 min weight = 1.00000000000000
mean mult = 2.18847329355261
mean importance weight (approx evidence ratio) = 1.00000000000000
effective number of samples = 1503.70000000000
Best redo_likeoffset = 0.000000000000000E+000
finished. Processed 13722 models
max weight= 1.00000000000000 min weight = 1.00000000000000
mean mult = 2.18889374726716
mean importance weight (approx evidence ratio) = 1.00000000000000
effective number of samples = 1430.28571428571
Best redo_likeoffset = 0.000000000000000E+000
finished. Processed 13820 models
max weight= 1.00000000000000 min weight = 1.00000000000000
mean mult = 2.16649782923300
mean importance weight (approx evidence ratio) = 1.00000000000000
effective number of samples = 1761.23529411765
Best redo_likeoffset = 0.000000000000000E+000
finished. Processed 13853 models
max weight= 1.00000000000000 min weight = 1.00000000000000
mean mult = 2.17093770302462
mean importance weight (approx evidence ratio) = 1.00000000000000
effective number of samples = 1582.84210526316
Best redo_likeoffset = 0.000000000000000E+000
finished. Processed 13942 models
max weight= 1.00000000000000 min weight = 1.00000000000000
mean mult = 2.16052216324774
mean importance weight (approx evidence ratio) = 1.00000000000000
effective number of samples = 1255.08333333333
Best redo_likeoffset = 0.000000000000000E+000
Postprocesing done
Total time: 875 ( 0.24300 hours )

I will be grateful if someone can help me in this regard.

Thanks,
akhilesh

Antony Lewis
Posts: 1941
Joined: September 23 2004
Affiliation: University of Sussex
Contact:

Re: getdist error: importance sampling Planck chains with BICEP2

Post by Antony Lewis » September 01 2014

Did you try setting redo_theory like the warning suggests?
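
(For context: every importance weight in the log above is exactly 1, i.e. max weight = min weight = 1 and the mean importance weight is 1, which is consistent with the BICEP2 likelihood never actually being evaluated against recomputed theory spectra; if nothing is recomputed, every sample keeps weight 1 and no reweighting happens. A minimal sketch of how uneven importance weights shrink the effective sample size, using the common Kish estimator ESS = (sum w)^2 / sum(w^2) with made-up weights; this is an illustration, not CosmoMC's exact internal bookkeeping:)

```python
import numpy as np

def effective_sample_size(w):
    """Kish effective sample size: (sum w)^2 / sum(w^2)."""
    w = np.asarray(w, dtype=float)
    return w.sum() ** 2 / (w ** 2).sum()

# Uniform weights (what the log shows): the new likelihood has
# contributed nothing, and ESS equals the raw number of samples.
uniform = np.ones(1000)
print(effective_sample_size(uniform))   # 1000.0

# Uneven weights (what a real reweighting produces): ESS drops
# below the raw sample count. The weights here are hypothetical.
rng = np.random.default_rng(0)
uneven = np.exp(rng.normal(size=1000))
print(effective_sample_size(uneven))
```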

Akhilesh Nautiyal(akhi)
Posts: 72
Joined: June 13 2007
Affiliation: Malaviya National Institute of Technology Jaipur

getdist error: importance sampling Planck chains with BICEP2

Post by Akhilesh Nautiyal(akhi) » September 02 2014

Hi Antony,

Thanks for the reply. I tried with redo_theory = T, but I still have a problem: this time I get a segmentation fault when running getdist. I recompiled CosmoMC with debugging flags enabled, and running getdist then gives the following output.

akhilesh@cosmos:~/cmbsofts/cosmomc_july14/cosmomc$ python python/runGridGetdist.py chains --burn_removed

skipped unused params: omegak mnu nnu yhe Alens
producing files in directory /home/akhilesh/cmbsofts/cosmomc_july14/cosmomc/chains/base_nrun_r/planck_lowl_lowLike_highL/dist/
reading /home/akhilesh/cmbsofts/cosmomc_july14/cosmomc/chains/base_nrun_r/planck_lowl_lowLike_highL/base_nrun_r_planck_lowl_lowLike_highL_post_BICEP2_1.txt
reading /home/akhilesh/cmbsofts/cosmomc_july14/cosmomc/chains/base_nrun_r/planck_lowl_lowLike_highL/base_nrun_r_planck_lowl_lowLike_highL_post_BICEP2_2.txt
reading /home/akhilesh/cmbsofts/cosmomc_july14/cosmomc/chains/base_nrun_r/planck_lowl_lowLike_highL/base_nrun_r_planck_lowl_lowLike_highL_post_BICEP2_3.txt
reading /home/akhilesh/cmbsofts/cosmomc_july14/cosmomc/chains/base_nrun_r/planck_lowl_lowLike_highL/base_nrun_r_planck_lowl_lowLike_highL_post_BICEP2_4.txt
reading /home/akhilesh/cmbsofts/cosmomc_july14/cosmomc/chains/base_nrun_r/planck_lowl_lowLike_highL/base_nrun_r_planck_lowl_lowLike_highL_post_BICEP2_5.txt
reading /home/akhilesh/cmbsofts/cosmomc_july14/cosmomc/chains/base_nrun_r/planck_lowl_lowLike_highL/base_nrun_r_planck_lowl_lowLike_highL_post_BICEP2_6.txt
reading /home/akhilesh/cmbsofts/cosmomc_july14/cosmomc/chains/base_nrun_r/planck_lowl_lowLike_highL/base_nrun_r_planck_lowl_lowLike_highL_post_BICEP2_7.txt
reading /home/akhilesh/cmbsofts/cosmomc_july14/cosmomc/chains/base_nrun_r/planck_lowl_lowLike_highL/base_nrun_r_planck_lowl_lowLike_highL_post_BICEP2_8.txt
Number of chains used = 8
var(mean)/mean(var), remaining chains, worst e-value: R-1 = 0.00627
RL: Thin for Markov: 33
RL: Thin for indep samples: 34
RL: Estimated burn in steps: 138 (64 rows)
mean input multiplicity = 2.17237106315237
Random seeds: 28240, 8483 rand_inst: 0
forrtl: warning (402): fort: (1): In call to I/O Write routine, an array temporary was created for argument #1

forrtl: warning (402): fort: (1): In call to I/O Write routine, an array temporary was created for argument #1

forrtl: warning (402): fort: (1): In call to I/O Write routine, an array temporary was created for argument #1

forrtl: severe (408): fort: (2): Subscript #1 of the array BINSRAW has value 7 which is greater than the upper bound of -1

Image PC Routine Line Source
getdist 00000000005FABC3 mcsamples_mp_get1 1127 GetDist.f90
getdist 0000000000654789 MAIN__ 2380 GetDist.f90
getdist 00000000004043E6 Unknown Unknown Unknown
libc.so.6 00007F60BF72D76D Unknown Unknown Unknown
getdist 00000000004042D9 Unknown Unknown Unknown


The CosmoMC output this time was:

Checking likelihood './data/clik/lowlike_v222.clik' on test data. got -1007.04 expected -1007.04 (diff -1.87824e-07)
----
TT from l=0 to l= 32
EE from l=0 to l= 32
BB from l=0 to l= 32
TE from l=0 to l= 32
reading from: /home/mohanty/cosmomc_july14/cosmomc/chains/base_nrun_r/planck_lowl_lowLike_highL/base_nrun_r_planck_lowl_lowLike_highL_5.txt
writing to: /home/mohanty/cosmomc_july14/cosmomc/chains/base_nrun_r/planck_lowl_lowLike_highL/base_nrun_r_planck_lowl_lowLike_highL_post_BICEP2_5.*
Using temperature: 1.00000000000000
reading from: /home/mohanty/cosmomc_july14/cosmomc/chains/base_nrun_r/planck_lowl_lowLike_highL/base_nrun_r_planck_lowl_lowLike_highL_4.txt
writing to: /home/mohanty/cosmomc_july14/cosmomc/chains/base_nrun_r/planck_lowl_lowLike_highL/base_nrun_r_planck_lowl_lowLike_highL_post_BICEP2_4.*
Using temperature: 1.00000000000000
reading from: /home/mohanty/cosmomc_july14/cosmomc/chains/base_nrun_r/planck_lowl_lowLike_highL/base_nrun_r_planck_lowl_lowLike_highL_6.txt
writing to: /home/mohanty/cosmomc_july14/cosmomc/chains/base_nrun_r/planck_lowl_lowLike_highL/base_nrun_r_planck_lowl_lowLike_highL_post_BICEP2_6.*
Using temperature: 1.00000000000000
Using temperature: 1.00000000000000
Initialising BBN Helium data...
Done. Interpolation table is 48 by 13
Initialising BBN Helium data...
Done. Interpolation table is 48 by 13
Initialising BBN Helium data...
Done. Interpolation table is 48 by 13
finished. Processed 13785 models
max weight= 1.00000000000000 min weight = 1.00000000000000
mean mult = 2.17584330794342
mean importance weight (approx evidence ratio) = 1.00000000000000
effective number of samples = 1304.08695652174
Best redo_likeoffset = 0.000000000000000E+000
finished. Processed 13844 models
max weight= 1.00000000000000 min weight = 1.00000000000000
mean mult = 2.16411441779832
mean importance weight (approx evidence ratio) = 1.00000000000000
effective number of samples = 907.878787878788
Best redo_likeoffset = 0.000000000000000E+000
finished. Processed 13850 models
max weight= 1.00000000000000 min weight = 1.00000000000000
mean mult = 2.16404332129964
mean importance weight (approx evidence ratio) = 1.00000000000000
effective number of samples = 1248.83333333333
Best redo_likeoffset = 0.000000000000000E+000
finished. Processed 13853 models
max weight= 1.00000000000000 min weight = 1.00000000000000
mean mult = 2.17093770302462
mean importance weight (approx evidence ratio) = 1.00000000000000
effective number of samples = 1582.84210526316
Best redo_likeoffset = 0.000000000000000E+000
finished. Processed 13820 models
max weight= 1.00000000000000 min weight = 1.00000000000000
mean mult = 2.16649782923300
mean importance weight (approx evidence ratio) = 1.00000000000000
effective number of samples = 1761.23529411765
Best redo_likeoffset = 0.000000000000000E+000
finished. Processed 13742 models
max weight= 1.00000000000000 min weight = 1.00000000000000
mean mult = 2.18847329355261
mean importance weight (approx evidence ratio) = 1.00000000000000
effective number of samples = 1503.70000000000
Best redo_likeoffset = 0.000000000000000E+000
finished. Processed 13722 models
max weight= 1.00000000000000 min weight = 1.00000000000000
mean mult = 2.18889374726716
mean importance weight (approx evidence ratio) = 1.00000000000000
effective number of samples = 1430.28571428571
Best redo_likeoffset = 0.000000000000000E+000
finished. Processed 13942 models
max weight= 1.00000000000000 min weight = 1.00000000000000
mean mult = 2.16052216324774
mean importance weight (approx evidence ratio) = 1.00000000000000
effective number of samples = 1255.08333333333
Best redo_likeoffset = 0.000000000000000E+000
Postprocesing done
Total time: 1739 ( 0.48312 hours )


I will be grateful if you can help me.

Thanks,
Akhilesh

Antony Lewis
Posts: 1941
Joined: September 23 2004
Affiliation: University of Sussex
Contact:

Re: getdist error: importance sampling Planck chains with BICEP2

Post by Antony Lewis » September 02 2014

You probably want redo_cls as well.
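
(Taking the two suggestions in this thread together, the relevant postprocessing flags would be enabled like this; a sketch only, the other redo_* settings from the first post are left as they were:)

Code: Select all

                    redo_theory = T
                    redo_cls = T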

Akhilesh Nautiyal(akhi)
Posts: 72
Joined: June 13 2007
Affiliation: Malaviya National Institute of Technology Jaipur

getdist error: importance sampling Planck chains with BICEP2

Post by Akhilesh Nautiyal(akhi) » September 02 2014

After setting redo_cls = T, I get the following warning:
Checking lensing likelihood './data/clik/lensing_likelihood_v4_ref.clik_lensing' on test data. got -5.81038
lensing lmax: 2048
reading from: /home/mohanty/cosmomc_july14/cosmomc/chains/base_nrun_r/planck_lowl_lowLike_highL/base_nrun_r_planck_lowl_lowLike_highL_post_lensing_3.txt
writing to: /home/mohanty/cosmomc_july14/cosmomc/chains/base_nrun_r/planck_lowl_lowLike_highL/base_nrun_r_planck_lowl_lowLike_highL_post_BICEP2_3.*
Using temperature: 1.00000000000000
Initialising BBN Helium data...
Done. Interpolation table is 48 by 13
WARNING: NaN CL?
WARNING: NaN CL?
WARNING: NaN CL?
WARNING: NaN CL?
WARNING: NaN CL?

akhilesh

Antony Lewis
Posts: 1941
Joined: September 23 2004
Affiliation: University of Sussex
Contact:

Re: getdist error: importance sampling Planck chains with BICEP2

Post by Antony Lewis » September 04 2014

I don't know if this is relevant, but there is a bug in ImportanceSampling.f90; replace

Code: Select all

                    Params%P= BaseParams%center
with

Code: Select all

                    Params%P(:num_params)= BaseParams%center
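
(A plausible reading of why this matters, as an assumption rather than something spelled out in the thread: Params%P has more slots than there are sampled parameters, so a whole-array assignment from BaseParams%center has mismatched shapes, which with bounds checking enabled can trip exactly the kind of out-of-bounds error shown above; restricting the assignment to the first num_params elements makes the shapes conform. The same idea in a NumPy sketch, with names that are only stand-ins for the Fortran variables:)

```python
import numpy as np

num_params = 3
P = np.zeros(5)                      # stand-in for Params%P (longer array)
center = np.array([0.1, 0.2, 0.3])   # stand-in for BaseParams%center

# P[:] = center would raise a shape-mismatch error, the Python
# analogue of the invalid whole-array assignment in Fortran.
# Restricting the target to the first num_params slots conforms:
P[:num_params] = center
print(P)  # first three slots filled, the rest untouched
```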

Akhilesh Nautiyal(akhi)
Posts: 72
Joined: June 13 2007
Affiliation: Malaviya National Institute of Technology Jaipur

getdist error: importance sampling Planck chains with BICEP2

Post by Akhilesh Nautiyal(akhi) » September 04 2014

Hi Antony,

Thanks for the reply. I made that change, but I still get the same warning with redo_cls = T, and the same segmentation fault without it.

Akhilesh

Post Reply