MCMC.f90 couldn't start with BAO Likelihood
-
- Posts: 15
- Joined: April 01 2015
- Affiliation: University of Rome - Tor Vergata
MCMC.f90 couldn't start with BAO Likelihood
Dear Antony,
I have the following problem with CosmoMC:
when I fix some parameters such as omegach2 or omegabh2 in the test.ini file, CosmoMC works without any problems.
But when I try to run with these parameters fixed and the BAO likelihood enabled, CosmoMC crashes with the following error:
"MCMC.f90 couldn't start after 1000 tries - check starting ranges"
With the SNLS likelihood, for example, I have no problem!
Could you help me?
Thanks a lot in advance!
-
- Posts: 1943
- Joined: September 23 2004
- Affiliation: University of Sussex
- Contact:
Re: MCMC.f90 couldn't start with BAO Likelihood
Please give the exact change in your .ini file.
-
- Posts: 15
- Joined: April 01 2015
- Affiliation: University of Rome - Tor Vergata
MCMC.f90 couldn't start with BAO Likelihood
This is the structure of my test.ini file.
I have fixed a few parameters, and when I turn on SNLS.ini CosmoMC does not crash.
The problem appears only with the BAO likelihood.
DEFAULT(batch1/CAMspec_defaults.ini)
DEFAULT(batch1/lowLike.ini)
DEFAULT(batch1/lowl.ini)
#cmb_dataset[BICEP2]=data/BICEP/BICEP2.dataset
#for r>0 also need compute_tensors=T:
compute_tensors = T
#planck lensing
#DEFAULT(batch1/lensing.ini)
#Other Likelihoods
DEFAULT(batch1/BAODR11.ini)
#DEFAULT(batch1/HST.ini)
#DEFAULT(batch1/Union.ini)
#DEFAULT(batch1/SNLS.ini)
#DEFAULT(batch1/WiggleZ_MPK.ini)
#DEFAULT(batch1/MPK.ini)
#general settings
DEFAULT(batch1/common_batch1.ini)
#high for new runs
MPI_Max_R_ProposeUpdate = 30
#propose_matrix= planck_covmats/base_w_planck_lowl_lowLike_BAO.covmat
#Folder where files (chains, checkpoints, etc.) are stored
root_dir = chains/
#Root name for files produced
file_root=test1
#action= 0 runs chains, 1 importance samples, 2 minimizes
#use action=4 just to quickly test likelihoods
action = 0
#if you want to get theory cl for test point
#test_output_root = output_cl_root
start_at_bestfit = F
feedback=1
use_fast_slow = T
#sampling_method=7 is a new fast-slow scheme good for Planck
sampling_method = 7
dragging_steps = 3
propose_scale = 2
#Set >0 to make data files for importance sampling
indep_sample=0
#these are just small speedups for testing
get_sigma8=F
#Uncomment this if you don't want one 0.06eV neutrino by default
#num_massive_neutrinos=
#to vary parameters set param[name]= center, min, max, start width, propose width
param[omegabh2] = 0.0221
param[omegach2] = 0.12
param[theta] = 1.0411
param[w] = -1
#for PICO install from https://pypi.python.org/pypi/pypico and download data file
#cosmology_calculator=PICO
#pico_datafile = pico3_tailmonty_v34.dat
#pico_verbose=F
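For reference, the comment above the param block describes the difference between a fixed and a varied parameter: a single number fixes the parameter, while five numbers let it vary. An illustrative fragment (the range and width numbers here are example values, not a recommendation):

```ini
#fixed: a single value holds the parameter constant
param[w] = -1
#varied: center, min, max, start width, propose width
#param[w] = -1 -3 0 0.02 0.02
```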
Thank you Antony!
-
- Posts: 1943
- Joined: September 23 2004
- Affiliation: University of Sussex
- Contact:
Re: MCMC.f90 couldn't start with BAO Likelihood
Perhaps those fixed values give a background cosmology whose BAO observables fall outside the range supported by the BAO likelihood interpolation files? You can run with action=4 as a quick test.
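As a rough sanity check of this idea (this is illustrative standalone Python, not CosmoMC code), one can compute the BAO distance D_V(z) at the BOSS CMASS effective redshift z = 0.57 for the fixed omegabh2 and omegach2 values in the .ini above, and see whether it lands near the measured DR11 value. Since the .ini fixes theta rather than H0, h = 0.67 below is an assumed value:

```python
import math

C_KM_S = 299792.458                 # speed of light, km/s
H = 0.67                            # assumed reduced Hubble constant
OMEGA_M = (0.0221 + 0.12) / H**2    # from the fixed omegabh2 and omegach2
H0 = 100.0 * H                      # Hubble constant, km/s/Mpc

def E(z):
    """Dimensionless Hubble rate for flat LCDM with w = -1."""
    return math.sqrt(OMEGA_M * (1 + z)**3 + (1 - OMEGA_M))

def comoving_distance(z, steps=1000):
    """Comoving distance in Mpc, trapezoid rule over dz'/E(z')."""
    dz = z / steps
    s = 0.5 * (1.0 + 1.0 / E(z))
    for i in range(1, steps):
        s += 1.0 / E(i * dz)
    return (C_KM_S / H0) * s * dz

def d_v(z):
    """Spherically averaged BAO distance D_V(z) in Mpc (flat universe)."""
    dc = comoving_distance(z)
    return (dc**2 * C_KM_S * z / (H0 * E(z))) ** (1.0 / 3.0)

print(d_v(0.57))
```

For these inputs D_V(0.57) comes out at roughly 2070 Mpc, close to Planck-like values, so a perfectly reasonable fixed cosmology; if one of the fixed parameters were badly off, D_V (or the sound horizon entering D_V/r_d) could leave the tabulated range of the likelihood's interpolation files, which is exactly what action=4 would reveal quickly.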
-
- Posts: 15
- Joined: April 01 2015
- Affiliation: University of Rome - Tor Vergata
MCMC.f90 couldn't start with BAO Likelihood
I have tried various runs excluding cold dark matter, and in those cases CosmoMC operates smoothly.
So perhaps it is a compatibility-range problem, as you say!
Thank you again for the suggestion and support, Antony!