## CosmoMC: .data files and post-processing

Hiranya Peiris
Posts: 54
Joined: September 28 2004
Affiliation: University College London

### CosmoMC: .data files and post-processing

Hi Antony,

Is there a way to produce thinned .data files after the fact from converged chains, once you know the correlations and the appropriate thinning factor? (there wasn't a .data file produced during the original runs). And could you please give me an example of the "redo" type input options that are used in post-processing, for example in a case where you wanted to remove a dataset and replace it by another (say remove CBI/ACBAR/VSA and replace with B03)?

Thanks!
Hiranya

Antony Lewis
Posts: 1591
Joined: September 23 2004
Affiliation: University of Sussex
Contact:

### Re: CosmoMC: .data files and post-processing

You need .data files from the original runs to use redo (set indep_sample to a non-zero value); the .data files are thinned by the indep_sample factor.
Redo will then produce additionally thinned versions of the .data files according to your redo_thin parameter. Usually, though, you can just use all the indep_sample samples unless things are very slow for some reason.
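For concreteness, a minimal sketch of the relevant setting in the original run's .ini file (the value is illustrative, not a recommendation):

```ini
# Write every 10th accepted sample to the chain's .data file.
# 10 is illustrative; pick a factor somewhat below the chain's
# correlation length so the stored samples are roughly independent.
indep_sample = 10
```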

To redo, set action=1; to change data only (without recalculating the theory spectra) you'd have e.g.

Code: Select all

redo_likelihoods = T
redo_theory = F
redo_cls = F
redo_pk = F
redo_skip = 0
redo_outroot = newchains
redo_thin = 1

(obviously you also need to change which data sets are used in the .ini file). The newchains chains will then just be the original chains importance sampled to whatever new dataset combination you have chosen.
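As a hedged sketch of the dataset swap itself (the cmb_numdatasets/cmb_datasetN parameter names and the file names below are illustrative assumptions; check them against the params.ini shipped with your CosmoMC version):

```ini
# Original run used e.g. three small-scale CMB datasets:
#   cmb_numdatasets = 3
#   cmb_dataset1 = data/CBI.newdat      (illustrative file names)
#   cmb_dataset2 = data/ACBAR.newdat
#   cmb_dataset3 = data/VSA.newdat
# For the redo run, list B03 instead:
cmb_numdatasets = 1
cmb_dataset1 = data/B03.newdat
```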

Hiranya Peiris
Posts: 54
Joined: September 28 2004
Affiliation: University College London

### CosmoMC: .data files and post-processing

Thanks for that. So there is no way to make the .data files from an existing chain (without redo)? I didn't make them in the original run. It must be pretty simple to hack together code that reads in a chain and outputs .data files for the models in that chain. Can you please point me to the subroutine which writes the file?

Antony Lewis
Posts: 1591
Joined: September 23 2004
Affiliation: University of Sussex
Contact:

### Re: CosmoMC: .data files and post-processing

There's no code to go from plain text chain files to .data files.

You could hack something together, but it's probably not as easy as re-running your original chain! See WriteModel in cmbtypes.f90.

Hiranya Peiris
Posts: 54
Joined: September 28 2004
Affiliation: University College London

### CosmoMC: .data files and post-processing

So to produce the .data file with no thinning, I set indep_sample=1, right?

Can you please give an example of where redo_likeoffset would need to be greater than 0? Do you mean a large difference between the log-likelihoods logL_A and logL_B of a given model in the initial chains (data set A) and the post-processed chains (data set B)?

Thanks!

Antony Lewis
Posts: 1591
Joined: September 23 2004
Affiliation: University of Sussex
Contact:

### Re: CosmoMC: .data files and post-processing

Hiranya Peiris wrote: So to produce the .data file with no thinning, I set indep_sample=1, right?
Yes, though usually you want to thin a bit to save time and disk space (the samples are very correlated, so thinning by a factor less than the correlation length should be fine).
Hiranya Peiris wrote: Can you please give an example of where redo_likeoffset would need to be greater than 0? You mean a large difference in log likelihoods logL_A and logL_B of a given model in the initial chains (data set A) and the post processed chains (data set B)?
Yes. It's very obvious when it's needed, as the output weights will come out tiny (or huge) if it is wrong.
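The role of the offset can be seen in a short numerical sketch (plain Python, not CosmoMC code; the likelihood values are made up). Importance weights go as exp(-ΔlogL), so if the new dataset's log-likelihoods differ from the old ones by a large, roughly constant amount, subtracting an offset keeps the weights in a sensible floating-point range without changing their ratios:

```python
import math

# Hypothetical -log(likelihood) values for three chain samples under the
# old dataset (A) and the new dataset (B); B differs by ~500 overall.
neglogL_A = [100.0, 102.0, 101.5]
neglogL_B = [600.0, 603.0, 601.0]

def importance_weights(a, b, offset=0.0):
    # weight ~ exp(-(neglogL_B - neglogL_A - offset))
    return [math.exp(-(lb - la - offset)) for la, lb in zip(a, b)]

# Without an offset, every weight is vanishingly small (~exp(-500)).
raw = importance_weights(neglogL_A, neglogL_B)

# With an offset near the typical difference, weights are O(1).
adj = importance_weights(neglogL_A, neglogL_B, offset=500.0)

# The offset rescales all weights by the same constant factor, so
# relative weights (and importance-sampled means) are unchanged.
```

Since only weight ratios matter for importance sampling, the offset is purely a numerical convenience, which is why a badly chosen value announces itself as uniformly tiny or huge output weights.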

Hiranya Peiris
Posts: 54
Joined: September 28 2004
Affiliation: University College London

### CosmoMC: .data files and post-processing

So I created some new chains with .data files and am trying it out. I set redo_likeoffset to the difference between the maximum-likelihood models of the two data sets.

The post-processing stops after one model, giving me a warning about the setting of redo_likeoffset. It's clear that some already-bad models will be ruled out by the added data, which now gives them zero weight, so why does it give this error and stop after one model? Any thoughts on a better way to set redo_likeoffset?

Antony Lewis
Posts: 1591
Joined: September 23 2004
Affiliation: University of Sussex
Contact:

### Re: CosmoMC: .data files and post-processing

It should only give a warning about offsets after processing the whole chain, so this suggests there is some problem with your .data files or settings (e.g. skipping all of the models or something). Or your whole chain is getting importance weights of zero (e.g. models ruled out by a new prior).

Hiranya Peiris
Posts: 54
Joined: September 28 2004
Affiliation: University College London

### CosmoMC: .data files and post-processing

I set indep_sample=1 and redo_thin=1. I also asked it to skip 100 models for burn-in. It says it has only processed one model, yet the size of the .data files vs the .txt files indicates all models were written into the .data files. What other option could I have set wrong that would make it skip the entire chain?

Antony Lewis
Posts: 1591
Joined: September 23 2004
Affiliation: University of Sussex
Contact:

### Re: CosmoMC: .data files and post-processing

Unless you have fewer than 100 models in the .data files, I'm not sure. You'll have to investigate what's going on, e.g. by stepping through the lines in postprocess.f90.

Hiranya Peiris
Posts: 54
Joined: September 28 2004
Affiliation: University College London

### CosmoMC: .data files and post-processing

Hi Antony,

Just wanted to let you know that the problem has been fixed: it was due to the likelihood routine accessing some logical unit numbers (LUNs) that were already in use by CosmoMC. Thanks for your help.

Cheers
Hiranya

gongbo zhao
Posts: 69
Joined: January 04 2005
Affiliation: ICG, Portsmouth
Contact:

### CosmoMC: .data files and post-processing

Dear all,
I am now facing the same problem as Hiranya: the post-processing stops after one model. I have many chains running on a supercomputer, so I wonder if there is an easy way to solve the problem without killing the chains to free the LUNs?

Cheers,
Gong-Bo Zhao