general CosmoMC question

Use of Cobaya, CAMB, CLASS, CosmoMC, compilers, etc.
Brian Powell
Posts: 18
Joined: August 08 2006
Affiliation: IPMU, U. of Tokyo

general CosmoMC question

Post by Brian Powell » September 19 2007

Hello

I am trying to constrain a base model to which I have added a couple of fast parameters. Being unsure of the distributions of these new parameters, and being without MPI, I ran 4 test chains with a covmat with 0 entries for the new params. The plan was to let these chains run for a while, then generate a new covmat from them. I have a couple of questions about this:
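For reference, a CosmoMC .covmat is a plain-text file: a header line of parameter names preceded by #, followed by the symmetric covariance matrix. Zeroing the rows and columns of the new parameters looks like this (the parameter names here are hypothetical, not from the actual run):

```
# omegabh2   omegach2   myparam1   myparam2
  1.2e-07    3.1e-06    0          0
  3.1e-06    1.5e-04    0          0
  0          0          0          0
  0          0          0          0
```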

1) I stopped the chains after about a week of computing (1 cpu/chain) with about 22,000 lines in the .txt files. Some of the new parameters had poor R-1 values (worst being around 3). When I plot the mean likelihoods of the parameters, I notice that many (even the seemingly well converged ones with R-1 < 0.2) seem to get cut off on either side of the maximum at modest values of the likelihood (around 0.2). Is this normal?
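As a rough cross-check of the R-1 values that getdist reports, here is a minimal sketch of the Gelman-Rubin statistic for a single parameter from weighted chains. It assumes the usual CosmoMC chain layout (first column of each .txt file is the sample multiplicity); the file paths and column indices you'd use to load the data are up to you:

```python
import numpy as np

def weighted_mean(w, x):
    return np.sum(w * x) / np.sum(w)

def weighted_var(w, x):
    m = weighted_mean(w, x)
    return np.sum(w * (x - m) ** 2) / np.sum(w)

def r_minus_1(chains):
    """Gelman-Rubin R-1 for one parameter.

    chains: list of (weights, samples) pairs, one pair per chain.
    Returns (variance of chain means) / (mean of chain variances);
    values well below ~0.1 indicate the chain means agree.
    """
    means = np.array([weighted_mean(w, x) for w, x in chains])
    variances = np.array([weighted_var(w, x) for w, x in chains])
    return means.var(ddof=1) / variances.mean()
```

To use it you would do something like `data = np.loadtxt("myrun_1.txt")`, take `weights = data[:, 0]`, and pick the parameter column from the .paramnames ordering (parameters start at column 2, after weight and -log(likelihood)).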

2) Seeing as some of the parameters did not converge well in this time frame, can the new covmat be trusted?
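One way to sanity-check a covmat generated from imperfectly converged chains is to recompute it directly with numpy's weighted covariance. A minimal sketch (how much burn-in to discard before calling it, and the chain filenames, are left as assumptions):

```python
import numpy as np

def chains_to_covmat(samples, weights):
    """Weighted parameter covariance from MCMC samples.

    samples: (N, nparams) array of parameter values (concatenated
             chains, with burn-in rows already removed),
    weights: (N,) multiplicities (first column of the chain .txt files).
    """
    return np.cov(samples.T, aweights=weights)
```

The result can be written with `np.savetxt`, using a header line of parameter names so it matches the .covmat format above.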

3) Since each chain converged to the same max likelihood, I'm quite confident that this is the global maximum. In order to save time on burn-in, I was thinking about restarting my new run with each chain starting out at the best fit point. I have set the start widths to 0.0 in params.ini, but for some reason the chains don't seem to respect this, starting at different places in the parameter space.
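For reference, the start width is the fourth number on each param line in params.ini, with the propose width fifth (the parameter name and values below are hypothetical):

```
#                 center   min    max    start width   propose width
param[myparam] =  0.05     0.0    0.1    0.0           0.01
```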

I would vastly appreciate any help!

Thanks
Brian

Antony Lewis
Posts: 1941
Joined: September 23 2004
Affiliation: University of Sussex
Contact:

Re: general CosmoMC question

Post by Antony Lewis » September 24 2007

Firstly, you may be able to use MPI even on a single-processor machine (it will just be slow). This would automate the solution to most of your problems.

The likelihood should not be cut off unless you have some hard priors. The covmat does not have to be accurate to be useful.

Unless you are continuing after a checkpoint, the propose width should be used to set the starting point - not sure why you're not seeing that.

Brian Powell
Posts: 18
Joined: August 08 2006
Affiliation: IPMU, U. of Tokyo

general CosmoMC question

Post by Brian Powell » September 29 2007

Thanks for the reply. I will try to get MPI running on my single-CPU machine.
I have been trying to constrain a couple of additional parameters that have very small prior ranges (the values are unphysical outside these ranges).
I have started a new set of chains running and have some general and probably stupid questions:

1) I tried starting this new run with the covmat from my old run. The step-sizes for some of my new parameters are quite large with the new covmat, and the chains all seem to get stuck quite far away from the best-fit point with acceptance rates < 10^-2. If I restart the chains without the covmat but instead set smaller standard deviations in the params.ini file, the chains converge to the best fit point quickly. Each chain has around 30K rows in the txt files, with an acceptance rate of around 0.18. Is it OK to simply abandon the covmat like this? Is this current acceptance rate too high?
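The acceptance rate can be read straight off the chain files: each row of a CosmoMC .txt file is a distinct accepted point, and its first column is the number of steps spent there. A sketch, assuming no thinning:

```python
import numpy as np

def acceptance_rate(weights):
    """Fraction of proposals accepted: distinct points / total steps.

    weights: first column of a CosmoMC chain .txt file
             (the multiplicity of each accepted point).
    """
    return len(weights) / np.sum(weights)
```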

2) Without running with MPI, in general how does one know when to stop the chains? I've been periodically checking on them by running getdist on the chains and looking at the R-1 values of the eigenvalues. Many are still quite large, and when I plot the 1D distributions, the parameter that has converged least is multi-peaked and looks pretty nasty. My question is whether I might expect the convergence to improve by letting the chains run a bit longer, or whether the poor convergence is because this parameter is highly correlated with others, poorly constrained, or simply has a crazy posterior.

Thank you very much for any help.

Brian
