Prior on tau?
- Posts: 60
- Joined: September 25 2004
- Affiliation: ITA, University of Oslo
Prior on tau?
I'm probably a couple of years late compared to everybody else, but here goes anyway :-)
I'm wondering about the prior on tau used by the WMAP team in their first-year analysis. Spergel et al. write in Section 2:
"We assume a flat prior in tau, the optical depth, but bound tau < 0.3. This prior has little effect on the fits but keeps the Markov chain out of unphysical regions of parameter space."
I don't understand this -- what does "unphysical regions" mean here? As an example, the best-fit value used in the computation of the running-index model,
http://lambda.gsfc.nasa.gov/data/map/po ... yr1_v1.txt,
seems to be 0.265 (exp(-2 tau) = 0.58862), which to me looks dangerously close to 0.3. What do most people think about this issue?
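As a quick sanity check on the number quoted above (a minimal sketch; 0.58862 is the exp(-2 tau) value from the linked WMAP table):

```python
import math

# The WMAP table reports the reionization suppression factor exp(-2*tau);
# invert it to recover the corresponding best-fit optical depth.
suppression = 0.58862          # exp(-2*tau) from the yr1 running-index table
tau = -0.5 * math.log(suppression)
print(round(tau, 3))           # -> 0.265, indeed close to the tau < 0.3 cutoff
```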
- Posts: 144
- Joined: September 24 2004
- Affiliation: University College London (UCL)
Prior on tau?
Hi,
I had the impression that although WMAP alone allows tau > 0.3, the WMAP team doesn't think this is possible, because otherwise they would have seen a lot of power in the EE spectrum (which they don't see).
But this is just a rumour I heard and I haven't seen it officially confirmed anywhere.
I agree it would be nice to have better justification for using the tau<0.3 prior.
Sarah
Prior on tau?
I imagine that, because of the degeneracy between the optical depth and the primordial tilt, they're using the prior as a way to keep the chain from wandering off into weird regions of parameter space. With a little effort you could probably make a model with tau = 5 that's not really that poor a fit to the WMAP TT data (though still not a great one). But you'd need an enormous tilt (n_s of a few) and a completely crazy sigma_8 at the very least. That's probably what they mean by "unphysical".
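To put a rough number on how extreme that compensation would have to be: reionization suppresses the small-scale TT power by exp(-2 tau), so keeping the observed amplitude fixed while raising tau requires boosting the primordial amplitude A_s, and with it sigma_8. A minimal sketch of that scaling (illustrative only, ignoring the tilt compensation, not a fit):

```python
import math

def amplitude_boost(tau):
    """Factor by which the primordial amplitude A_s must grow so that
    A_s * exp(-2*tau), the suppressed small-scale TT amplitude, stays fixed."""
    return math.exp(2.0 * tau)

for tau in (0.17, 0.3, 5.0):
    boost_as = amplitude_boost(tau)
    boost_sigma8 = math.sqrt(boost_as)   # sigma_8 scales roughly as sqrt(A_s)
    print(f"tau = {tau}: A_s boost ~ {boost_as:.3g}, sigma_8 boost ~ {boost_sigma8:.3g}")
```

For tau = 5 the required A_s boost is about e^10 ~ 2e4, i.e. sigma_8 inflated by a factor of order 150, which is the sense in which such a model is "completely crazy".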
- Posts: 29
- Joined: September 25 2004
- Affiliation: Institut d'Estudis Espacials de Catalunya
Prior on tau?
I thought there was a physical motivation to discard tau > 0.3, based on LSS observations at high z (it is difficult to devise a reionization scenario with such a tau that is compatible with a number of observables).
On the parameter estimation front, as far as we saw (astro-ph/0405589), the prior tau < 0.3 does have an impact on the best-fit cosmological parameters. Removing the prior and marginalising over the 6D (vanilla LCDM) parameter space gives tau = 0.2 and h = 0.75, significantly higher values than those obtained with the prior on tau.
Moreover, the likelihoods in that case show a double-peak structure, though the "second" peak has a lower amplitude (see astro-ph/0310723 in the context of SDSS+WMAP).
- Posts: 119
- Joined: March 02 2005
- Affiliation: University of Helsinki
Prior on tau?
I have also wondered about the optical depth. Apart from the CMB data, there are constraints from astrophysics, which seem to prefer a low value of tau. For example, astro-ph/0207591 gives 0.18 as the maximum value under assumptions which are called "very conservative".
I am not familiar with the astrophysics, and so don't know how reliable these bounds are. But taking them seriously (for example, by imposing a Gaussian prior or a strict upper limit on tau, as is done for H_0) would affect the quality of the LCDM fit to the CMB data. This is probably one reason why they are not taken seriously. (If the limit agreed with WMAP, this would be presented as a demonstration of concordance; as there is tension, it is presented at most as unknown systematics!)
On the other hand, allowing a large range for tau opens up new regions of parameter space and so complicates the analysis. I am most familiar with isocurvature models. In astro-ph/0407304 (page 12, fig.16) both upper limits 0.3 and 0.5 are used with various isocurvature models. In both cases, the probability accumulates at the upper end, so that no limit on tau is obtained.
With Francesc Ferrer and Jussi Valiviita, we've fit our isocurvature model discussed in astro-ph/0407300 to CMB and LSS data and have found that the best-fit region is in fact at high tau. (The work is yet unpublished due to the problems of the bimodal distribution, and other analysis issues discussed in astro-ph/0412439.)
It would be interesting to hear if there's a good argument for putting the limit on tau at 0.3.
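For concreteness, the two ways of taking the astrophysical bounds seriously mentioned above differ only in the prior term added to the log-likelihood. A minimal sketch (the hard bound 0.18 is the "very conservative" maximum quoted from astro-ph/0207591; the Gaussian mean and width are purely illustrative assumptions):

```python
import math

def log_prior_hard(tau, tau_max=0.18):
    # Strict upper limit, as is done for H_0: zero prior weight above the bound.
    return 0.0 if 0.0 <= tau < tau_max else -math.inf

def log_prior_gauss(tau, mean=0.1, sigma=0.05):
    # Gaussian prior: penalizes, but never forbids, large tau.
    if tau < 0.0:
        return -math.inf
    return -0.5 * ((tau - mean) / sigma) ** 2

print(log_prior_hard(0.25))    # -inf: the sample is rejected outright
print(log_prior_gauss(0.25))   # ~ -4.5: heavily down-weighted, not excluded
```

In an MCMC run, either term is simply added to the data log-likelihood at each step, so the hard cut removes the high-tau region entirely while the Gaussian merely makes it expensive.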
- Posts: 129
- Joined: September 24 2004
- Affiliation: University of Rome
Prior on tau?
Hi,
The point is that in order to have tau > 0.3 you need "standard" reionization happening at z > 30, which is too high a redshift, since you need structure formation to reionize the universe. This of course assumes LCDM (but that is what is assumed in the WMAP analysis).
- Posts: 119
- Joined: March 02 2005
- Affiliation: University of Helsinki
Re: Prior on tau?
Alessandro Melchiorri wrote: The point is that in order to have tau > 0.3 you need "standard" reionization happening at z > 30, which is too high a redshift, since you need structure to reionize the universe.

How does one get the number 0.3? (The astrophysical papers I have looked at would put the maximum at a lower value.)
- Posts: 129
- Joined: September 24 2004
- Affiliation: University of Rome
Prior on tau?
You assume instantaneous reionization from a redshift z and then compute the integral for the optical depth (as CMBFAST does).
You can look at astro-ph/9812125 for example.
cheers
Ale
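The calculation Ale describes can be sketched in a few lines: tau = sigma_T * c * integral of n_e(z) (1+z)^2 / H(z) dz up to z_re, with n_e(z) = n_e0 (1+z)^3 for instantaneous, complete reionization. A rough sketch of that CMBFAST-style integral, not a precision calculation; the flat-LCDM defaults (Omega_m = 0.3, Omega_b h^2 = 0.024, h = 0.7) are illustrative assumptions, and helium is neglected:

```python
import math

SIGMA_T = 6.652e-29      # Thomson cross-section [m^2]
C = 2.998e8              # speed of light [m/s]
M_P = 1.673e-27          # proton mass [kg]
RHO_CRIT_H2 = 1.878e-26  # critical density / h^2 [kg/m^3]

def tau_instant(z_re, omega_m=0.3, omega_b_h2=0.024, h=0.7, x_h=0.76, n=10000):
    """Optical depth for instantaneous, complete hydrogen reionization at z_re,
    assuming a flat LCDM background (helium neglected)."""
    h0 = h * 3.241e-18                            # H_0 in s^-1 (100 km/s/Mpc)
    n_e0 = x_h * omega_b_h2 * RHO_CRIT_H2 / M_P   # free electrons per m^3 today
    omega_l = 1.0 - omega_m
    dz = z_re / n
    total = 0.0
    for i in range(n):                            # midpoint rule on dtau/dz
        z = (i + 0.5) * dz
        hub = h0 * math.sqrt(omega_m * (1 + z) ** 3 + omega_l)
        total += (1 + z) ** 2 / hub * dz
    return SIGMA_T * C * n_e0 * total

for z_re in (6, 17, 30):
    print(f"z_re = {z_re}: tau ~ {tau_instant(z_re):.3f}")
```

With these defaults tau comes out around 0.04 at z_re = 6, around 0.16 at z_re = 17, and around 0.37 at z_re = 30, so in this simple setup tau > 0.3 does indeed push the reionization redshift up toward z ~ 30.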
- Posts: 119
- Joined: March 02 2005
- Affiliation: University of Helsinki
Re: Prior on tau?
Alessandro Melchiorri wrote: You assume instantaneous reionization from a redshift z and then compute the integral for the optical depth (as CMBFAST does). You can look at astro-ph/9812125 for example.

That paper discusses a specific model with a fixed baryon number, adiabatic perturbations and so on. As noted, there are explicit examples of models where tau can be much higher.

You wrote that

The point is that in order to have tau > 0.3 you need "standard" reionization happening at z > 30, which is too high a redshift, since you need structure to reionize the universe.

Can one show this in a way that is independent of the assumed power spectrum and cosmological parameters?
- Posts: 16
- Joined: November 06 2004
- Affiliation: CITA
Prior on tau?
No.
If I understand correctly, tau<0.3 is just a way of (weakly) combining WMAP with other observations without quite saying that you are (i.e., you know that adding any one of many other observations will rule out this part of parameter space, so you feel safe in dropping it).
- Posts: 129
- Joined: September 24 2004
- Affiliation: University of Rome
Re: Prior on tau?
Syksy Rasanen wrote: Can one show this in a way that is independent of the assumed power spectrum and cosmological parameters?

Not in principle, but if you look at Fig. 1 of the paper I mentioned, you see that the dependence on Omega_Lambda is rather small. Very blue spectral indices (n > 1.2) are excluded by LSS data. So, in the framework of the LCDM models considered by the WMAP team, I think tau < 0.3 is a conservative prior. I don't think the picture would change much if you added an isocurvature component (which is constrained by BBN anyway).
ciao
ale