I know that getdist corrects for the bias introduced by smoothing near hard parameter boundaries. But I'm still failing to see how this so significantly changes the answer I am getting in the scenario below.
Given a `getdist.mcsamples.MCSamples` object, I first obtain the smoothed 2D posterior as a `getdist.densities.Density2D` object and interpolate it to evaluate the density at a fiducial point:
```python
from scipy.interpolate import RectBivariateSpline

# boundary-corrected 2D density grid; Density2D.P is indexed as P[y, x],
# hence the transpose to match RectBivariateSpline's (x, y) axis order
getdist_Density2D = getdist_samples.get2DDensity(pars[0], pars[1], normalized=False)
interpolator = RectBivariateSpline(getdist_Density2D.x, getdist_Density2D.y, getdist_Density2D.P.T)
# posterior density at the fiducial point
Ptheta0 = interpolator.ev(fiducial_point[0], fiducial_point[1])
```
I then invert `getdist.densities.Density2D.getContourLevels()` to find the confidence level whose contour density equals `Ptheta0`, i.e. the CL of the contour passing through the fiducial point.
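To make the inversion concrete, here is a minimal sketch, assuming `getContourLevels()` takes a sequence of probability fractions and returns the matching density levels (the helper `cl_at_density` and the bracket values are illustrative, not getdist API):

```python
import numpy as np
from scipy.optimize import brentq

def cl_at_density(density, p0):
    """Probability content whose contour density level equals p0,
    i.e. the CL of the contour passing through the fiducial point."""
    # the density level decreases monotonically as the contained fraction
    # grows, so root-finding works provided p0 lies between the bracket levels
    f = lambda c: density.getContourLevels(np.atleast_1d(c))[0] - p0
    return brentq(f, 1e-5, 1.0 - 1e-5)

cl = cl_at_density(getdist_Density2D, float(Ptheta0))
print(f"fiducial point lies on the {cl:.4f} CL contour")
```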
For visual inspection, I can plot the contour at the found CL, which should pass very close to the fiducial point, looking like the attached `no_boundary_case.png`, for example.
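For reference, the plot is produced along these lines (a sketch using the standard getdist plotting API; styling details are illustrative):

```python
from getdist import plots

# usual filled 68/95% contours for orientation
g = plots.get_single_plotter()
g.plot_2d(getdist_samples, pars[0], pars[1], filled=True)
# overlay the contour passing through the fiducial point, and the point itself
ax = g.subplots[0, 0]
ax.contour(getdist_Density2D.x, getdist_Density2D.y, getdist_Density2D.P,
           levels=[float(Ptheta0)], colors='k', linestyles='--')
ax.plot(fiducial_point[0], fiducial_point[1], 'r*', markersize=10)
```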
However, this procedure fails when parameters have hard boundaries imposed by their priors, e.g. [w0, wa]. The situation then looks like the attached `with_boundary_case.png`.
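In case the setup matters: the hard boundaries enter through the `ranges` argument when the samples are constructed, which is what triggers getdist's boundary correction (a sketch; the parameter limits below are illustrative):

```python
from getdist import MCSamples

# chain: (n_samples, 2) array of (w0, wa) draws; ranges marks the hard
# prior edges so getdist can correct the smoothing bias there
getdist_samples = MCSamples(samples=chain, names=['w0', 'wa'],
                            labels=['w_0', 'w_a'],
                            ranges={'w0': [-3, 1], 'wa': [-3, 2]})
```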
FWIW, here is the Jupyter notebook with the full routine:
https://github.com/MinhMPA/getdist/blob/master/getdist_analysis_pipelines_and_tutorials/compute_significance_from_chains.ipynb
The result in the notebook corresponds to the case without boundaries.
*I know that `getdist.densities.Density2D` is supposed to support this procedure directly through the method `Prob()`, but it never works for me in 2D: it always returns `None`.
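For completeness, this is the call I mean (in my runs it returns `None` instead of the interpolated density):

```python
# expected: interpolated density at the fiducial point; observed: None
p0 = getdist_Density2D.Prob(fiducial_point[0], fiducial_point[1])
```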
Any insight would be greatly appreciated.