CosmoCoffee Forum Index
[astro-ph/0511647] Apparent Hubble acceleration from large-scale electroweak domain structure
Authors: Tommy Anderberg
Abstract:The observed luminosity deficit of Type Ia supernovae (SNe Ia) at high redshift z can be explained by partial conversion to weak vector bosons of photons crossing large-scale electroweak domain boundaries, making Hubble acceleration only apparent and eliminating the need for a cosmological constant > 0.

Tommy Anderberg
Joined: 24 Nov 2005 | Posts: 47 | Affiliation: independent
Posted: November 24 2005

An idea which hit me while reading Penrose, back in July. The paper has been bouncing between a growing number of referees ever since - to date without resulting in a single report!

So I thought I'd finally scare out some comments in the open by posting it. Come on, hit me. ;)
Anze Slosar
Joined: 24 Sep 2004 | Posts: 205 | Affiliation: Brookhaven National Laboratory
Posted: November 24 2005

But we have a lot of independent clues about the cosmological constant / dark energy, e.g. flatness from the CMB plus either Omega_m = 0.3 or the Hubble constant, the CMB peaks plus the baryonic peak in the LSS, etc. So any effect that would dim supernovae but doesn't explain acceleration will bring more problems than it solves... Besides, the very high-redshift supernovae show a trend towards "less dimming", due to the transition from matter to lambda domination, exactly as predicted by the standard model, which makes most dimming schemes fairly fine-tuned (see e.g. grey dust in Riess et al.).

This is not saying anything about the physics in your paper, which I cannot judge... I am just saying that schemes involving dimming of SNe are in general difficult to bring into agreement with all the data.
Tommy Anderberg
Posted: November 24 2005

Ah, excellent! Thank you for your reply. Let's see if I can stir things up a bit now... :)

Anze Slosar wrote:
But we have a lot of independent clues about the cosmological constant / dark energy


My outsider's view is that SNe Ia are the linchpin of cosmic acceleration; the rest is data fitting involving a large number of parameters, showing at best that consistency is achievable within the chosen framework (more or less; lately alarmingly less, as pointed out e.g. in astro-ph/0511628). Speaking of independent evidence, as is often done in the literature, goes too far in my opinion.

Anze Slosar wrote:
any effect that would dim supernovae but doesn't explain acceleration


I'm afraid I'm suggesting that there really is no acceleration.

Anze Slosar wrote:
the very high-redshift supernovae show a trend towards "less dimming"


When interpreted within the framework of the concordance model (and disregarding, for the sake of discussion, the smallness of the available sample).

What happens if instead you have a discrete set of randomly spaced domain boundaries causing the dimming? Supernovae located close to the boundaries will seem fainter than they "should" be, while those located further away will seem brighter.
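As a toy illustration of the geometry (my own sketch, not the paper's calculation, with an assumed loss fraction per crossing): boundaries thrown down at random distances make the dimming a step function of distance, so a smooth fit through such data mimics gradual "acceleration" while individual sources scatter around it depending on where they sit relative to the nearest boundary.

```python
import random
import math

random.seed(42)

# Toy model: domain boundaries at random comoving distances; each
# crossing removes an assumed fixed fraction F of the photons.
F = 0.05                                                   # assumed loss per crossing
boundaries = sorted(random.uniform(0, 5000) for _ in range(10))  # Mpc, assumed spacing

def crossings(d):
    """Number of boundaries between us (d = 0) and a source at distance d."""
    return sum(1 for b in boundaries if b < d)

def extra_dimming_mag(d):
    """Extra distance modulus from discrete photon loss along the line of sight."""
    surviving = (1.0 - F) ** crossings(d)
    return -2.5 * math.log10(surviving)

# The dimming is flat between boundaries and jumps by
# -2.5*log10(1 - F) magnitudes at each crossing: a staircase, not a
# smooth curve, so residuals relative to a smooth fit alternate in sign.
for d in (500, 1500, 3000, 4500):
    print(f"d = {d:4d} Mpc: {crossings(d)} crossings, "
          f"extra dimming = {extra_dimming_mag(d):.3f} mag")
```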

Anze Slosar wrote:
most dimming schemes fairly fine-tuned (see e.g. grey dust in Riess et al.)


I am aware of them, and do refer to them in my paper. They are fundamentally different from the domain picture.

Anze Slosar wrote:
schemes involving dimming of SN are in general difficult to bring in agreement with all data


I know, but I am not suggesting that we just explain away supernova dimming and keep the rest as is. Those pesky domain boundaries affect pretty much everything. They are "bad" for matter crossing them in a way reminiscent of the Ghostbusters definition of "bad" ("Try to imagine all life as you know it stopping instantaneously and every molecule in your body exploding at the speed of light"). They eat photons and spit out neutrinos, reprocessing CMB photons well after they are supposed to have had their "last scattering". And they have gravitational effects, bending light (remember http://scitation.aip.org/prl/covers/95_7.jsp ? It came out only days after I was done LaTeXing the paper) and presumably affecting structure formation.

I know, this probably sounds like a hellish proposition: rework everything. But believe it or not, I think I'm pointing to the least painful way out of the Very Big Problem raised by Penrose. The alternatives which I can think of are actually much worse. I can get back to that in a subsequent post, if anyone's interested (I hope so!).
Kate Land
Joined: 27 Sep 2004 | Posts: 29 | Affiliation: Oxford University
Posted: November 25 2005

Ok, no acceleration. But we still have \Omega_{total}\sim1 and \Omega_{m}\sim0.3.
Where is the rest?
Garth Antony Barber
Joined: 19 Jul 2005 | Posts: 71 | Affiliation: Published independent
Posted: November 25 2005

Kate Land wrote:
Ok, no acceleration. But we still have \Omega_{total}\sim1 and \Omega_{m}\sim0.3.
Where is the rest?

\Omega_{total}\sim1 is based on the WMAP data being consistent with a spatially flat universe; however, the data are also consistent with a conformally flat one, since conformal transformations are angle-preserving. In that case it may be that \Omega_{m}\sim0.3 is all there is.
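For reference (my gloss on the point, not Garth's derivation): the spatially flat FRW metric is itself conformally flat once written in conformal time, which is why purely angular observations cannot separate the two cases without independent distance information.

```latex
% Spatially flat FRW metric in cosmic time t:
%   ds^2 = -dt^2 + a^2(t)\,(dx^2 + dy^2 + dz^2).
% Switching to conformal time, d\eta = dt/a(t):
\[
  ds^2 = a^2(\eta)\left(-d\eta^2 + dx^2 + dy^2 + dz^2\right)
       = a^2(\eta)\,\eta_{\mu\nu}\,dx^\mu dx^\nu ,
\]
% i.e. Minkowski space up to the conformal factor a^2(\eta).
% Conformal rescalings preserve angles, so angle-based tests alone
% are insensitive to the factor a^2(\eta).
```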
Tommy Anderberg
Posted: November 25 2005

I was just going to suggest that we go through the assumptions leading up to the conclusion \Omega_{total}\sim1, but I see I was beaten to the punch. :-)

Perhaps I can offer a generalization though: CMB data are currently interpreted under the assumption that they are a "fossil" picture of decoupling, reaching us essentially unchanged from the (conventionally defined) surface of last scattering. In the presence of domain boundaries which keep interacting with photons well after decoupling from the cosmic medium, this assumption is no longer valid.

Which may help explain things like astro-ph/0510160.
Pier Stefano Corasaniti
Joined: 11 Nov 2004 | Posts: 45 | Affiliation: LUTH, Observatoire de Paris-Meudon
Posted: November 25 2005

Cosmology is indeed a matter of consistency, and moving something here implies moving something else elsewhere. This is true for the standard cosmological paradigm, and it is true for any other paradigm you can imagine, the "domain universe" included.

As you suggest, if you want no acceleration from the SNe and a flat universe, you need to reinterpret the CMB as a non-cosmic background. Then it is a matter of choosing the simplest "sensical" scenario able to account for the majority of the observations with minimal effort, a sort of application of the usual Occam's razor.

In such a case I would start by asking how you are going to get a nice oscillatory pattern in the CMB power spectrum, with peaks at angles which are multiples of the same scale. Then you have to tell me where that scale comes from. In the standard cosmological scenario it has a natural explanation and needs no ad hoc mechanism: it is just the scale over which sound waves can propagate in a barotropic fluid up to photon freeze-out. And I could compute this scale without needing to know all the cosmological parameters.

"Strangely" enough, this scale shows up in the galaxy-galaxy power spectrum as well. Again the standard scenario offers a unique, natural explanation without ad hoc mechanisms: it is simply a remnant of the fact that more galaxies formed on scales where the baryons were more compressed than on those where they were not, and those scales are set by the sound horizon at decoupling.

Another "strangely enough": consistency between the position of the CMB peaks and the cross-correlation between CMB temperature maps and galaxy surveys gives evidence for a positive signal that is naturally explained within the standard scenario as the ISW effect caused by dark energy (i.e. this evidence is independent of the SNe). If the ISW correlation were caused not by dark energy but, for instance, by negative curvature of space-time, it would be inconsistent with the position of the CMB peaks.
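That scale really can be computed from a one-line integral. A rough numerical sketch (textbook physics; the parameter values are assumed here, not taken from the thread): integrate the baryon-photon sound speed over conformal time up to decoupling.

```python
import math

# Assumed cosmological parameters (illustrative, not fitted):
c     = 299792.458          # speed of light, km/s
H0    = 70.0                # Hubble constant, km/s/Mpc
h     = H0 / 100.0
Om    = 0.30                # matter density
Ob    = 0.045               # baryon density
Og    = 2.47e-5 / h**2      # photon density
Orad  = 4.15e-5 / h**2      # photons + massless neutrinos
zstar = 1090.0              # decoupling redshift

def H(z):
    """Hubble rate; Lambda is negligible at these redshifts."""
    return H0 * math.sqrt(Om * (1 + z)**3 + Orad * (1 + z)**4)

def cs(z):
    """Baryon-photon sound speed c/sqrt(3(1+R)), R = 3 rho_b / 4 rho_gamma."""
    R = (3.0 * Ob / (4.0 * Og)) / (1 + z)
    return c / math.sqrt(3.0 * (1.0 + R))

# Comoving sound horizon: r_s = integral from z* to infinity of c_s/H dz,
# done on a log grid in (1+z) with dz = (1+z) d ln(1+z).
n, zmax = 4000, 1e8
lo, hi = math.log(1 + zstar), math.log(1 + zmax)
dlna = (hi - lo) / n
rs = 0.0
for i in range(n):
    z = math.exp(lo + (i + 0.5) * dlna) - 1
    rs += cs(z) / H(z) * (1 + z) * dlna

print(f"comoving sound horizon at decoupling ~ {rs:.0f} Mpc")
```

With these assumed parameters the result lands near the familiar ~150 Mpc comoving scale seen in both the CMB peaks and the galaxy power spectrum.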

And with these "strangely again" consistency games I could play on and on and on... Shall we talk about the abundances of the light elements, BBN and limits from astrophysical observations and abundances in early-type stars? Or the baryon fraction from clusters?

Cosmologists are more open to new ideas than any other type of scientist in the Universe; however, we/they are reluctant to give up a paradigm powerful enough to explain so many phenomena at the same time.

Unfortunately what you are proposing is not a new paradigm; it is just an alternative explanation for one particular "problem", and when asked about consistency with the rest of the cosmological observations this explanation does not tell us anything. So, as you said, one is left with reworking everything, inventing another ad hoc mechanism for each of the remaining observations. My rule of thumb is that before starting this game one should always remember that Occam's razor is a really sharp one, and the thumb can easily get cut.
Tommy Anderberg
Posted: November 25 2005

Pier Stefano Corasaniti wrote:
Cosmology is indeed a matter of consistency, and moving something here implies moving something else elsewhere. This is true for the standard cosmological paradigm, and it is true for any other paradigm you can imagine, the "domain universe" included.


So far so good.

Pier Stefano Corasaniti wrote:
As you suggest, if you want no acceleration from the SNe and a flat universe


As pointed out above, flatness is a conclusion based on an interpretation of the data under a set of assumptions, some of which are invalidated by the presence of domain boundaries.

Pier Stefano Corasaniti wrote:
you need to reinterpret the CMB as a non-cosmic background


Not quite. You need to take into consideration that it kept interacting with domain boundaries long after it's conventionally assumed to have decoupled. So it's cosmological, but post-processed in a way which isn't considered in the concordance model.

Pier Stefano Corasaniti wrote:
Then it is a matter of choosing the simplest "sensical" scenario able to account for the majority of the observations with minimal effort, a sort of application of the usual Occam's razor.


This is incorrect, and a serious misrepresentation of Occam's razor. The history of science is full of beautiful models which could explain *almost* everything, but which nevertheless had to be discarded because of their failure to accommodate a crucial observation. Speaking of the CMB, 19th century physicists had an enormously successful theoretical framework which could explain *almost* everything - but it produced the ultraviolet catastrophe when applied to black-body radiation, ushering in quantum mechanics. Advance to the 1980s and find high energy theorists celebrating the beauty of SU(5) Grand Unified Theory, apparently capable of explaining *all* known physics (except for GR) from a single compact group - only to discard this extremely compelling model when patient observation of giant water tanks failed to turn up the proton decays it predicted. Just one crucial observation was all it took.

In the case at hand, the concordance model of cosmology, Penrose has pointed out that there is a crucial contradiction between it and the standard model of fundamental physics. The latter has been confirmed to an excruciating level of precision over several decades. Cosmology is only now beginning to approach a level where it's possible to make precision tests; the concordance model is called that exactly because speaking of the "standard" model, as you incorrectly do, would imply that it's on a par with the standard model of fundamental physics. It most definitely is not. By Occam's razor, when the two models are in conflict, the one to question first is the concordance model.

[Long list of known properties of the concordance model snipped, since it appears to be presented based only on the misunderstanding that the CMB would cease to be cosmological in the presence of domain boundaries.]

Pier Stefano Corasaniti wrote:
shall we talk about the abundances of the light elements, BBN and limits from astrophysical observations and abundances in early-type stars? Or the baryon fraction from clusters?


Only if you can come up with a good argument as to why they would be affected by the presence of domain boundaries.

Pier Stefano Corasaniti wrote:
Unfortunately what you are proposing is not a new paradigm; it is just an alternative explanation for one particular "problem", and when asked about consistency with the rest of the cosmological observations this explanation does not tell us anything


This is incorrect. On the contrary, it is the concordance model that has a Very Big Problem, at least on the level of proton decay for SU(5) GUT, which needs to be dealt with.

So let's hear it: what's your explanation for the evident lack of readily visible anisotropies in the light reaching us now from sources which haven't been in causal contact since before the electroweak phase transition? What do you do - give up locality and turn all of physics back to the nineteenth century? I don't think so. Give up the standard model of fundamental physics, turning back the clock to the mid-1960s and throwing away all the experimental confirmation accumulated in its favour since - and also every proposed extension of it, including supersymmetry and superstrings, which all involve spontaneous symmetry breaking in the low energy limit? Again, I don't think so.

Or do you accept that there are in fact so many domain boundaries between us and the (conventionally defined) surface of last scattering that their individual effects are merged into the residual luminosity distributions presented in my paper? Compared to giving up locality or all post-1960 fundamental physics, this doesn't seem like such a bad option to me.

Pier Stefano Corasaniti wrote:
so, as you said, one is left with reworking everything, inventing another ad hoc mechanism for each of the remaining observations.


This is again incorrect. I have proposed no "ad hoc mechanism". On the contrary, the domain picture emerges from the combination of the big bang and the standard model of fundamental physics whether we like it or not - it is in fact nothing more than the Kibble mechanism (known for decades, and one of the main reasons why inflation was invented: to dilute away topologically stable GUT-scale defects) minus the topological stability.

Pier Stefano Corasaniti wrote:
My rule of thumb is that before starting this game one should always remember that Occam's razor is a really sharp one, and the thumb can easily get cut.


I fully agree. Watch that thumb of yours. ;)
Pier Stefano Corasaniti
Posted: November 25 2005

Tommy Anderberg wrote:
This is incorrect, and a serious misrepresentation of Occam's razor. The history of science is full of beautiful models which could explain *almost* everything, but which nevertheless had to be discarded because of their failure to accommodate a crucial observation.


I think we have a very different view of what Occam's razor is: it is not about beauty or elegance, but about simplicity - unless you are referring to a different guy named Occam, because otherwise it is your interpretation that is misleading. As you may easily find in the literature, Occam's razor, also known as the principle of parsimony or simplicity, relies on the famous sentence "plurality should not be assumed without necessity", or in other words, "one should not increase, beyond what is necessary, the number of entities required to explain anything".

I grant you that a domain universe which still incorporates a primordial hot dense phase would perhaps not need new mechanisms to explain the rest, such as structure formation or BBN, provided that the underlying assumption is correct. But the Nobel laureate Frank Wilczek has found Penrose's argument on electroweak symmetry breaking to be wrong. I quote from Wilczek's review of Penrose's book in Science (Vol. 307, no. 5711, pp. 852-853):

"There are not alternative directions of electroweak symmetry breaking. And no associated disorder arises at that symmetry-breaking transition, any more than at the analogous transition in ordinary superconductors."

Perhaps I should know why it is so, but due to my ignorance on the subject I give you the benefit of the doubt.
And my thumb is perfectly fine. ;0)
Tommy Anderberg
Posted: November 25 2005

Of course Occam's razor is about "plurality beyond what is *necessary*". Note the keyword here: necessary. Being able to explain almost everything, but failing in one crucial respect, just doesn't cut it.

Pier Stefano Corasaniti wrote:
I think we have a very different view of what Occam's razor is


If you really believe that Occam's razor is about choosing an arbitrary balance between model complexity and number of inconvenient facts to sweep under the rug, then yes, we have a very different view of what Occam's razor is.

Pier Stefano Corasaniti wrote:
The Nobel laureate Frank Wilczek has found Penrose's argument on electroweak symmetry breaking to be wrong. I quote from Wilczek's review of Penrose's book in Science


We've all seen that, but with all due respect, I'm now seriously beginning to wonder whether you have even downloaded my paper. In it, I do point out that Penrose's argument *as it stands* in the book appears to contain an error (I say "appears" because there is actually room for more or less benevolent interpretation). So I do not use it (nor would it be possible to, as it is not formulated in a technical way - you know, the kind which lets you extract numbers). I provide my own argument, while giving credit where credit is due.

Perhaps you should read the paper before responding to it further? Just a thought.
Tommy Anderberg
Posted: December 27 2005

It's been much too quiet here lately, so emerging from holiday-induced stupor, here's my personal Top 3 from December's crop of (maybe) relevant astro-ph papers. In ascending order, they are:

#3: "An inhomogeneous alternative to dark energy?": http://arxiv.org/abs/astro-ph/0512006
This one I find interesting primarily because it illustrates the need to rerun the entire analysis / parameter fitting when you change something big in your cosmology (what I mean by "rework everything"). Putting us near the center of a very special (underdense) region of the universe doesn't seem particularly natural, but a similar conclusion happens to be reached in...

#2: "Anisotropy of z <= 6 Redshifts": http://arxiv.org/abs/astro-ph/0512276
...where the author tries to do something which I would naively expect to be near-impossible, due to sample size and systematics: analyze the 3D distribution of quasars and look for anisotropies (which I'd expect to be caused by electroweak domain boundaries). Statistically significant anisotropies are claimed, putting us roughly 50 Mpc from the center of a "Metagalaxy". Comments from observational experts (perhaps in a separate thread?) about the viability of such studies would be very interesting!

But the top position goes to:

#1: "What Do We Really Know About Cosmic Acceleration?": http://arxiv.org/abs/astro-ph/0512586
...where Shapiro & Turner perform an analysis of the SNe Ia data (the usual gold set by Riess et al.) which they claim to be "model-independent". Of course, it's really no such thing, as Hubble acceleration is postulated to cause the dimming (hellooo?), but there are redeeming features, like the statement that "the present SNeIa data cannot rule out the possibility that the Universe has actually been decelerating for the past 3 Gyr (i.e., since z = 0.3)". Ah yes, z = 0.3. ;) I also like the principal component analysis displayed in Fig. 6. Is that a variable acceleration - or domain boundary positions?

Finally, an honorable mention (out of competition) to blogging stringer Lubos Motl, who manages to work the CMB (along with E=mc2 and climate models) into a discussion about the perils of doing science by parameter fitting:

http://motls.blogspot.com/2005/12/emc2-test-interplay-between-theory-and.html

Well worth a read (no names mentioned, none forgotten).
Tommy Anderberg
Posted: March 11 2006

Major update posted. It adds a brand new section on vacuum realignment, scattered clarifying (?) remarks and more references. If you disliked the first version, you'll hate this one. Get your own copy while they last. ;)
Bruce Bassett
Joined: 24 Nov 2004 | Posts: 26 | Affiliation: SAAO/UCT
Posted: March 30 2006

Hi,

I have no comment on this specific paper other than to note that essentially all models which "explain" the dimming of SNIa via loss of photons (dust, axion-photon mixing, etc.) are ruled out because, if they were correct, they would induce a major asymmetry between luminosity distance and angular diameter distance measurements, which is just not there. See e.g. astro-ph/0312443 and astro-ph/0311495.

With the recent SDSS LRG measurement of dA the constraints are even tighter now. While there *may* be room for losing a few percent of the photons, there isn't enough room to explain away the acceleration of the cosmos, even if you ignored all the other evidence. With WFMOS and future SNIa probes we will get very tight constraints on any loss of photons and hence on these exotic physics models.

Cheers,
Bruce
Tommy Anderberg
Posted: March 30 2006

Bruce Bassett wrote:
essentially all models which "explain" the dimming of SNIa via loss of photons (dust, axion-photon mixing, etc.) are ruled out because, if they were correct, they would induce a major asymmetry between luminosity distance and angular diameter distance measurements, which is just not there. See e.g. astro-ph/0312443 and astro-ph/0311495.


Wow, after all this time, somebody finally throws me a meaty bone!

Thanks, it's very interesting reading. But (you knew there was a but coming, didn't you?) I see two problems, if that's the word:

1) Your analyses and fits all seem to be about continuous dimming, be it caused by grey dust or axion mixing or what have you. What I say comes out of the standard model (not exotic extensions) is discrete: you have photon loss at z = 0.3, then at z = 1, and so on. In between, nothing happens, and your favorite distance relation holds just fine.

2) Fig. 1 in astro-ph/0311495 is based on a comparison between SNe Ia and "radio galaxy data". I can't see any information on wavelengths, but assuming that you're using radio waves, they are many orders of magnitude below the expected energy threshold at which domain boundaries become effectively transparent to photons (see the third paragraph of my Discussion section). So I would actually worry if your graph did NOT show a difference between them and the SNe Ia data.

:)
Bruce Bassett
Posted: April 01 2006

Hi,

In your abstract you state:

"...making Hubble acceleration only apparent and eliminating the need for a cosmological constant > 0"

If this were correct, then dA (angular diameter distance) measurements would indicate that we live in a non-accelerating universe, since your effect (like the other similar ideas) has no effect on dA(z). Unfortunately the dA data actually prefer slightly more acceleration than the SNIa do, meaning the acceleration is robust. Since our initial analysis the SDSS BAO dA measurement has come out, consistent with the dA data we used and leading to the same conclusion. It is a shame, because the idea that the cosmos only *looks* like it is accelerating, but actually isn't, is a nice one.
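The test being invoked here is the Etherington distance-duality relation, dL = (1+z)^2 dA, which holds for photon-conserving propagation. A minimal numerical sketch (my own illustration, with an assumed loss fraction): absorbing photons inflates the inferred luminosity distance while leaving dA untouched, which is exactly the asymmetry the data constrain.

```python
import math

def duality_violation(loss_fraction):
    """
    Etherington relation: d_L = (1+z)^2 d_A when photon number is
    conserved.  If a fraction 'loss_fraction' of photons is absorbed
    en route, the observed flux drops by (1 - loss_fraction), so the
    inferred d_L is inflated by 1/sqrt(1 - loss_fraction) while d_A
    is unchanged; eta = d_L / ((1+z)^2 d_A) then deviates from 1.
    Returns (eta, extra distance modulus in magnitudes).
    """
    eta = 1.0 / math.sqrt(1.0 - loss_fraction)
    extra_mag = -2.5 * math.log10(1.0 - loss_fraction)
    return eta, extra_mag

# Faking Omega_Lambda ~ 0.7 requires a few tenths of a magnitude of
# dimming near z ~ 0.5, i.e. tens of percent photon loss (assumed
# values below are illustrative):
for loss in (0.05, 0.15, 0.30):
    eta, dmu = duality_violation(loss)
    print(f"loss = {loss:.2f}: eta = {eta:.3f}, extra dimming = {dmu:.3f} mag")
```

A 30% photon loss, for instance, shifts eta by about 20 percent, far larger than the agreement between the dL and dA data allows.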

No doubt this will not be convincing to you and you will want to carry on the good fight. Good luck!

b

ps. Have you seen hep-ph/0507020? It also uses domain walls, and it also doesn't work!
Page 1 of 2