Wednesday, March 25, 2015

No, the LHC will not make contact with parallel universes

Evidence for rainbow gravity by butterfly
production at the LHC.

The most recent news about quantum gravity phenomenology going through the press is that the LHC, upon restart at higher energies, will make contact with parallel universes, excuse me, with PARALLEL UNIVERSES. The Telegraph even wants you to believe that this would disprove the Big Bang, and tomorrow maybe it will cause global warming, cure Alzheimer's, and lead to the production of butterflies at the LHC, who knows. This story is so obviously nonsense that I thought it would be unnecessary to comment on it, but I have underestimated the willingness of news outlets to promote shallow science, and also the willingness of authors to feed that fire.

This story is based on the paper:
    Absence of Black Holes at LHC due to Gravity's Rainbow
    Ahmed Farag Ali, Mir Faizal, Mohammed M. Khalil
    arXiv:1410.4765 [hep-th]
    Phys.Lett. B743 (2015) 295
which just got published in PLB. Let me tell you right away that this paper would not have passed my desk. I'd have returned it with a request for major revisions.

Here is a summary of what they have done. In models with large additional dimensions, the Planck scale, where effects of quantum gravity become important, can be lowered to energies accessible at colliders. This is an old story that was big 15 years ago or so, and I wrote my PhD thesis on this. In the new paper they use a modification of general relativity that is called "rainbow gravity" and revisit the story in this framework.
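For orientation, the relation between the observed Planck mass and the true higher-dimensional gravity scale in large-extra-dimension (ADD-type) models can be sketched as follows. This is the generic textbook form, not the paper's notation, and conventions differ by numerical factors:

```latex
% With d extra dimensions compactified on a torus of common radius R,
% the effective 4-dimensional Planck mass M_{\rm Pl} is related to the
% fundamental higher-dimensional scale M_* by
M_{\rm Pl}^2 \;\sim\; M_*^{\,d+2}\, R^{\,d} .
% For M_* of order a TeV, a sufficiently large R reproduces the huge
% observed M_{\rm Pl}, so quantum gravity effects could set in at
% collider energies rather than at 10^{19} GeV.
```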

In rainbow gravity the metric is energy-dependent, which it normally is not. This energy-dependence is a non-standard modification that is not confirmed by any evidence. It is neither a theory nor a model; it is just an idea that, despite more than a decade of work, has never developed into a proper model. Rainbow gravity has not been shown to be compatible with the standard model. There is no known quantization of this approach, and one cannot describe interactions in this framework at all. Moreover, it is known to lead to non-localities which are already ruled out. As far as I am concerned, no papers should get published on the topic until these issues have been resolved.
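For readers unfamiliar with the setup, a typical rainbow-gravity ansatz (of the Magueijo–Smolin type; the specific functions the authors use may differ) deforms the dispersion relation with energy-dependent functions, which is equivalent to making the metric energy-dependent:

```latex
% Deformed dispersion relation in natural units (c = 1), with E_p the
% Planck energy and f, g model functions with f, g -> 1 for E << E_p:
E^2\, f^2(E/E_p) \;-\; p^2\, g^2(E/E_p) \;=\; m^2 .
% Equivalently, the metric is promoted to an energy-dependent family
% g_{\mu\nu}(E), so particles of different energies probe different
% geometries. Nothing fixes f and g; this freedom is part of why the
% approach never developed into a proper model.
```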

Rainbow gravity enjoys some popularity because it leads to Planck scale effects that can affect the propagation of particles, which could potentially be observable. Alas, no such effects have been found, and that is with the Planck scale at its normal value! The absolutely last thing you want to do at this point is argue that rainbow gravity should be combined with large extra dimensions, because then its effects would get stronger and would probably already be ruled out. At the very least you would have to revisit all existing constraints on modified dispersion relations, reaction thresholds, and so on. This isn't even mentioned in the paper.

That isn't all there is to say though. In their paper, the authors also unashamedly claim that such a modification has been predicted by Loop Quantum Gravity, and that it is a natural incorporation of effects found in string theory. Both of these statements are manifestly wrong. Modifications like this have been motivated by, but never derived from, Loop Quantum Gravity. And string theory gives rise to some kind of minimal length, yes, but certainly not to rainbow gravity; in fact, the expression of the minimal length relation in string theory is known to be incompatible with the one the authors use. The claims that the model they use has some kind of derivation, or even a semi-plausible motivation, from other theories are just marketing. If I had been a referee of this paper, I would have requested that all these wrong claims be scrapped.

In the rest of the paper, the authors then reconsider the emission rate of black holes in extra dimensions with the energy-dependent metric.

They erroneously state that the temperature diverges when the mass goes to zero, leading to a "catastrophic evaporation". This has been known to be wrong for 20 years. The supposed catastrophic evaporation is due to an incorrect thermodynamical treatment; see for example section 3.1 of this paper. You do not need quantum gravitational effects to avoid it, you just have to get thermodynamics right. Another reason not to publish the paper. To be fair though, this point is pretty irrelevant for the rest of the authors' calculation.
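To see where the claimed divergence comes from, recall the familiar four-dimensional Hawking temperature; the higher-dimensional version differs in the power of the mass but shows the same small-mass behavior:

```latex
% Hawking temperature of a 4d Schwarzschild black hole of mass M:
T_H \;=\; \frac{\hbar c^3}{8\pi G k_B M} .
% Taken at face value, T_H diverges as M -> 0, the "catastrophic
% evaporation" of the paper. A proper microcanonical treatment that
% enforces energy conservation in each emission cuts this off,
% without invoking any quantum gravitational effects.
```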

They then argue that rainbow gravity leads to black hole remnants because the temperature of the black hole decreases towards the Planck scale. This isn't so surprising and is something that happens generically in models with modifications at the Planck scale, because they can bring down the final emission rate so that it converges and eventually stops.

The authors then further claim that the modification from rainbow gravity affects the cross-section for black hole production, which is probably correct, or at least not wrong. They then take constraints on the lowered Planck scale from existing searches for gravitons (i.e. missing energy), which should also be produced in this case. They use the constraints obtained from the graviton limits to argue that black hole production should not yet have been seen, but might appear in the upcoming LHC runs. They should of course not have used constraints from a paper that were obtained in a scenario without the rainbow gravity modification, because the production of gravitons would likewise be modified.

Having said all that, the conclusion that they come to that rainbow gravity may lead to black hole remnants and make it more difficult to produce black holes is probably right, but it is nothing new. The reason is that these types of models lead to a generalized uncertainty principle, and all these calculations have been done before in this context. As the authors nicely point out, I wrote a paper already in 2004 saying that black hole production at the LHC should be suppressed if one takes into account that the Planck length acts as a minimal length.
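The connection to the generalized uncertainty principle can be sketched as follows, with α a dimensionless model parameter. This is the generic form found in the literature, not necessarily the exact expression used in either paper:

```latex
% Generalized uncertainty principle with a Planck-scale correction
% (l_p is the Planck length, \alpha a dimensionless parameter):
\Delta x \;\gtrsim\; \frac{\hbar}{\Delta p} \;+\; \alpha\, \ell_p^2\, \frac{\Delta p}{\hbar} .
% Minimizing the right-hand side over \Delta p gives a minimal length
\Delta x_{\min} \;\sim\; 2\sqrt{\alpha}\, \ell_p ,
% so one cannot localize energy below roughly the Planck length.
% Since horizon formation requires concentrating energy within its
% Schwarzschild radius, this raises the effective threshold for black
% hole production, which is the common origin of the similar conclusions.
```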

Yes, in my youth I worked on black hole production at the LHC. I gracefully got out of this when it became obvious there wouldn't be black holes at the LHC, some time in 2005. And my paper, I should add, doesn't work with rainbow gravity but with a Lorentz-invariant high-energy deformation that only becomes relevant in the collision region and thus does not affect the propagation of free particles. In other words, in contrast to the model that the authors use, my model is not already ruled out by astrophysical constraints. The relevant aspects of the argument however are quite similar, thus the similar conclusions: If you take into account Planck length effects, it becomes more difficult to squeeze matter together to form a black hole because the additional space-time distortion acts against your efforts. This means you need to invest more energy than you thought to get particles close enough to collapse and form a horizon.

What does any of this have to do with parallel universes? Nothing, really, except that one of the authors, Mir Faizal, told some journalist there is a connection. In the piece one can read:
"Normally, when people think of the multiverse, they think of the many-worlds interpretation of quantum mechanics, where every possibility is actualized," Faizal told the journalist. "This cannot be tested and so it is philosophy and not science. This is not what we mean by parallel universes. What we mean is real universes in extra dimensions. As gravity can flow out of our universe into the extra dimensions, such a model can be tested by the detection of mini black holes at the LHC. We have calculated the energy at which we expect to detect these mini black holes in gravity's rainbow [a new theory]. If we do detect mini black holes at this energy, then we will know that both gravity's rainbow and extra dimensions are correct."
To begin with, rainbow gravity is neither new nor a theory, but that addition seems to be the journalist's fault. As far as the parallel universes are concerned, to get these in extra dimensions you would need additional branes next to our own, and there is nothing like this in the paper. What this has to do with the multiverse I don't know; that's an entirely different story. Maybe this quote was taken out of context.

Why does the media hype this nonsense? Three reasons I can think of. First, the next LHC startup is near and they're looking for a hook to get the story across. Black holes and parallel universes sound good, regardless of whether this has anything to do with reality. Second, the paper shamelessly overstates the relevance of the investigation, makes claims that are manifestly wrong, and fails to point out the miserable state that the framework they use is in. Third, the authors willingly feed the hype in the press.

Did the topic of rainbow gravity and the author's name, Mir Faizal, sound familiar? That's because I wrote about both only a month ago, when the press was hyping another nonsense story about black holes in rainbow gravity with the same author. In that previous paper they claimed that black holes in rainbow gravity don't have a horizon, and nothing was mentioned about them forming remnants. I don't see how these two supposed consequences of rainbow gravity are even compatible with each other. If anything, this just reinforces my impression that this isn't physics, it's just fanciful interpretation of algebraic manipulations that have no relation to reality whatsoever.

In summary: The authors work in a framework that combines rainbow gravity with a lowered Planck scale, which is already ruled out. They derive bounds on black hole production using an existing data analysis that does not apply in the framework they use. The main conclusion that Planck length effects should suppress black hole production at the LHC is correct, but this has been known for at least 10 years. None of this has anything to do with parallel universes.


Phillip Helbig said...

While it's good to read stuff like this in your blog, you really should write up your criticism and submit it to the same journal in which the paper you criticize appears. Unless more people do this, the quality of journals will decline.

Sabine Hossenfelder said...


This would be a complete waste of time. See, this paper is one of hundreds of papers that have been published on this and similar nonsense, and it is admittedly not even a particularly bad one. Most of the papers on the topic are far worse than that. I have already tried to address these problems by publishing this paper, which explicitly rules out all models that give rise to a modified dispersion relation of the same type that the authors use. But look, it doesn't make any difference. The model is ruled out - you'd think that's the end of the story. But that isn't how science works these days. People continue to publish papers on this by just ignoring it. They don't even claim there is something wrong with my argument, they just ignore it and continue with their nonsense.

I have wasted enough time on this. There is something really, really wrong with theoretical physics, and this is only one indication of it.



Phillip Helbig said...

"The model is ruled out - you'd think that's the end of the story. But that isn't how science works these days. People continue to publish papers on this by just ignoring it. They don't even claim there is something wrong with my argument, they just ignore it and continue with their nonsense."

I can sympathize to some extent. :-|

I think that when one writes a paper, one should at least have an idea of other work in the field and even if one disagrees with it, at least acknowledge it and say why one disagrees. Ignoring it is inexcusable. I don't mean missing an obscure reference, but rather indicating that one hardly follows the literature at all.

Sabine Hossenfelder said...


Yes, indeed, I am rather willing to forgive people who want to work on this, provided that they accurately point out the problems with it. I mean, possibly I am wrong, and maybe there is a way to circumvent my conclusion, but ignoring it is not a way to progress.

It's one thing if they themselves want to waste time on this. But it's another thing if they cause others to waste time (and money) by not telling them the whole story. See, I have had to talk to students who work on related things (not exactly the same thing) and who were never told that there are any problems with this idea. Even though I know for sure their supervisor knows. Even though I have published half a dozen comments and papers explicitly explaining what the problems are. Honestly, it's things like this that make me want to leave academia. This just isn't research any more, this is only sales strategies and networking. Best,


Phillip Helbig said...

"Honestly, it's things like this that make me want to leave academia. This just isn't research any more this is only sales strategies and networking."

When I am rich, I can hire you and you can work on what you want, how you want, with no financial worries! :-)

Sabine Hossenfelder said...

Oh, thank you, that is good to know. Now I just have to wait until your billionaire uncle dies and all will be good ;)

N said...

I'll buy you a small LHC!

Plato Hagel said...

You calculate the amount of energy used in a collision process, and then you count how many particulate expressions give you that energy. If this does not equal the original value, then where did that extra energy go?

The authors of the LHC article are postulating no need for a big bang, but regarding the universe as microscopic examination of where that extra energy is going.

If we consider the existing state of the universe as speeding up, then what state would we assume such energy could have been driven off into those extra dimensions so as to explain the state of the universe. This would be self contained as requiring no before, but always the idea that this current state is reflect in the black hole examination?

An increase or decrease in the microscopic/cosmological expression tied to the state of the universe? Hmmmmm.... this might be an interesting paper, or its already been done?


David Brown said...

"In models with large additional dimensions, the Planck scale, where effects of quantum gravity become important, can be lowered to energies accessible to colliders." String theory with the infinite nature hypothesis gives infinitely many free parameters for manipulation. In the simplest model of string theory with the finite nature hypothesis, there might be three decisive empirical tests. The Gravity Probe B science team says I am wrong — but I say the team members misinterpreted their own experiment. Google "witten milgrom" for more information.

L. Edgar Otto said...

I admire the testimony of your initial choice for research - it says a lot for foreseeing likely areas to explore and grasping or surveying the state of art of the projects. I like the multidimensional idea at a time when it was not the mainstream of speculations - that at ground things are ultimately random, chaotic, and inaccessible. So a lot depends on our initial, shall I say fortune or intuitions.
On the other hand some ideas seem compelling to some, such as a young man in the Milwaukee newspaper who threatened to jump off a bridge to make his theory listened to - that in higher dimensions things like gravity may have greater measures of energy which seems right (so the academics said) but on the face of it was nothing new.
Even when we need a sensible 'No' to some claim or perfection of a model as to what is allowed or ruled out - this shows much wider wisdom.
Detractors do take away from those needing time to think on original and fundamental knowledge (the true PhD) but we do not work in a vacuum. Is it the fault of journalists, academia, economics - who knows? The mediocre can make the same claims as to their positions and call it equal justice. Any errors in the presentation are not necessarily the fault of our authors. Just saying, your contemplations are doing fine between what is balanced for all us as humans. Best.

johnduffield said...

Sabine: good blog. Various newspapers have picked up on this sensationalist nonsensical PhysOrg story about this trash paper. What really irritates me is that readers might say hang on a minute, if the LHC produced a black hole, it could eat the Earth. Then they'll be clamouring for an end to physics experimentation, thinking ill of physics, and pressing for funding cuts.

Uncle Al said...

Local gravitation dilutes via diffusion into multi-universes (containing weaker gravitation, or excesses diffuse back. No contradiction!) Defective postulates exclude corrective experiment contradicting "necessary" theory. Sound work (colonial boor Rutherford) became published mountains (Djerassi, ~1200 publications), then bulging h-indices. Soon, table of contents music videos. Sciences' pander grant funding with scurrilous advertising.

Vacuum is observably not exactly mirror-symmetric toward matter. Fundamentally test exact vacuum symmetry toward matter with small emergent scale (9-atom unit cell) geometric chirality, DOI: 10.5281/zenodo.15107,

Remove vast impactions of theoretic irrelevance by looking at reality whispering outside assumptions. Mercury's orbit was not anomalous, it was diagnostic.

Sabine Hossenfelder said...


These models are actually quite simple and don't have a lot of parameters. There is, basically, the number of extra dimensions and their radius. The default assumption is that they are all compactified on the same radius (d-dimensional torus). If you add to this the generalized uncertainty or rainbow gravity, you bring in another two parameters which are essentially a dimensionless factor in front of the first correction term, and the power of that term in a series expansion. So that makes 4 according to my counting.

You are right of course in that you can make these models infinitely complicated if you want to, and there exist not quite infinitely many but certainly hundreds of variants of this theme, none of which, needless to say, has any observation in support of it. You can compactify each dimension on a different radius, or you can compactify on other geometries, or you can add other branes, have them intersect, or give the branes a finite width, or add potentials, or twist these dimensions, or mash up the whole thing with another model, etc. It is a big bubble of nothing and, needless to say, many of my colleagues dislike me for saying that. Though the extra dimension bubble pretty much burst 5 years ago with the first LHC run - this paper is a latecomer. So there is some sanity left in the community ;)

I know this sounds somewhat dismissive, so let me be clear on one thing. The general idea in either case, extra dimensions and generalized uncertainty, is nice and totally worth studying. There are some very interesting papers at the bottom of this paper pile - the problem isn't with these ideas, but with the hundreds of people who then follow and try to jump on board with little thought and little insight. These things blow up for the one and only reason that they get published and then get quoted by other people, and so on, which makes more people think there must be something to it, etc. This isn't a problem that is specific to this research area, it's a far more general issue.



Sabine Hossenfelder said...

N: I'm cheap, I'll settle for pen and paper.

John C said...

One has to question the peer review process of the journals if so much objectionable material is being passed through - based on this response, it would appear the referees of this journal were lazy or incompetent or both.

Sophie said...

Thanks for posting this Sabine. The authors and their paper definitely don't do much for the reputation of peer-review and specialised journal editors. I guess writing a criticism paper is like trying to plug a hole in a colander to stop the problems leaking out - something's wrong systemically.

MarkusM said...

Where have these nice comments thumbs gone ???
Me, I'm a bit lazy writing all the time :-)

nicolas poupart said...

Sabine, the hypothesis of the existence of a minimum length is quite strange in light of your last article about "space-time-foam". Thus, the minimum length is not a consequence of a discretized space but a limitation on the existence of a length in a continuum. Unless the minimum length is a property of the field that has nothing to do with the space, but only with the field ; the minimal spatial extension of a field.

L. Edgar Otto said...

nicolas poupart

minimum distance (and minimum duration, that "length" divided by c) was proposed years ago as a possible way out of unification problems.

Our ideas, mathematically, on what a continuum is have to be much more general than that. It is hard to compute expressions that rapidly rise to very large numbers in combination theory, as in rooted and unrooted trees beyond 5. Partition theory is not well understood, say, in string models. Can we have Graeco-Latin squares at ten and greater? Evidently we can; in this sequence 2 6 10 14 ... does that ring a bell in the electron shell configuration?

Considering there is some evidence a single photon can entangle many atoms both organically and inorganically is the physics not on a much higher, perhaps stranger level? Does what happens in a nucleus rationally connect in multiplicity to the electrons (as a spooky distance or not)? We should be asking, not just by experimental discovery, is there some sort of restraint to how many atoms a single photon may entangle and if limited can we go around this to deeper physics.

It is ok to highlight a problem then make a sober judgement, that too is science, but the limitations on what is distance or some more general idea of foam or more unified connections is likely a limitation of our concepts rather than what seems to be that physical in nature. Otherwise we are trapped in problems of our own making, thus no solutions or answers and nothing new offered (if solutions are possible and we do not yet know in the vast unknowns of it that we still face).

So as a working hypothesis or method we should not despair at an era of great speculation some say has discredited science (in the public's eye) rather we should see it as a new birth of an era of inquiry- and not all that has been achieved will vanish if seen in its evolving place of those who make contributions.

nicolas poupart said...

Edgar, it is ironic that the computer scientist is attached to the original sense of the time and length concepts while the physicist just sees computable functions. If only computability remains, it may be necessary to remove the old concepts dating from the eighteenth century. The fact that minimum length is derived from minimum time is interesting because it is the only concept in common with computability; a minimum discrete unit for the existence of a deterministic change (a calculation step or, more formally, a change of state in a formalized system or deterministic automaton).

regretacles said...

Can you imagine working for a tenured principal investigator produced by such holes in the peer review process? And what if they don't allow you to exit gracefully once you realize what's going on? Such a PI could come to leave a lengthy trail of dead bodies.

Neil Bates said...

Well, perhaps the attempt will end up turning the whole universe into, of course, Swiss cheese?

Count Iblis said...

Perhaps they need to make your old friend Lubos Motl an editor of PLB :)

Zephir said...

The "mini-black holes" are detected at the LHC routinely in the form of normal particles, and the whole discussion above is nonsensical.

The physicists search for black holes and ignore common neutron stars, they search for gravitons and gravity waves and they ignore the CMBR, they search for extradimensions and ignore all low-distance forces, etc. It has basis in misunderstanding of quantum gravity role in contemporary physics - it's not theory of extreme mass and energy density scales, but the scales BETWEEN quantum mechanics and general relativity.

Henning said...

Reading this made me finish up my (first) post on how uncomfortable I am with the peer review process. Science is supposed to be a self-correcting endeavour. In many instances this process seems to have broken down.

In the medical field (as with the now discredited link between autism and vaccinations) we sometimes see correction coming from the fifth estate, but theoretical physics due to the complexity of the subject matter does not benefit from this kind of scrutiny.

Henning said...

MarkusM, I can see the thumbs in Chrome but not Firefox. Since blogger is owned by Google it isn't too surprising to see some features only work with their own browser (shades of IE/MS - different company same MO).

Sabine Hossenfelder said...

Markus, Henning: The thumbs are a third-party script which I only added very recently to somehow improve the imo outdated blogger comment features. I hadn't checked it with other browsers, sorry.

MarkusM said...

Sabine, Henning,
thanx. None of my browsers shows the thumbs, so I guess the problem is on my side. (I'm still running Windows Vista, maybe that's the problem).

Oh, by the way, yesterday I saw an interesting lecture by Leonard Susskind.
He says (01:35-) that every particle is a black hole. If so, we are producing BHs at accelerators all the time. No reason to worry.

carnivorous_mushroom said...

Physics seems to be in a bad state:

Peer review process not to be trusted.
Nature of reality insufficiently understood (missing mass problem).
Majority of physicists derive patently absurd beliefs from their models (Everett many worlds interpretation).

Or this:
time travelling bird sabotages LHC by means of reverse chronological causation.

In view of this, how can the public have much confidence in the pronouncements of experts that there is nothing to worry about when conditions are created "that have not been seen since the Big Bang", since these same processes "happen all the time even here on earth when high energy cosmic rays strike the upper atmosphere"?

nicolas poupart said...

On the question of whether the proton is a black hole with the Planck length in the background, read "Quantum Gravity and the Holographic Mass" by Nassim Haramein. This peer-reviewed article is fairly simple, one that even I can understand. On the other hand, regarding Haramein himself, who is a kind of new age guru whose findings would supposedly be transmitted by extraterrestrial knowledge, it poses serious questions about cognitive mechanisms. It is much stranger than Dr. Hossenfelder on a "bad hair day".

L. Edgar Otto said...

nicholas poupart,

thank you for your learned reply to my post and pointing out the irony or paradoxes of computation.

Nassim has his place as a low level new age sacred geometer on a level that such gurus (detectors) promote it as a unified theory... it also full of ironies.

Let us keep in mind he relies a lot on Fuller's conception of four space, he did not explain what Fuller took back (Coxeter the guru of Fuller as a guru.) which was that a sphere was equal in volume to five tetrahedra.

So I imagine that this will not promote progress in the more foundational questions and philosophic questions of modern physics (like the existence of elements beyond say 96). Or any question of gravity relating to curvature models.

If I am permitted a lapse of reductionism as cosmology an informal hobby I see no reason at all we cannot develop something like superluminal space travel by existing physics- that is I cannot dismiss the possibility formally although anyone is free to say I have lost it. But is so hard for me not to be influenced by desires from advertising short of being off grid.

It would take a new class of electronic devices, call them abductors which speculates the essential role of dark energy concepts resolved with path superdeterminism (time-like). This conclusion is a surprise... but it seems clear to me that certain stances of speculation will delay progress in what is likely in future science.

Then again I might be just having another rare bald hair day :-)

Plato Hagel said...


In regard to Susskind......we all knew that right? :)

MarkusM said...

Yes (many things are repeatedly said here, sorry). But do you understand it? I don't, and I still think that there's a lot of handwaving there. I knew already that 't Hooft is advocating this view, but now also Susskind. Must be something to it, right?
I think I know pretty well what an elementary particle is (representations, roots/weights, etc.), but how these concepts should scale up to a (quantum) black hole, I have no clear idea.
If you know of any good references in this respect, please let me know.

ever heard of Baez's crackpot index ?

Sabine Hossenfelder said...

What he's actually saying is that there might be no clear division between black holes and particles, and I guess we can all agree on this. These aren't black holes like the astrophysical ones though that people normally talk about. Look, it is easy to see that they would be much smaller than what we know the typical extension of elementary particles to be (say, the Compton wavelength, something like this). What he is really hitting on is the wormhole-connection thing. Please avoid just taking a sentence here or there out of context.

MarkusM said...

"What he's actually saying is that there might be no clear division between black holes and particles, and I guess we can all agree on this."
Sure, because we all don't know. Or do we ?

"These aren't black holes like the astophysical ones ... the wormhole-connection thing."

"Please avoid just taking a sentence here or there out of context."
Shame on me :-)


Sabine Hossenfelder said...


Well, what's a particle? Start right there. You'll see what I mean. Best,


johnduffield said...

Particles are not black holes.

MarkusM said...

very interesting question, indeed.
In this context I would say, "something" that you can get from the vacuum of a quantized field. (By applying a†, you know). Is this what you had in mind ?
But if I apply this to the gravitational field, I would (naively) expect to get gravitons (the related field quanta), but black holes ?

Zephir said...

/*Particles are not black holes*/

Not in the general relativity sense. But this sense is not physically relevant anyway, being heavily violated by quantum mechanics. The real black holes aren't pinpoint singularities; they have volume in the same way as the alleged mini black holes predicted by hyperdimensional gravity models.

Uncle Al said...

"Well, what's a particle?"

Quarks or leptons, all matter fundamentally has quantum spin. Outliers are carefully diminished: Einstein-Cartan, Weitzenböck, Ashtekar-Immirzi separation; parity violations, symmetry breakings, chiral anomalies, baryogenesis, biological homochirality, Chern-Simons repair of Einstein-Hilbert action; dark matter.

Empirical failure is countered by Official disbelief (e.g., The Nonesuch, 1974, Larry Niven). Experiments consistent with postulates confirm postulates, repairing nothing. twist and shout, on a bench top. The worst it can do is succeed.

Ask reality what it is. Religion dictates.

Phillip Helbig said...

"MarkusM, I can see the thumbs in Chrome but not Firefox. Since blogger is owned by Google it isn't too surprising to see some features only work with their own browser (shades of IE/MS - different company same MO)."

Yes, one should be on the lookout for this strategy, but I don't think that's the problem here. The thumbs work for me in Firefox.

nicolas poupart said...


For me, the question is not whether a particle is a black hole or not, the question is to understand how the entanglement allows the realization of an effective calculation requiring an exponential amount of information compared to the machine size.

Take as axioms:

A) The problem of integer factorization is not solvable in polynomial time on a TM (Turing machine).
B) The universe is Turing-equivalent both in absolute complexity (what is computable) and in relative complexity (what is computable in polynomial time).

Then take as theorems:

C) It is known that it is possible to maintain a large entangled system, such as spin entanglement in a crystal. On the other hand, there the total amount of information decreases, because each atom has the same spin.


D) By (A) and (B), the quantum computer is a fiction; therefore it is impossible to maintain the entanglement necessary for the execution of Shor's algorithm.

E) By (C) and (D), the entanglement necessary for running Shor's algorithm cannot be built, because the amount of information stored would be exponential in the size of the system. Therefore a prohibitive amount of energy would be needed to maintain the entanglement.

F) By (E), the amount of energy required is a function of the amount of information stored in the system, and therefore produces an increase in its mass. So there is a new functional relationship between the energy of a system (E), its mass (M), and its information (I).

G) Approximating mass as a quantity of matter then forces linear additivity between the mass M, the energy E stored in matter, and the information I. Therefore E = Mc^2 = KI, where K is a fundamental constant, perhaps the circular Planck surface.

Obviously, good luck to prove A or B.
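[To make axiom A concrete: the naive classical factoring algorithm, trial division, performs on the order of √n divisions, which is exponential in the bit length of n. This illustrates what A asserts, but of course proves nothing: whether any polynomial-time classical factoring algorithm exists remains open. A minimal sketch:]

```python
def trial_division(n):
    """Factor n by trial division.

    Performs O(sqrt(n)) trial divisions, i.e. time exponential
    in the number of bits of n -- the behavior axiom A asserts
    cannot be improved to polynomial on a Turing machine.
    """
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:  # leftover prime factor
        factors.append(n)
    return factors
```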

Plato Hagel said...

Hi MarkusM,

In the natural setting we want to see what is happening with particle spallations, so these can be determined using energy values.

The original collision, as a value set, is counted with regard to how the LHC uses, and can calculate with, those same energy values. The microscopic is specific, and dissipative, as to counting those spallations as particulate expressions.


This initial contact in space has consequences for our backdrop measures.

I respect John Baez's efforts to categorize, but that index does a disservice to those who actually want to understand the process.


L. Edgar Otto said...

Plato Hagel,

The problem of defining a particle seems to me as old as Plato, or his Buddhist equivalent (a scientific reaction to Hindu polytheism), Narajala. So it is not clear (given the diversity of Buddhist models of the universe) whether we are talking here about philosophy, religion, or physics. Many-worlds or multiverse ideas just raise the same ancient question.

If there is one sphere and it touches another sphere, so Narajala argues, how would it know where it is touching if the sphere were not composed of distinct sub-structure, if it were instead an irreducible uniform object? A monad or atom, so to speak?

nicolas, we suspect that if there were a solution to A) or B), or to other such NP-hard considerations, it would be a universal solution. Can some things be solved over an infinity in finite time? But who are we to treat this merely as algebra with which to characterize nature (or those who think about it), if the concepts of algebra can be mirrored and so turned on their head, even if there is no limit to such edifices of ideas of complexity?

We have taken a very long time to reinvent these wheels. Let us not continue the long debate by repeating the same old data, but show where and whether it is new, as well as whether it is healthy science.

If students or inquirers do not want to hear such things, or cannot even clearly recognize the problem, is it not a sensible stance that they could not even see the problem as to the state of science? Let us ask what is, and what is to be done... same day, different stuff; same stuff, different day... Yet we live in interesting times.

L. Edgar Otto said...
This comment has been removed by the author.
nicolas poupart said...

@ L. Edgar Otto

The question of the intrinsic value of an axiom, apart from considerations of completeness and consistency, is not one logicians address. On the other hand, the idea that the universe could escape the constraints of logic is questionable if we consider logic as purely transcendental, that is to say, as constraining not only this universe but all possible universes.

If I adopt the view of the universality of physics, mathematics (computability) would have to rest on Boolean logic plus an axiom of factorization in polynomial time; from an aesthetic point of view this is not just horrible, it is despicable. Going from a universe that needs only a single logic gate (XOR or NAND) to a universe that requires this protuberance is an abomination. I am willing to grant you a logic gate that can generate a pure random bit, but I cannot go further.
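[An aside on the single-gate remark: of the two gates named, only NAND is functionally complete on its own; XOR alone is not, since it cannot express AND. A minimal sketch of deriving the other Boolean gates from NAND:]

```python
def nand(a, b):
    """The universal gate: NOT (a AND b), for bits a, b in {0, 1}."""
    return 1 - (a & b)

def not_(a):
    # NOT a = a NAND a
    return nand(a, a)

def and_(a, b):
    # a AND b = NOT (a NAND b)
    return not_(nand(a, b))

def or_(a, b):
    # a OR b = (NOT a) NAND (NOT b)  (De Morgan)
    return nand(not_(a), not_(b))

def xor_(a, b):
    # a XOR b = (a NAND t) NAND (b NAND t), where t = a NAND b
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))
```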

This is indeed a deeply primitive philosophical question; here is an appropriate response, a story in Nietzsche's style for Passover.

Zarathustra stood on the summit of the mountain and addressed the people:

For thousands of years the gods roamed the world in search of the truth; some sought it in fire and earth, others in ideas and reason.

One day, a god of fire and earth took fire in one hand and earth in the other, clapped his hands, and said to the other gods, amazed: Here is being; it is fire and it is earth, it is fire or it is earth.

Another day, a god of ideas and reason captured reason and locked it in a box and said to the other gods, amazed: Here is being; it is, or it is not, that is all.

The gods of ideas and reason found thought and consciousness and put them into the box; they also created complicated ideas that could not fit inside, ideas of unreasonable size.

The gods of fire and earth conceived a variety of objects, each more fabulous than the last, but they also created horrible monsters, like the fire of hell, and frightening ideas, like the army of cats from the graves, the zombie cats.

One day the gods of fire and earth said to the gods of ideas and reason: you are wrong, there is something that does not come from the being that is or is not. The gods of ideas and reason looked at the sky and shuddered at the thought that reason might escape from the box.

Then a gentle breeze sprang up, and the gods looked at one another and said: what if we all had reason, if being is or is not, if being is fire and earth, fire or earth? Then the ground shook, and the army of cats from the graves disappeared into the darkness of oblivion.

All the gods looked up to heaven and said, "Finally, we have caught you, truth." Then they formed a circle, holding hands, and uttered this prayer: "O Lord, we accept our incompleteness and thank you for existence."

Thus spoke Zarathustra