Naturalness, according to physicists.

Before the LHC turned on, theoretical physicists had high hopes the collisions would reveal new physics besides the Higgs. The chances of that happening get smaller by the day. The possibility still exists, but the absence of new physics so far has already taught us an important lesson: Nature isn’t natural. At least not according to theoretical physicists.

The reason that many in the community expected new physics at the LHC was the criterion of naturalness. Naturalness, in general, is the requirement that a theory should not contain dimensionless numbers that are either very large or very small. If it does, theorists will complain that the numbers are “finetuned” and regard the theory as contrived and hand-made, not to say ugly.

Technical naturalness (originally proposed by ‘t Hooft) is a formalized version of naturalness which is applied in the context of effective field theories in particular. Since you can convert any number much larger than one into a number much smaller than one by taking its inverse, it’s sufficient to consider small numbers in the following. A theory is technically natural if all suspiciously small numbers are protected by a symmetry. The standard model is technically natural, except for the mass of the Higgs.

The Higgs is the only (fundamental) scalar we know and, unlike all the other particles, its mass receives quantum corrections of the order of the cutoff of the theory. The cutoff is assumed to be close to the Planck energy – that means the estimated mass is some 17 orders of magnitude larger than the observed mass. This too-large mass of the Higgs could be remedied simply by subtracting a similarly large term. This term, however, would have to be delicately chosen so that it almost, but not exactly, cancels the huge Planck-scale contribution. It would hence require finetuning.
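To put rough numbers to this, here is a minimal back-of-the-envelope sketch in Python. The values are standard textbook ones, not taken from the post, and the choice of Planck mass convention only changes O(10) factors:

```python
# Back-of-the-envelope size of the hierarchy problem: the quantum
# correction to the Higgs mass-squared is of order the cutoff squared.
# With the cutoff at the Planck scale, the bare term must cancel the
# correction to one part in ~10^34 to leave the observed 125 GeV.
m_planck = 1.22e19   # Planck energy in GeV (textbook value; the reduced
                     # Planck mass would change this by an O(10) factor)
m_higgs = 125.0      # observed Higgs mass in GeV

mass_ratio = m_planck / m_higgs      # how far apart the two scales are
tuning = (m_higgs / m_planck) ** 2   # required relative cancellation in m^2

print(f"scale ratio: {mass_ratio:.1e}")             # ~1e17
print(f"cancellation needed in m^2: {tuning:.1e}")  # ~1e-34
```

The second number is the one that drives the naturalness complaint: the bare mass-squared and the correction must agree to roughly 34 decimal places.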

In the framework of effective field theories, a theory that is not natural is one that requires a lot of finetuning at high energies to get the theory at low energies to work out correctly. The degree of finetuning can be, and has been, quantified in various measures of naturalness. Finetuning is thought of as unacceptable because the theory at high energy is presumed to be the more fundamental one. The physics we find at low energies, so the argument goes, should not be highly sensitive to the choices we make for that more fundamental theory.

Until a few years ago, most high energy particle theorists therefore would have told you that the apparent need to finetune the Higgs mass means that new physics must appear near the energy scale where the Higgs is produced. The new physics, for example supersymmetry, would avoid the finetuning.

There’s a standard tale they tell about the use of naturalness arguments, which goes something like this:

1) The electron mass isn’t natural in classical electrodynamics, and if one wants to avoid finetuning, this means new physics has to appear at around 70 MeV. Indeed, new physics appears even earlier, in the form of the positron, rendering the electron mass technically natural.

2) The difference between the masses of the neutral and charged pion is not natural because it’s suspiciously small. To prevent finetuning, one estimates new physics must appear around 700 MeV, and indeed it shows up in the form of the rho meson.

3) The lack of flavor changing neutral currents in the standard model means that a parameter which could a priori have been anything must be very small. To avoid finetuning, the existence of the charm quark is required. And indeed, the charm quark shows up in the estimated energy range.
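The 70 MeV figure in the first example can be reproduced in a couple of lines. The standard estimate is that a charge confined to radius r carries field energy of order α·ħc/r; demanding that this self-energy not vastly exceed the observed electron mass bounds the scale Λ = ħc/r. A sketch:

```python
# Classical electron self-energy estimate: E_self ~ alpha * Lambda, where
# Lambda = hbar*c / r is the energy scale set by the electron's size.
# Requiring E_self <~ m_e c^2 gives the scale where new physics must enter.
alpha = 1 / 137.036   # fine-structure constant
m_e = 0.511           # electron rest energy in MeV

lambda_max = m_e / alpha
print(f"new physics required below ~{lambda_max:.0f} MeV")  # ~70 MeV

# The new physics (the positron, i.e. pair creation) in fact enters already
# at ~2 m_e ~ 1 MeV -- "even earlier", as the post says.
print(f"pair-creation threshold: ~{2 * m_e:.1f} MeV")
```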

Of these three examples, only the last one was an actual prediction (Glashow, Iliopoulos, and Maiani, 1970). To my knowledge, this is the only prediction that technical naturalness has ever given rise to – the other two examples are post-dictions.

Not exactly a great score card.

But well, given that the standard model – in hindsight – obeys this principle, it seems reasonable enough to extrapolate it to the Higgs mass. Or does it? Seeing that the cosmological constant, the only other known example where the Planck mass comes in, isn’t natural either, I am not very convinced.

A much larger problem with naturalness is that it’s a circular argument and thus a merely aesthetic criterion. Or, if you prefer, a philosophic criterion. You cannot make a statement about the likelihood of an occurrence without a probability distribution. And that distribution already necessitates a choice.

In the currently used naturalness arguments, the probability distribution is assumed to be uniform (or at least approximately uniform) over a range that can be normalized to one by dividing by suitable powers of the cutoff. Any other type of distribution, say, one that is sharply peaked around small values, would require the introduction of such a small value into the distribution already. But such a small value justifies itself by the probability distribution just like a number close to one justifies itself by its probability distribution.

Naturalness, hence, becomes a chicken-and-egg problem: Put in the number one, get out the number one. Put in 0.00004, get out 0.00004. The only way to break that circle is to just postulate that some number is somehow better than all other numbers.
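The circularity can be made concrete with a toy calculation (the numbers here are invented purely for illustration): ask how probable an observed small parameter value is under two different priors.

```python
import math

cutoff, observed = 1.0, 1e-8   # toy parameter value, in units of the cutoff

# Prior 1: uniform on [-cutoff, cutoff] -- the tacit choice behind the
# usual naturalness arguments. The small observed value looks "finetuned".
p_uniform = observed / cutoff
print(f"uniform prior: P(|x| <= observed) = {p_uniform:.0e}")  # 1e-08

# Prior 2: a Gaussian whose width is the observed scale itself. Now the
# same value is perfectly typical -- but the small width was put into the
# prior by hand: put in 1e-8, get out 1e-8.
p_peaked = math.erf(1 / math.sqrt(2))   # P(|x| <= sigma) for a Gaussian
print(f"peaked prior:  P(|x| <= observed) = {p_peaked:.2f}")   # 0.68
```

Whether the value counts as "unlikely" is decided entirely by the prior, and the prior is exactly what the argument never supplies.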

The number one is indeed a special number in that it’s the unit element of multiplication. One can try to exploit this to come up with a mechanism that prefers a uniform distribution with an approximate width of one by introducing a probability distribution on the space of probability distributions, leading to a recursion relation. But that just leaves one to explain why that particular mechanism.

Another way to see that this can’t solve the problem is that any such mechanism will depend on the choice of basis in the space of functions. For example, you could try to single out a probability distribution by demanding that it be the same as its own Fourier transform. But the Fourier transform is just one of infinitely many basis transformations in the space of functions. So again, why exactly this one?
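As it happens, the Fourier criterion does have a well-known solution: with the unitary convention for the transform, the standard Gaussian is its own Fourier transform. A quick numerical check (the grid parameters are arbitrary choices for illustration):

```python
import math

# With the unitary convention F(k) = (2*pi)^(-1/2) * Int f(x) exp(-ikx) dx,
# the Gaussian exp(-x^2/2) maps to itself. Verify with a trapezoidal sum.
N, X = 40000, 20.0   # number of grid steps and integration half-range
dx = 2 * X / N

def fourier_gaussian(k):
    # exp(-x^2/2) is even, so only the cosine part of exp(-ikx) survives
    total = 0.0
    for i in range(N + 1):
        x = -X + i * dx
        weight = 0.5 if i in (0, N) else 1.0   # trapezoid endpoint weights
        total += weight * math.exp(-x * x / 2) * math.cos(k * x)
    return total * dx / math.sqrt(2 * math.pi)

# Compare F(k) against exp(-k^2/2) at a few sample points
deviation = max(abs(fourier_gaussian(k) - math.exp(-k * k / 2))
                for k in (0.0, 0.5, 1.0, 2.0))
print(f"max deviation from fixed point: {deviation:.1e}")  # essentially zero
```

But this only illustrates the point in the post: the fixed point singles out a width of order one only because we first singled out the Fourier transform among all possible basis transformations.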

Or you could try to introduce a probability distribution on the space of transformations among bases of probability distributions, and so on. Indeed I’ve played around with this for some while. But in the end you are always left with an ambiguity, either you have to choose the distribution, or the basis, or the transformation. It’s just pushing around the bump under the carpet.

The basic reason there’s no solution to this conundrum is that you’d need another theory for the probability distribution, and that theory per assumption isn’t part of the theory for which you want the distribution. (It’s similar to the issue with the meta-law for time-varying fundamental constants, in case you’re familiar with this argument.)

In any case, whether you buy my conclusion or not, it should give you pause that high energy theorists don’t ever address the question of where the probability distribution comes from. Suppose there indeed was a UV-complete theory of everything that predicted all the parameters in the standard model. Why then would you expect the parameters to be stochastically distributed to begin with?

This missing probability distribution, however, isn’t my main issue with naturalness. Let’s just postulate that the distribution is uniform and admit it’s an aesthetic criterion, alrighty then. My main issue with naturalness is that it’s a fundamentally nonsensical criterion.

Any theory that we can conceive of which describes nature correctly must necessarily contain hand-picked assumptions which we have chosen “just” to fit observations. If that wasn’t so, all we’d have left to pick assumptions would be mathematical consistency, and we’d end up in Tegmark’s mathematical universe. In the mathematical universe then, we’d no longer have to choose a consistent theory, ok. But we’d instead have to figure out where we are, and that’s the same question in green.

All our theories contain lots of assumptions like Hilbert spaces and Lie algebras and Hausdorff measures and so on. For none of these is there any explanation other than “it works.” In the space of all possible mathematics, the selection of this particular math is infinitely finetuned already – and it has to be, for otherwise we’d be lost again in Tegmark space.

The mere idea that we can justify the choice of assumptions for our theories in any other way than requiring them to reproduce observations is logical mush. The existing naturalness arguments single out a particular type of assumption – parameters that take on numerical values – but what’s worse about this hand-selected assumption than any other hand-selected assumption?

This is not to say that naturalness is always a useless criterion. It can be applied in cases where one knows the probability distribution, for example for the typical distances between stars or the typical quantum fluctuations in the early universe. I also suspect that it is possible to find an argument for the naturalness of the standard model that does not necessitate postulating a probability distribution, but I am not aware of one.

It’s somewhat of a mystery to me why naturalness has become so popular in theoretical high energy physics. I’m happy to see it go out of the window now. Keep your eyes open in the next couple of years and you’ll witness that turning point in the history of science when theoretical physicists stopped dictating to nature what’s supposedly natural.

## 74 comments:

Interesting - we're looking forward to your book.

Ok, we don't know how big a role naturalness plays in physics. As you say - we know nothing about the probability distribution. But I suspect you would agree that a theory with fewer arbitrary constants is preferable? And if an arbitrary constant takes the value one then it effectively disappears (because multiplying by one is the same as not doing anything). So saying we expect our final theory to have fewer arbitrary constants is surely the same thing as saying we expect our constants to be natural. Basically, a simpler and less arbitrary theory is more natural. Now, maybe nature isn't like that. Maybe nature is less simple and more arbitrary, in practice. We don't know. But science has generally progressed by finding simpler theories, and eliminating epicycles and strange arbitrary features. Maybe at the lowest level there is a certain amount of arbitrariness that we can never eliminate. There probably is.

However, for some of our unnatural constants (for example, the cosmological constant) I suspect it's just the case that our theories are just not good enough yet.

"Naturalness, in general, is the requirement that a theory should not contain dimensionless numbers that are either very large or very small. If that is so, then theorists will complain the numbers are “finetuned” and regard the theory as contrived and hand-made, not to say ugly."

In cosmology, one often hears this as an argument against dark energy being the cosmological constant, because in "natural" units its value would have to be very small. On the other hand, there are many papers about "cosmic coincidences" (in fact, I read one this morning) which claim that it is unnatural if two quantities are roughly equal (say, the matter density and dark-energy density). One can't have it both ways.

If in some sense constants of nature are random numbers, then I would expect very large (or very small) numbers to be the norm and roughly equal ones to be those needing explanation.

Very interesting post! However, the other (much more interesting) possibility is that Effective Field Theory (invoked for Higgs hierarchy problem) and General Relativity (invoked in cosmological constant problem), are not fundamental.

Hello Sabine,

I think I understand your point.

But isn't any solution always a combination of finetuning and naturalness simultaneously? I mean, what predictions could you make without any naturalness? And on the other hand we cannot know everything on THE fundamental level, hence some degree of pre-finetuning. An example would be Newton's law g = GM/r². Beautiful naturalness (generalized by Einstein), but a finetuning factor G because we don't know what determines it. So doesn't it remain important to keep naturalness as a valuable goal?

Best, Koen

Niayesh,

Well, that effective field theory holds (in the sense of the scales decoupling) is kind of the same as saying it's natural.

Andrew,

Simplicity is distinct from naturalness. Simplicity is about the number of assumptions. Naturalness is about how justified the assumptions themselves are. Of course I would agree that, of two theories that achieve the same, the one with fewer parameters is preferable. But I don't know why a more fundamental theory must necessarily have fewer assumptions than a less fundamental one. Sure, that would be nice. But what does it matter what I think is nice? That's another aesthetic criterion which is widely used and which might similarly just be wrong. Best,

B.

Phillip,

What is or isn't "the norm" for a number depends on the probability distribution...

In any case, what you say about the coincidences isn't really a disagreement. The reason is that a small number like the cc can be achieved by an almost-cancellation between two large numbers, which is exactly the kind of coincidence that supposedly is not desirable.

What you say is right to the extent that if you take a random distribution over the real numbers, the probability that you get an infinitely large number is one. Hence, we should conclude all bare parameters in our theories are infinitely large. Best,

B.

Koen,

Naturalness is kind of the opposite of finetuning. The two belong together.

I don't know what you mean by "What prediction can you make without any naturalness?" You could say "What prediction can you make without any finetuning?" That's indeed a good summary of my argument with Tegmark's mathematical universe. Best,

B.

Need to take the Unger/Smolin thesis to heart. Actual history trumps the math!

"In any case, what you say about the coincidences isn't really a disagreement. The reason is that a small number like the cc can be achieved by an almost-cancellation between two large numbers, which is exactly the kind of coincidence that supposedly is not desirable."

True, but this assumes that the smallness is due to some sort of cancellation, and there is no proof of this. I think that Weinberg was the first to suggest this: the particle-physics vacuum energy is huge, but in the multiverse there will be a range of values of the "bare" cosmological constant, including negative ones, so in some the cancellation will be what we observed, and in most others life as we know it would be impossible, so there is a weak-anthropic explanation.

If, instead of Weinberg, say, Max Tegmark had suggested this, I think that it would be viewed much more sceptically. Note that I am not saying that it should be. If the particle-physics vacuum energy really is as large as the claims make it out to be (which is far from clear), this is probably the best explanation.

"and that’s the same question in green"

This might confuse some readers who don't understand German, or don't think of German when they read English.

"Das gleiche in grün", "the same thing in green", is something which only superficially appears to be different. Sort of the opposite of "same same but different". :-) Maybe "same difference" would be reasonably close in terms of feeling, but doesn't grammatically fit into the discussion of a question.

Phillip,

I'm aware the saying doesn't exist in English, but I thought it would go through as a joke.

I am intrigued by Phillip's statement "True, but this assumes that the smallness is due to some sort of cancellation, but there is no proof of this." In QFT I have seen examples where things like charge and mass receive additive corrections, but the fields receive a multiplicative correction. Is it possible that the (loop?) correction to the vacuum energy is something other than additive, i.e. instead the correction ends up being a multiplication by something like 10^-120? Lining up three digits (+/- 120) seems much more natural than lining up 120 decimal places for an additive correction.

JR

The practical inference from this interesting discussion of naturalness seems to be that new semi-empirical phenomenology in particle physics is basically justified, as it was historically in e.g. the development of quantum mechanics.

Sabine,

There is a flaw in your argument. You speak of "Hilbert-spaces and Lie-algebras and Haussdorf measures" and state that "the selection of this particular math is infinitely fine-tuned already". But you have no way to know this. It might not be possible to construct a viable physics using other mathematical structures. No-one has constructed a complete, alternative viable physics (it's hard enough to try to describe ours!). If there are no, or only a few, alternative viable structures, then clearly we could not say that ours was "infinitely fine-tuned".

Of course, it may turn out that there are very many other viable structures. However, we don't know that, and hence your statement about our math being infinitely tuned does not follow.

(I made essentially the same point in http://backreaction.blogspot.ca/2013/12/the-finetuned-cube.html but had no reply.)

Jim.

I'm going to steal this: "...lost again in Tegmark space" :)

Has someone tried to make a model relating cc to the primordial inflaton field? Inflaton field's job is done when the exponential expansion is completed. Then the remnant field may appear as a small cc which we now see and think that to be a constant. After all cc also results in accelerated expansion admittedly not exponential. So it is not too much of an extrapolation to assume that the two, cc and inflaton may be related.

One problem is that mathematics is not "natural". There are all sorts of "just so" numbers like pi, gamma and e. There are also numbers like 196,884 and 21,296,876 that show up again and again in various forms. For example, there's a lot of moonshine linking modular forms and monster groups based on the fact that 196,884 = 196,883 + 1. These numbers are built into the structure of mathematical reasoning, but in some ways they are empirically determined, just like physical constants.

I agree there is something troubling about physicists expecting physical constants to be natural. Surely, a lot of physics flows from mathematical reasoning which is thickly larded with unnatural numbers. Then, there is the whole issue of contingency and empirical measurement. There may be many possible physics, but there is also the physics we've got.

Phillip,

"True, but this assumes that the smallness is due to some sort of cancellation, but there is no proof of this."

Yes, but if you buy into the effective field theory framework, it's either a finetuned cancellation or you need some mechanism to protect the small number - which would make it natural.

I don't disagree with you - see my remark about a theory of everything. If there was a way to calculate that exact cancellation, why should one think of these numbers as having a probability distribution at all?

The reason people run into this problem is the same that gives rise to the string-theory landscape: they're trying to explain too much with too little. If anyone could find a vacuum that gives rise to the SM and that would work, then the rest of the landscape would go out the window tomorrow.

But yes, if you buy into the landscape idea then the only selection principle you're left with is anthropic. Best,

B.

Lipmanov,

Phenomenology is certainly justified. But please note that the VAST majority of phenomenological models in high energy physics today use naturalness as a guiding principle. Best,

B.

Jim,

Sorry that I missed your previous comment. You seem to mistake "finetuned" in the way I have used it here to mean "finetuned for life". This is not what I mean. With finetuned I mean here that it's not natural, according to the definition in my post. Best,

B.

kashyap,

Yes, there are many such attempts to reuse the inflaton field to give rise to a cc later. One of the earliest, I think, is quintessential inflation. Best,

B.

Regarding the cc, it's worth noting that the causal sets people have a good perspective on this, going back some years; here's one example by Sorkin and coauthors:

https://arxiv.org/pdf/astro-ph/0209274v1.pdf

More generally, replacing continuous models with discrete models often brings very large finite numbers into the picture, which is a priori interesting if one is looking for reasons why very large or very small numbers appear in nature.

> "The electron mass isn’t natural in classical electrodynamics, and if one wants to avoid finetuning this means new physics has to appear at around 70 MeV."

Can u give me a reference to a detailed presentation of this argument? Never heard about it...

Maurice,

See eg section 6 of this paper or the introduction of this paper.

Thanks Bee for the quintessential reference. I have a feeling (completely non scientific!) that it must have some serious problem otherwise it would have been accepted or at least widely mentioned!

Thanks a lot. I do not see why this is an example for a problem with naturalness. How can the original problem, the fact that in classical electrodynamics the mass of the electron is calculated to be at least 70 MeV (assuming the current experimental upper limit on the electron radius), be resolved by fine-tuning without introducing new physics? Fine tuning of what? And with what characteristic energy scale is the new physics that does resolve the original problem, quantum electrodynamics, associated? This is really a question to the original authors of the argument, but as you repeated the argument, I allow myself to ask u.

Maurice,

In classical electrodynamics the mass of the electron is infinite because of the self-energy contribution. If you assume that the electron has a finite size, then the contribution to the self-energy becomes finite but dependent on the size. Assuming that this self-energy contribution should not vastly exceed the observed mass leads to the estimate that no later than at 70 MeV something new must happen. You could fix that problem by just choosing the bare mass to almost-but-not-exactly cancel the correction term, but that would require finetuning. Hence it's similar, in spirit, to the naturalness arguments in qft.

It is, as you point out, technically different because it's a problem with the classical theory. (In which it actually doesn't make much sense to relate a distance to an energy, because that requires an hbar, but then I just wanted to mention that that's the tale everybody tells. If you do some googling, you will see that these examples have been repeated in more than a dozen other publications on the topic, not to mention seminar slides.) It's also historically not how things developed, because it wasn't well understood in the early days of QED how to deal with the infinities that quantum field theories themselves bring in. (See the references in the above-mentioned papers.) Best,

B.

kashyap,

I think it's not so much that it has a particular problem, but that it lacks a particular motivation. So you have the inflaton and it decays almost but not totally and leaves behind a field that just about makes the cc. I don't actually doubt you can do that. You can probably do it in many different ways with many different fields. But what's the point? Do you really do anything besides making inflation and LambdaCDM more complicated? Best,

B.

Kaleburg: "One problem is that mathematics is not "natural". There are all sorts of "just so" numbers like pi, gamma and e."

Those examples you give *are* natural because they are of the same order of magnitude as one.

In fact, the vast majority of mathematical constants used in physics are natural:

https://en.wikipedia.org/wiki/Mathematical_constants_and_functions

OK, so it is only Giudice's "new physics sets in before energy..." and your "new physics appears even earlier..." that are mistaken. Otherwise it is a good analogy. The mistake is not that classical electrodynamics contains no \hbar (the energy is related to distance (radius) even in the classical electron), but that the transition to quantum electrodynamics affects all scales.

"I am intrigued by Phillip's statement 'True, but this assumes that the smallness is due to some sort of cancellation, but there is no proof of this.' In QFT I have seen examples where things like charge and mass receive additive correction, but the fields receive a multiplicative correction."

Barrow and Shaw have a paper where they point out that the square of the age of the universe in Planck units is the famous 10**120. I am not making this up. They predict the value of lambda+Omega to be slightly more than 1, so it's a testable prediction.

Not just that the value is 10**120, which is obvious, but that it explains the cosmological-constant problem.

I don't pretend to understand the details of their arguments.

It didn't appear on 1 April.

Just look for papers by John D. Barrow (the famous popular-science writer and, of course, cosmologist) and Douglas Shaw on arXiv.

Dear Sabine,

Looking forward to reading your book !

Can I hope to read it around Christmas, or even Easter 2017 ?

Thanks for writing all this marvelous stuff !

Denis

Thanks again for the reply. But don't you think that this will solve the problem of naturalness as far as the cc is concerned? Some fields are strong, some fields like gravity are extremely weak in our universe at this time and age of the universe. So what? What am I missing?

Which is more of an apparent *mathematical* trap - Borges' Library of Babel, or Tegmark space?

In my opinion, naturalness arguments cannot be easily dismissed. One still needs to convincingly explain what protects the weak scale against quantum corrections and how come the cluster decomposition principle worked so well up to this point. Among other issues, one also needs to explain the strong CP problem without resorting to the Peccei-Quinn mechanism.

I think that the naturalness debate will eventually unveil serious foundational problems in our current interpretation of low-energy QFT and the Standard Model:

http://philsci-archive.pitt.edu/11529/1/Naturalness,_the_Autonomy_of_Scales,_and_the_125_GeV_Higgs.pdf

Sabine: "You seem to mistake "finetuned" in the way I have used it here to mean "finetuned for life"."

Well, it's very hard to discuss these matters without verging into anthropic territory!

At the more concrete level, asking about the degree of fine tuning for a single physical parameter, eg Lambda, is relatively straightforward, although still subject to considerable ambiguity in the choice of measure (eg, do we choose our distribution to be roughly flat in linear or log Lambda, and why?). But surely the "measure problem" for mathematical structures will be far worse still. So it's not clear to me that statements about extreme fine tuning (by your definition) of our current structures (Hilbert spaces, Lie algebras, Hausdorff measures, etc) have much basis.

Of course the real problem is that we cannot speak about distributions without some kind of ensemble, which leads to anthropic/landscape arguments. I think your point, Sabine, is that, *in the absence* of such ensembles, probability distributions and hence the notion of extreme fine tuning and naturalness don't exist. We just have what we have. I would argue that anthropic arguments may be our only hope to avoid that scenario.

"You cannot make a statement about the likelihood of an occurrence without a probability distribution. And that distribution already necessitates a choice."

Well then, why not make a statement about the likelihood of your initial choice, and a statement about the likelihood of the likelihood of your initial choice, and a statement about the likelihood of the likelihood of the likelihood of your initial choice, and . . .

And Max Tegmark would eliminate infinity from physics - spoilsport!

http://philsci-archive.pitt.edu/9707/1/Infinite-order_probabilities.pdf

Just as "correlation does not imply causation" is overused relative to the often useful principle that a correlation generally has some cause, the more useful idea to draw from the "naturalness" concept might be not that we should expect Nature to be natural, but that if Nature seems unnatural, this implies there is some principle of Nature related to the "unnatural" portion of the theory which we do not understand properly, or from the right perspective.

Rather than making predictions with it, unnaturalness should be a signal that we need to look elsewhere for answers.

Riemann Zeta(-1) = -1/12 ~ 1 + 2 + 3 + 4 + ... is "quadratically" divergent.

Zeta(-3) = 1/120 ~ 1^3 + 2^3 + 3^3 + 4^3 + ... is "quartically" divergent.

Is it possible that in the wrong frame of mind, we might imagine that we have a fine-tuning problem here?

Denis,

Make that Christmas 2017 or Easter 2018. Even if I finish writing the book this year (it might happen - going well so far), it will take some time to trickle through the system. Best,

B.

kashyap,

I don't understand your comment. I don't even know what problem with the cc you are referring to... Best,

B.

Ervin,

I agree on cluster decomposition. But please then, humor me, why don't you think naturalness arguments are aesthetic? Saying "in my opinion" isn't particularly convincing. Best,

B.

Jim,

That's right, the measure problem exists also in math space, but that was a different part of my argument. I said, ok, let's just pick some measure then, and pointed out that even if you do that, naturalness is still nonsense. The finetuning here has really nothing to do with the existence of life. That is a total red herring. Physicists generally have very little to do with "life". This is finetuning to explain observations. My argument about finetuning in math space is to say that you can't do science otherwise. Best,

B.

Wes,

Yes, you can do that - I wrote that in my post. It leads to a recursion relation if you want this to converge. But it doesn't solve the problem. Best,

B.

Bee: "But please then, humor me, why don't you think naturalness arguments are aesthetic? Saying "in my opinion" isn't particularly convincing."

Well, I don't think it's just an aesthetic because I think it has been shown to be a useful guide. As an example, take the formula for the period of a pendulum. Just from dimensional analysis, you can work out that the period must be proportional to the square root of the length of the rope divided by g:

https://leepavelich.wordpress.com/2011/09/23/fun-with-dimensional-analysis-1-simple-pendulum/

And that's virtually the final formula, because all that's left is to multiply by the proportionality factor of 2*pi - which is of the order of magnitude of one, and is therefore natural. You can assume that any additional numerical factor is going to be natural. So naturalness basically gives you the final formula.

I don't think that's just an aesthetic - that's a useful tool.
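The commenter's pendulum example is easy to check numerically: dimensional analysis fixes the period up to a dimensionless factor, and the exact small-angle result supplies 2π, a factor of order one (a quick sketch, with an arbitrarily chosen 1 m pendulum):

```python
import math

# Small-angle pendulum: dimensional analysis alone gives T ~ sqrt(L/g);
# the exact formula is T = 2*pi*sqrt(L/g), so the missing factor is O(1).
L, g = 1.0, 9.81   # a 1 m pendulum under Earth gravity

guess = math.sqrt(L / g)     # dimensional-analysis estimate, no prefactor
exact = 2 * math.pi * guess  # exact small-angle period

print(f"guess: {guess:.3f} s, exact: {exact:.3f} s")
print(f"missing factor: {exact / guess:.2f}")   # 6.28 = 2*pi, order one
```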

Sabine,

"In my opinion" simply reflects the fact that naturalness is an unsettled issue and there are divergent viewpoints on how to deal with it.

Andrew,

Well that's because there are no other constants in the equations. I don't know what your point is: If there are only numbers of order one in the equations then dimensional analysis works well up to numbers of order one? That's got nothing to do with technical naturalness, which is about the question what numbers are (or aren't) in the equations to begin with. Best,

B.

I guess what I'm saying is that there *is* something special about numbers of order one - because that's the case for so many fundamental mathematical constants. And as we get to more fundamental levels, these are likely to be the numbers we encounter.

Thing is, if you find a number like 20 million in any fundamental equation, then I'd be asking where the heck did that come from, and suspect my theory was missing something fundamental. But if I get the number 5 appearing in my equation, then that makes me think my logic is basically correct. Because I'm just missing a 2*pi term, or similar.

For particle physics, well, maybe naturalness is not applicable. But I do think naturalness is more than just saying "that looks nice - I like that". I think there is a logic behind it.

Andrew,

The problem with your argument is this:

"as we get to more fundamental levels, these are likely to be the numbers we encounter"

There is absolutely no reason why that should be the case.

Besides this, I'm not even sure the premise is right. Take the example that Kaleberg mentioned above, the dimensions of the smallest representations of the monster group. How about 21296876 or 842609326? Are these small numbers? Well, they are only one mathematical argument away from the numbers 3 and 4.

I actually suspect that the reason we mostly encounter mathematical constants of order one is that these are the first ones we'd encounter as we increase our knowledge of math. Ask back again in 2000 years and the situation might look entirely different. We're only just beginning to understand complex and chaotic systems. There's a lot of math left to explore. And either way, there's no proof for the case you are trying to make.

Best,

B.

Thanks for your responses, Bee. Yes, I suspect naturalness might not be applicable for particle physics.

I'm afraid an alternative might be that we just never discover where these numbers come from.

Sorry to keep on badgering the same point over and over again without knowing the details of quintessence theory! I personally find it appealing that the inflaton and the cc may be related. What I was asking was: does quintessence help with the naturalness problem of the cc, which is one of the points under discussion? Naively, there are strong fields like the strong interaction and there are extremely weak fields like gravity, simultaneously existing in our universe at this time. So if quintessence is right and today's quintessence field is very small, what is the problem? Wouldn't that remove at least one of the difficulties with naturalness? What am I missing?

I'm glad you pointed out that there are very many other criteria for naturalness besides the existence of small numbers. While it is common to say, as you have, that the SM is natural except for the Higgs mass, I would say that the SM is natural except for the Higgs not having any spin. Every other fundamental particle, nearly 20 of them, has non-zero spin, so what the heck is this Higgs guy doing in there being a spinless fundamental particle? It certainly doesn't seem "natural" to me.

Andrew, At the risk of sounding like Peter Woit I would say that we have zero chance to discover where these numbers come from, unless we get off the anthropics/multiverse bandwagon.

JR

Andrew wrote:

"If you find a number like 20 million in any fundamental equation, then I'd be asking where the heck did that come from, and suspect my theory was missing something fundamental. But if I get the number 5 appearing in my equation, then that makes me think my logic is basically correct. Because I'm just missing a 2*pi term, or similar."

That's interesting. Many formulas of physics contain factors proportional to the surface area of a unit sphere, or (equivalently) the solid angle surrounding a point, so we find "natural" constants like 4*pi (~12), but this gives "natural" constants only because of the low number of space dimensions. In a space of (say) 100 dimensions, the surface area of a unit sphere is about 1/10^38. If we ever found that, on some scale, space actually has a hundred dimensions or so, the laws of physics on that scale would involve extremely "un-natural" constants like 10^38.
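Amos's numbers are straightforward to verify: the surface area of the unit sphere in n-dimensional space is S_n = 2*pi^(n/2)/Gamma(n/2), which gives the familiar 4*pi for n = 3 but collapses to roughly 10^-38 for n = 100. A quick sketch (the function name is mine):

```python
import math

def unit_sphere_area(n):
    """Surface area of the unit sphere embedded in n-dimensional space,
    S_n = 2 * pi^(n/2) / Gamma(n/2)."""
    return 2.0 * math.pi ** (n / 2) / math.gamma(n / 2)

print(unit_sphere_area(3))    # 4*pi ≈ 12.566, the familiar "natural" factor
print(unit_sphere_area(100))  # ≈ 2.4e-38, wildly far from order one
```

The same formula shows the factor shrinks steadily past n ≈ 7, so even modest numbers of extra dimensions already produce geometrical constants far from unity.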

@Andrew Thomas The pendulum equation lacks its bob. Galileo said "universal gravitation." Newton opens Principia by demanding mass and weight be in constant proportion. Special becomes general relativity given the Equivalence Principle (EP): that all local bodies in vacuum free fall pursue identical minimum-action parallel-displaced trajectories. Classical gravitational theories offer testable examples of EP violation. Parity violations, symmetry breakings, chiral anomalies, Chern-Simons repair of the Einstein-Hilbert action; failure to quantize gravitation, detect dark matter, and validate SUSY: each and all are consistent with the answer that physics is forever disobliging to test.

Nil sapientiæ odiosus acumine nimio - Seneca, re Poe's The Purloined Letter

Amos, that's a really good point. John Barrow also makes that point in his book The Constants of Nature. But I think it ignores one crucial factor ...

... the number of dimensions of space (3) is itself a natural number, a small dimensionless constant. So we find natural constants because space is natural. If space was unnatural, yes, we would find unnatural constants - but that is not the case - naturalness wins.

Perhaps we are being a bit anthropocentric in our definition of natural. A creature who could envision a collection of 100,000 objects as easily as we can envision 5 might say that a physical model was natural as long as the constants were all <1,000,000.

On a more speculative note, I've often felt that the reason we see so much order in physical laws and connections with mathematics is because we are stuck in a very low-intelligence state - we have only a few hundred concept pigeon-holes, so of course there are collisions. Very intelligent creatures may see so many unrelated concepts that connections between them are seen as fleeting coincidences.

All useless as far as how we should proceed. We're us, and not speculative super-intelligences. But as flatworms at the Opera, we may never understand the plot.

Andrew wrote:

"The number of dimensions of space (3) is itself a natural number, a small dimensionless constant. So we find natural constants because space is natural. If space was unnatural, yes, we would find unnatural constants..."

I'm not sure if the number 100 is considered "unnatural". (Is the fine structure constant, ~137, considered unnatural?) Also, we don't need 100 dimensions to get extremely tiny geometrical factors. You mentioned before that a factor of 20 million would strike you as unnatural, which corresponds to about 41 dimensions - not too unnatural.

I suppose one could refine the definition of naturalness to say that a dimensionless constant is not considered unnatural if it can be "explained" by, or expressed in terms of, another dimensionless constant that is natural. But that definition makes the concept of naturalness sort of meaningless. It's like saying 10^38 is natural, because the real underlying constant is log(10^38). Naturalness wins! On each level of description, we tend to logarithmically telescope our terms so that we deal with numbers on the order of 1 for convenience.

Amos,

assume all free parameters of the SM related to energy quantities (that is masses and couplings) can be expressed logically from 1, 2, 3, two simplistic geometrical equations, and one real constant. Would that be natural enough?

J.

I think the value of the fine structure constant is generally regarded as a complete mystery. Not "natural".

Considering the difference in strength between gravity and other fundamental forces, it wouldn't surprise me to see "unnatural" numbers pop up.

akidbelle wrote:

"assume all free parameters of the SM related to energy quantities (that is masses and couplings) can be expressed logically from 1, 2, 3, two simplistic geometrical equations, and one real constant. Would that be natural enough?"

Interesting question. I'd be comfortable with the integers 1, 2, and 3, but the "one real constant" would worry me, because a single real-valued number can contain infinite information (e.g., your entire DNA sequence is in the digits of pi). I think we could encode essentially anything into the digits of a single real number. So I'm not sure if I'd call that "natural". Regarding the two "simplistic geometrical equations", I'd probably have to see them to decide, but in general I think there is a lot of information implicit in any geometrical concept, even things that seem intuitively simple to us.

Naturalness in physics is sometimes double-edged. The clearest example is the transition to turbulence. In fluid mechanics, a dimensionless parameter much smaller than unity usually indicates that the related physical process is negligible; usually a factor of 10 works very well. But the transition to turbulence happens at high Reynolds numbers (about 2000 in a straight tube). This number is far from "natural" from the point of view of the Navier-Stokes equations and seems related to fine geometrical details (though not as fine as some of the "fine tuning").

Amos: "a single real-valued number can contain infinite information (e.g., your entire DNA sequence is in the digits of pi). I think we could encode essentially anything into the digits."

Be careful with how you talk about "information". There's very little information in pi. I could write a program to generate the digits of pi and it would be just a few lines of code. Basically, you could compress pi to just a few bytes - showing it contains little original information. I'd just have to transmit a few bytes to effectively transmit all the infinite digits of pi.

Yes, it might be true that you could find all strings of symbols in the digits of pi. But the information is then represented by knowing what the starting digit would be. That might be an absolutely huge number, and it is in that huge number where the information lies - not in pi itself. You might as well just transmit the huge number and forget about pi.
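Andrew's compressibility claim can be made concrete: a program of a dozen or so lines generates pi to any desired precision, so pi's algorithmic information content really is just a few bytes. A minimal sketch using Machin's arctangent formula with fixed-point integer arithmetic (the choice of formula and the function name are mine, not from the thread):

```python
# Compute n digits of pi via Machin's formula,
#   pi = 16*arctan(1/5) - 4*arctan(1/239),
# using fixed-point integers with guard digits to absorb truncation error.
def pi_digits(n):
    scale = 10 ** (n + 10)  # fixed-point scale, 10 guard digits

    def arctan_inv(x):
        # arctan(1/x) = 1/x - 1/(3x^3) + 1/(5x^5) - ... in fixed point
        total, term, k, sign = 0, scale // x, 0, 1
        while term:
            total += sign * (term // (2 * k + 1))
            term //= x * x
            k += 1
            sign = -sign
        return total

    pi_scaled = 4 * (4 * arctan_inv(5) - arctan_inv(239))
    return str(pi_scaled)[:n]

print(pi_digits(10))  # prints "3141592653"
```

Transmitting this short program is equivalent to transmitting all the digits of pi at once, which is exactly the sense in which pi contains very little information.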

Maybe the "natural" is the prime numbers: 2, 3, 5, 7, 11, 13, 17, 19, 23...

Maybe the "natural" is Benford's law.

https://en.m.wikipedia.org/wiki/Benford%27s_law

Amos,

the theory exists; it is in Progress In Physics, titled "on quantization and the resonance paths". If you find some time to look, please let me know what you think of the real number (X/mu therein).

J.

Andrew wrote:

"There's very little information in pi."

Yes, that little parenthetical reference to pi was misleading. It was just intended to convey an idea of how much "room" there is in a real number to encode information, not to suggest that pi itself is a fully general real number - it obviously has a finite definition. That's why I said a real number can (not necessarily does) contain an infinite amount of information. The next sentence was "We could encode essentially anything into the digits of a real number", which is true. (You omitted the words "of a real number" and put a period after "digits", making it appear to be referring to pi, which I agree would not be correct.) The point is that a general real number is not an economically sparse entity, because it need not be finitely specifiable (and the set is not even countable), so claiming that a theory can be represented by a single real-valued constant is not as conceptually "natural" as one might think. Mathematical constructivists even argue that "the real numbers" don't "exist", and we can only talk meaningfully about things that can be finitely specified.

We might refine the definition of "naturalness" to mean that the laws of physics should only involve constants that are finitely specifiable, i.e., no general real-valued constant would be accepted as fundamental. For example, the number pi would be okay, because it's finitely specifiable (and this would cover all the geometrical factors, even in 100 dimensions etc. that we discussed earlier, because they are all finitely specifiable, despite being far from unity). But the mass ratios of particles (or the tension of a string) would not be accepted as fundamental unless the values could be finitely specified, rather than taken as general real-valued constants (which potentially entail infinite amounts of information).
