
Tuesday, July 09, 2013

The unshaven valley

[Image: A super model. Simple, beautiful, but not your reality.]
If you want to test quantum gravity, the physics of the early universe is promising, very promising. Back then, energy densities were high, curvature was large, and we can expect that effects were relevant which we’d never see in our laboratories. It is thus not so surprising that there exist many descriptions of the early universe within one or the other approach to quantized gravity. The intention is to test compatibility with existing data and, ideally, to make new predictions and arrive at new insights.

In a recent paper, Burgess, Cicoli and Quevedo contrasted a number of previously proposed string theory models for inflation with the new Planck data (arXiv:1306.3512 [hep-th]). They conclude that by and large most of these models are still compatible with the data because our observations seem to be fairly generic. In the trash bin goes everything that predicted large non-Gaussianities, and the jury is still out on the primordial tensor modes, because Planck hasn’t yet published the data. It’s the confrontation of models with observation that we’ve all been waiting for.

The Burgess et al paper is very readable if you are interested in string inflation models. It is valuable for pointing out difficulties with some of these approaches, which gives the reader a somewhat broader perspective than just data fitting. Interesting for a completely different reason is the introduction of the paper, with a subsection “Why consider such complicated models?” that is a forward defense against Occam’s razor. I want to spend some words on this.

Occam’s razor is the idea that from among several hypotheses with the same explanatory power the simplest one is the best, or at least the one that scientists should continue with. This sounds reasonable until you ask for definitions of the words “simple” and “explanatory power”.

“Simple” isn’t simple to define. In the hard sciences one may try to replace it with small computational complexity, but that neglects that scientists aren’t computers. What we regard as “simple” often depends on our education and familiarity with mathematical concepts. For example, you might find Maxwell’s equations much simpler when written with differential forms if you know how to deal with stars and wedges, but that’s really just cosmetics. Perceived simplicity also depends on what we find elegant, which is inevitably subjective. Most scientists tend to find whatever it is that they are working on simple and elegant.
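
For comparison, written out in vector calculus (in Heaviside-Lorentz units with c = 1, a choice made here just to keep the equations clean), Maxwell’s equations read

$$\nabla \cdot \vec{E} = \rho, \qquad \nabla \cdot \vec{B} = 0, \qquad \nabla \times \vec{E} = -\partial_t \vec{B}, \qquad \nabla \times \vec{B} = \vec{J} + \partial_t \vec{E},$$

while in the language of differential forms the same content collapses (up to sign conventions) to

$$\mathrm{d}F = 0, \qquad \mathrm{d}{\star}F = {\star}J.$$

Same physics, different bookkeeping.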

Replacing “simple” with the number of assumptions in most cases doesn’t help remove the ambiguity, because it just raises the question of what counts as a necessary assumption. Think of quantum mechanics. Do you really want to count all the assumptions about convergence properties of Hermitian operators on Hilbert spaces and so on that no physicist ever bothers with?

There’s one situation in which “simpler” seems to have an unambiguous meaning, namely when some assumptions are just entirely superfluous. This seems to be the case that Burgess et al are defending against, which brings us to the issue of explanatory power.

Explanatory power raises the question of what should be explained with that power. It’s one thing to come up with a model that describes existing data. It’s another thing entirely whether that model is satisfactory, which is again an inevitably subjective notion.

ΛCDM for example fits the available data just fine. For the theoretician however it’s a highly unsatisfactory model because we don’t have a microscopic explanation for what dark matter and dark energy are. Dark energy in particular comes with the well-known puzzles of why it’s small, non-zero, and became relevant just recently in the history of the universe. So if you want to shave model space, should you discard all models that make additional assumptions about dark matter and dark energy because a generic ΛCDM will do for fitting the data? Of course you shouldn’t. You should first ask what the model is supposed to explain. The whole debate about naturalness and elegance in particular hinges on the question of what requires an explanation.
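
To put the last of these puzzles in numbers: in ΛCDM the matter density dilutes with the scale factor $a$ as

$$\rho_m \propto a^{-3},$$

while the dark energy density $\rho_\Lambda$ stays constant, so their ratio

$$\frac{\rho_\Lambda}{\rho_m} \propto a^{3}$$

grows without bound. That this ratio happens to be of order one just now, and at no other epoch, is the coincidence that wants explaining.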

I would argue that models for dark energy and dark matter aim to explain more than the available data and thus should not be compared to ΛCDM in terms of explanatory power. These models, which add onto the structure of ΛCDM with “unnecessary” assumptions, are studied to make predictions for new data, so that experimentalists know what to look for. If new data comes in, then what requires an explanation can change from one day to the next. What was full of seemingly unnecessary assumptions yesterday might become the simplest model tomorrow. Theory doesn’t have to follow experiment. Sometimes it’s the other way round.

The situation with string inflation models isn’t so different. These models weren’t constructed with the purpose of being the simplest explanation for available data. They were constructed to study and better understand quantum effects in the early universe, and to see whether string theoretical approaches are consistent with observation. The answer is: yes, most of them were, and still are. It is true of course that there are simpler models that describe the data. But that leaves aside the whole motivation for looking for a theory of quantum gravity to begin with.

Now one might try to argue that a successful quantization of gravity should fulfill the requirement of simplicity. To begin with, that’s an unfounded expectation. There really is no reason why more fundamental theories should be simpler in any sense of the word. Yes, many people expect that a “theory of everything” will, for example, provide a neat and “simple” explanation for the masses of particles in the standard model and ideally also for the gauge groups and so on. They expect a theory of everything to make some presently ad-hoc assumptions unnecessary. But really, we don’t know that this has to be the case. Maybe it just isn’t so. Maybe quantum gravity is complicated and requires the introduction of 10^5 new parameters, who knows. After all, we already know that the universe isn’t as simple as it possibly could be just by virtue of existing.

But even if the fundamental theory that we are looking for is simple, this does not mean that phenomenological models on the path to this theory will be of increasing simplicity. In fact, we should expect them to be less simple by construction. The whole purpose of phenomenological models is to bridge the gap between what we know and the underlying fundamental theory that we are looking for. On both ends there’s parsimony. In between, there are approximations, unexplained parameter values, and inelegant ad-hoc assumptions.

Phenomenological models, which are typically not strictly derived from but motivated by some approach to quantum gravity, are developed with the explicit purpose of quantifying effects that have so far not been seen. This means they are not necessary to explain existing data. Their use is to identify promising new observables to look for, such as tensor modes or non-Gaussianity.
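
For concreteness, the two observables mentioned above are usually quantified by the tensor-to-scalar ratio $r$ and the local non-Gaussianity parameter $f_{\rm NL}$:

$$r = \frac{P_t(k)}{P_s(k)}, \qquad \Phi = \Phi_g + f_{\rm NL}\left(\Phi_g^2 - \langle \Phi_g^2 \rangle\right),$$

where $P_t$ and $P_s$ are the primordial tensor and scalar power spectra and $\Phi_g$ is the Gaussian part of the Bardeen potential $\Phi$.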

In other words, even if the fundamental theory is simple, we’ll most likely have to go through a valley of ugly, not-so-simple, unshaven attempts. Applying Occam’s razor would cut short these efforts and greatly hinder scientific progress.

It’s not that Occam’s razor has no use at all, just that one has to be aware it marks a fuzzy line, because scientists don’t normally agree on exactly what requires an explanation. For every model that offers a genuinely new way of thinking about an open question, there follow several hundred small variations of the original idea that add little or no new insight. Needless to say, this isn’t particularly conducive to progress. This bandwagon effect is greatly driven by present publication tactics and is largely a social phenomenon. Occam’s razor would be applicable here, but of course everybody will argue that their contribution adds large explanatory value, and we might be better off erring on the unshaven side.

If a ball rolls in front of your car, the simplest explanation for your observation, the one with the minimal set of assumptions, is that there’s a ball rolling. From your observation of it rolling you can make a fairly accurate prediction of where it’s going. But you’ll probably brake even if you are sure you’ll miss the ball. That’s because you construct a model for where the ball came from and anticipate new data. The situation isn’t so different for string inflation models. True, you don’t need them to explain the ball rolling; the Planck data can be fitted by simpler models. But they are possible answers to the question of where the ball came from and what else we should watch out for.

In summary: Occam’s razor isn’t always helpful to scientific progress. To find a fundamentally simple theory, we might have to pass through stages of inelegant models that point us in the right direction.

47 comments:

  1. Hi Bee,

    Interesting take, and yet given his proviso I think you and Albert still might have agreed that there is a difference to be found between a destination and the route required to get there.

    "It can scarcely be denied that the supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible without having to surrender the adequate representation of a single datum of experience"

    - Albert Einstein, “On the Method of Theoretical Physics,” p. 9

    Regards,

    Phil

  2. A second pro-string-theory post in a row.

    Whoever took the body of Sabine Hossenfelder should leave it at once!

  3. It's a pro-phenomenology post really, not specifically about string theory. Had the paper been about Loop Cosmology, the same thing could be said.

  4. and requires the introduction of 10^5 new parameters,

    Hello Bee,
    a Theory with such high goals has to have exactly 42 parameters
    :=)
    Georg

  5. "To find a fundamentally simple theory, we might have to pass through stages of inelegant models that point us into the right direction".

    No, to find a fundamentally simple theory you have to start from the fundamentals. See comment 4 here:

    http://physicsworld.com/cws/article/news/2012/nov/06/highly-charged-ions-could-make-better-atomic-clock#

  6. Even if theoretical by design it still has to match up with what has been gained in terms of the information?

    Forthcoming searches for B-modes, non-Gaussianity and new particles should be decisive. See: Inflationary paradigm in trouble after Planck2013

    Thus, any dark matter model must provide some information about what is making the universe speed up, so there has to be something that can be considered useful?

  7. The successes of inflationary models hinge on the interplay between quantum effects (e.g. the source of primordial fluctuations) with classical gravity. Quantum fluctuations are by design large enough to be observable, and the non-trivial interplay between quantum and gravitational dynamics pushes the envelope of the quantum-gravity frontier. See: String Inflation After Planck 2013

  8. a higher dimensional version of the Pringle's potato chip. Brian Greene, The Fabric of the Cosmos, pg 483, Para 2, line 29

    Solving quantum field theories via curved spacetimes, by Igor R. Klebanov and Juan M. Maldacena: http://scitation.aip.org/getpdf/servlet/GetPDFServlet?filetype=pdf&id=PHTOAD000062000001000028000001&idtype=cvips

  9. Fashion is metaphors. Mathematics is contingencies. Science is observation, the weakest of the three for being unpleasantly factual. Start with empirical truth, then model. Brans-Dicke scalar-tensor gravitation theories are dead re Physics Today 66(7) 14-17 (2013) (July 2013 issue) and arxiv:1304.6875. "God is a geometer," Plato. Let's assault physics' bikini line by testing it.

  10. That is a very nice article again :-)

    If I am curious to learn a bit about quantum gravity inspired models, for example how they implement dark energy, inflation, dark matter etc., is the paper in this article a good start?

  11. Interesting post. One of the problems with plain old theoretical simplicity is that it seems to have an expiration date. It only lasts until your observations reach into new areas and you start experiencing anomalies (in Thomas Kuhn's sense).

    These are really exciting times in Galactic Astronomy. Simulations of galaxy development are producing some really impressive results. The models are powerful enough now that they can help conceptually: "insight through modelling". These models generally presume Lambda CDM cosmology. At the same time, one area in modeling is not going well: It's been very difficult to explain the structure and evolution of dwarf galaxies. There are observational problems as well: there should be a lot more dwarf galaxies around if Lambda CDM is right.
    A number of authors are now starting to play with the idea that there are at least two kinds of dark matter: the cold stuff, and some admixture of warm dark matter. Cold + warm produces dwarfs right, and makes the big stuff work too.

    So which is simpler: Cold, or Cold plus warm? 10 years ago, maybe the first. But now?


  12. Perhaps, none of the above will turn out to be the right answer re dark matter.

    Did you see the Science paper on the mysterious millisecond radio bursts, coming from considerable distances, isotropically, at about 10,000 per day? What if only a tiny fraction of these sources burst on a given day?

    Stellar-mass black holes?

    Hmmmm.

  13. "ΛCDM for example fits the available data just fine. For the theoretician however it’s a highly unsatisfactory model because we don’t have a microscopic explanation for what is dark matter and dark energy. Dark energy in particular comes with the well-known puzzles of why it’s small, non-zero, and BECAME RELEVANT JUST RECENTLY IN THE HISTORY OF THE UNIVERSE "

    Bee, do you care to elaborate on that last part? Seems to me that it’s likely dark energy has always been relevant, which is quite different from the fact that it has only recently been discovered.

    Also, just as time slows down with positive curvature (I hope I’m getting this right), time should also speed up with negative curvature. Just using that simple logic would mean that if a conscious being was suitably scaled and lived at the beginning of the universe, it would have implications. He/she would use a telescope and, looking outward, would see everything expanding. But it would also be at a slow rate, because just as space was compacted, so was time compacted.

    To that person there would be a similar very small observational acceleration even though today those same events look like they were undergoing high acceleration. It makes very much sense to assume there is a form of invariance between time and space at different points in the expansion of the cosmos.

    You could think of it as a kind of symmetry.

  14. Nemo,

    If you're interested in string inflation models in particular, then the paper mentioned in the above post seems like a good starting point, yes (it's not a topic I am very familiar with myself). If you want to get a somewhat broader perspective, I suggest you have a look at section 3.3 of this paper and follow the references therein. Best,

    B.

  15. Eric,

    I was referring to what is known as the "coincidence problem" while trying to avoid using the word. I wrote about the CC problems in more detail here. In a nutshell the question is why the CC is comparable to the average energy density of matter *today*, because the CC is constant but the energy density of matter isn't. At any other epoch in the history of the universe, they wouldn't be the same size. Is this a coincidence or does it tell us something deeper? Best,

    B.

  16. Interesting post, Bee. However, it's not all that easy to get past Occam's Razor. What makes it difficult for most scientists to grasp is the fact that it's really not a scientific principle at all, but an essential part of the philosophy on which science is based.

    The basics are explained in a particularly clear online discussion by Francis Heylighen (http://pespmc1.vub.ac.be/OCCAMRAZ.html):

    "[Occam's Razor] underlies all scientific modelling and theory building. It admonishes us to choose from a set of otherwise equivalent models of a given phenomenon the simplest one. . .

    Though the principle may seem rather trivial, it is essential for model building because of what is known as the "underdetermination of theories by data". For a given set of observations or data, there is always an infinite number of possible models explaining those same data. This is because a model normally represents an infinite number of possible cases, of which the observed cases are only a finite subset. The non-observed cases are inferred by postulating general rules covering both actual and potential observations."

    In other words, we are stuck with Occam whether we like it or not, because it's a basic epistemological principle, necessary to science, due to the "underdetermination of theories by data."

    If one is bothered by the problem of determining what is simple and what is not, then one needs to make a definition of simplicity part of one's thinking.

  17. Bee, good point. What I was trying to say was that it’s not so easy to define the word "Constant". Is it something that is immutable or is it something like a ratio of two or more things in which the ratio always stays the same, no matter what?

    I think scientists would do well to start thinking about this. If time flow really does dilate as space expands and energy density decreases, then the markers for every measurement there is will change. Even the standard idea that during the inflation epoch space expanded faster than light would have to be revised.

    Einstein showed in SR that time can only be measured by events in relation to standard yardsticks (metersticks). It may be that events, like the absorption of photons on the eyeballs of an observer that is counting out distance markers, occur many orders of magnitude faster in those earlier epochs when energy was denser than now.

    This isn't really a stretch at all. And it all goes back to Occam's razor. Things seem much more complex and messy when you miss the hidden link right in front of you. It's very tempting to give up and throw out Occam's razor rather than accept that you are missing something.



  18. Great post. I would add another confusion concerning simplicity - electroweak gauge theory may provide a very nice unification of two separate theories, but it is more, not less, complex than the distinct theories of the em or weak interaction.
    In fact, many modern theories in physics have become ever more complex, yet successful, because they manage to describe ever more phenomena in terms of fewer independent parameters...

  19. DATA! Explanation! Contradiction! The preceding mass ranges are impossibly low, re Ting/AMS and dark matter self-annihilation.

    http://arxiv.org/abs/1306.3512 "The problem with making predictions is that people test them." One suspects "simplicity" is not the fundamental problem.

  20. It sounds to me, Bee, that what you seem to have in mind is close to Peirce's principle of "abduction."


    Abduction has been described as "the stage of inquiry in which we try to generate theories which may then later be assessed." At this stage, we are not interested in deciding whether or not our theory is the simplest or most elegant, but we are interested in exploring the terrain in all its complexity.

    By concerning yourself with Occam's razor you might be jumping the gun, worrying too much about whether your thinking will ultimately resolve all the various elements into a single simple concept. That sort of evaluation comes later, when the theory has matured. While it is in development, the proper guide would, I think, be abduction.

  21. /*“Simple” isn’t simple to define*/

    Occam's razor has already defined it. The simplest theory is the one which provides the highest number of testable predictions with the least amount of postulates and ad-hoc constants.

    BTW all predictions based on a fringe model, like Big Bang cosmology, are fringe as well.

  22. "They expect a theory of everything to make some presently ad-hoc assumptions unnecessary. But really, we don’t know that this has to be the case. Maybe it just isn’t so. Maybe quantum gravity is complicated and requires the introduction of 105 new parameters, who knows. After all, we already know that the universe isn’t as simple as it possibly could be just by virtue of existing."

    Then it's not a TOE. A TOE should not have any free dimensionless parameters...

    Everything should be computable, e.g. the string coupling is determined by the VEV of the dilaton.

    If this is not the case you need another layer to explain these values.

    For example, a GUT is not fundamental; you need another layer to explain the GUT unification scale of the couplings.

  23. 11 July 2013: "APS Introduces Physical Review Applied.

    The latest member of the Physical Review family is Physical Review Applied, a new journal dedicated to the rapidly growing field of applied physics." "ACK! THBBFT!"

  24. I am not sure if Ockham's razor had to be mentioned at all in the paper of Burgess et al. It is a heuristic tool that *can* be used during the development of theories. However, it is not at all a validator of theories.

    The opposite principle has been in the world since the times of Ockham too. Walter of Chatton (1287–1347): "If three things are not enough to verify an affirmative proposition about things, a fourth must be added, and so on."

    If one collects a bunch of theories that are valid according to some experimental data, Ockham's razor will not be useful to select which of these is valid or the best choice for further research. None of them can be falsified at this point with respect to the reality and the experimental data they model.

    The blog article gives good examples of the need for Chatton's approach, and of how it is necessary to add more assumptions that will in their turn put up new requirements for observations. As written in the blog article, the way to the simple truth may lead through some complex theories. Ockham's razor is then no more than what one does in everyday life: solve simple problems first.

  25. Quite off-topic, please indulge.

    The world which is quantum appears classical to us. The billiard ball is described classically presumably because myriads of interactions keep carrying away quantum phase information. Your wi-fi signal can be described classically not because interactions are carrying away phase information, but presumably merely because of the extremely large number of quanta that comprise it.

    Two questions:-
    1. Is there a third way of being classical?

    2. Space-time appears classical because....?

    Thanks in advance!

  26. Arun my two cents:

    In the context of quantum cosmology (via the Wheeler-DeWitt equation and all that), classical space-time emerges as a result of decoherence due to environmental dof.

    Perturbatively (splitting the metric into flat Minkowski + perturbation), any deviation from flat space-time can be understood as a coherent state of gravitons, much like an electromagnetic wave is a coherent state of photons.
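
    In formulas, the split is

    $$g_{\mu\nu} = \eta_{\mu\nu} + h_{\mu\nu},$$

    and upon quantizing the perturbation $h_{\mu\nu}$, a classical gravitational field corresponds to a coherent state of its quanta, the gravitons.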

  27. Arun,

    Yes, as Giotis says. I think Claus Kiefer wrote some papers on that. Also, quantum effects of gravity are weak, so the classical approximation is good, very good, for all practical purposes. A third way of "being classical"? Well, the examples you name aren't "being classical"; they are just well approximated by a classical description for one or the other reason. In that sense you're "being classical" whenever quantum effects are negligible. E.g. the gravitational field of planet earth is pretty much classical. Not because there are no quantum corrections, but because these are tiny. Best,

    B.

  28. /*To find a fundamentally simple theory, we might have to pass through stages of inelegant models that point us in the right direction..*/

    This is a parasitic stance, solely driven by the expectation of maximal occupation of the scientists involved, as Robert Wilson recognized and named many years ago. The contemporary scientists simply ignore more effective theories for the same reason they ignore magnetic motors and/or cold fusion findings: it helps them to prolong their ineffective research and jobs as much as possible. Every closed sectarian community without public feedback will spontaneously adjust the rules of its own existence like a selfish meme, i.e. in the way which suits this community best, not the people who are paying for it. It applies to the community of theoretical physics as well.

  29. Hi Giotis,

    Regarding your earlier comment "A TOE should not have any free dimensionless parameters..."

    A TOE should be able to describe everything we observe, at least in principle. If it necessitates 200 parameters, so be it. Of course you can always hope that maybe there's some principle that explains all these 200 parameters, but a) why bother if you've already explained everything in the universe around you anyway and b) why should we expect this? Best,

    B.

  30. Zephir,

    All you seem to be doing lately is posting ill-informed and ignorant insults of scientists whose work you demonstrably don't understand. You're stretching my patience. I'm very close to putting you on the blacklist, in which case all of your comments will be deleted regardless of content. Please try to make your comments more useful, constructive and on-topic. That's the first and final warning. Best,

    B.

  31. /* You're stretching my patience. I'm very close you putting you on the blacklist */

    I'm of course aware of it, but the truth is more important for me. Because the truth is always constructive. For example, we can just ask why theorists ignore the fifty-year-old theory of Burkhard Heim, which is able to predict the whole mass spectrum of most particles with high precision with only six or seven parameters, whereas the Standard Model isn't able to predict anything like this even with 26+ parameters. Occam's razor clearly says that Heim's theory is a more effective theory than the Standard Model, and if you really seek effective ways for understanding reality, you should follow the more effective theories, not the ineffective ones.

    So, what's your personal reason for ignoring a theory which can compute/predict something, in favor of research on models which don't lead to any specific predictions?

  32. /*quantum effects of gravity are weak*/

    How did you come to that? The quantum effects of gravity manifest themselves in all phenomena existing between the quantum mechanics and general relativity scales: I mean all these Casimir forces, dipole forces, everyday forces belong to it. Why are you removing them from the subject of quantum gravity theory? What do you expect to achieve with such reductionism?

  33. Zephir,

    I don't work on grand unification, I never have, and as I have expressed many times on this blog, I think that the whole idea of a theory of everything is scientifically questionable and a waste of time. As I said to Giotis above, it doesn't bother me at all if the masses of particles are just parameters and have no further explanation. I'm not looking for truth, I'm looking for a useful model that describes observation. Does that answer your question why I don't spend time on every attempt to fit particle masses with some limited number of parameters that has ever been put forward? I hope it does. Best,

    B.

  34. /*..I don't work on grand unification..*/

    If you're working on quantum gravity, then you're dealing with it. Or we can just ask what makes quantum gravity so attractive in your eyes... I presume it's just the idea of unifying general relativity with quantum mechanics. Does some bigger unification exist?


    /*it doesn't bother me at all if the masses of particles are just parameters and have no further explanation..*/

    Why doesn't it bother you? This is the whole story of quantum gravity theories: to predict something experimentally testable. Of course, there are many other parameters (lifetimes of particles for example), but Heim's theory can calculate them too. The whole trick is that Heim realized how space-time gets compacted/packed at the quantum scale.

  35. Zephir, do yourself a favor and look up grand unification on wikipedia before you produce further ill-informed blather. Thanks,
    B.

  36. I see, so we adhere to our semi-arrogant, semi-injured stance to the end of our lives? BTW Heim didn't develop a Grand Unification theory either. He just made testable predictions, one after another. It was his way.

    The point of your article was that we should learn from unsuccessful theories to make them better. But how about learning from successful ones? Here we face a psychological barrier, because everyone wants to follow the less successful but still private route, rather than the successful but less original one. As long as someone is willing to pay for it, indeed.

  37. Giotis, Bee:

    One reason I asked my question is that I'm having trouble understanding what the environmental degrees of freedom are that decohere a space-time. If there are only gravitational degrees of freedom how does it work? Is graviton-graviton scattering sufficient to decohere spacetime? Gravity-matter interactions often aren't enough to decohere the quantum behavior of matter, else we would not observe quantum behavior on earth.


    About being classical - I think physics is quantum, and "classical" is appearance only. It is obviously true that physics is classical when quantum effects are small, but I think that kind of thinking is what has us stuck in a rut. I'd rather think of the world as quantum with various processes working to disguise the quantum effects, rather than "classical except when quantum effects are significant".

    So, decoherence is one mechanism that disguises quantum effects; large number of quanta even without decoherence can lead to a non-quantum description being usable; are there any other mechanisms that lead to a non-quantum description being usable?

    Perhaps my questions do not make sense. Understanding that would also be useful, if you would indulge me.

    Thanks!
    -Arun



  38. At first, I wondered about the supermodel photo, then I read "energy densities were high, curvature was large". OK, at least a non-anorexic supermodel. Claudia Schiffer, maybe. I won't even mention the no-hair theorem. :-)

    I've been waiting a long time to post a link to this photo: a famous actress with hair in a photo which is not that old: Penelope Cruz.


  39. Hi Sabine,

    you wrote:


    "A TOE should be able to describe everything we observe, at least in principle. If it necessitates 200 parameters, so be it. Of course you can always hope that maybe there's some principle that explains all these 200 parameters, but a) why bother if you've already explained everything in the universe around you anyway and b) why should we expect this?"

    I'm not sure I understand...

    If a TOE has free undetermined dimensionless parameters (like a coupling) i.e. parameters that are not determined dynamically within the theory, then it has to take their values as input from experiment (at a certain scale) in order to make any predictions.

    Then this TOE does not explain everything; it does not explain the value of the parameters in particular.

    The purpose of a theory is to explain experiment.

  40. Arun, you usually take as environmental dof certain fluctuations of various fields (including the gravitational).

    Decoherence in a measurement process gives the appearance of a classical world (i.e. no superposition is observed macroscopically). QM is still there but in a way it is diluted in the environment and becomes unreachable/unobservable for the system you study/measure. You have to integrate out all these small correlations with the environmental dof and thus the system you study *appears* classical.
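
    Formally, what gets integrated out is captured by the reduced density matrix

    $$\rho_S = \mathrm{Tr}_E\,\rho,$$

    whose off-diagonal elements in the basis selected by the interaction are suppressed by the trace over the environmental dof; that suppression is why no macroscopic superposition is observed.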

    What Sabine said about QM corrections is obviously correct. I'm not sure why you disagree.

    Everything is quantum mechanical but appears classical in certain limits. Take the Heisenberg uncertainty principle for example.

    If you throw a ball with big mass then you still have the QM fuzziness in the trajectory but the corrections due to the uncertainty principle are negligible and unobservable. So the trajectory *appears* classical.
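
    For scale (taking, just for illustration, a ball of mass $m = 0.1\,\mathrm{kg}$ localized to $\Delta x = 1\,\mathrm{mm}$), the uncertainty principle only requires

    $$\Delta v \gtrsim \frac{\hbar}{2 m \Delta x} \approx \frac{1.05\times 10^{-34}\,\mathrm{J\,s}}{2 \times 0.1\,\mathrm{kg} \times 10^{-3}\,\mathrm{m}} \approx 5\times 10^{-31}\,\mathrm{m/s},$$

    many orders of magnitude below anything measurable.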

  41. Giotis,

    The purpose of a theory is to make predictions. It's perfectly fine to measure some parameters first and then use these to make predictions. I think that the type of self-explanatory theory that you seem to have in mind doesn't exist. Best,

    B.

  42. It's fine for a theory but not for a Theory of Everything.

    The existence of undetermined dimensionless parameters signals that your theory is not complete; you can adjust it and thus your theory is not the final theory of everything.

    This is self evident. I'm not sure why you are arguing this.

    Of course such a theory exists. String theory does not have free dimensionless parameters, as I mentioned earlier.

    The string coupling is determined by the dynamics of a field, the dilaton field.

  43. @Sabine,

    for a better understanding with respect to your comment

    "The purpose of a theory is to make predictions..."

    I think the widespread understanding of what a theory is matches the explanation in the English Wikipedia (https://en.wikipedia.org/wiki/Scientific_theory):

    "A scientific theory is a well-substantiated explanation of some aspect of the natural world, based on a body of knowledge that has been repeatedly confirmed through observation and experimentation. Scientists create scientific theories from hypotheses [...] scientific theories are inductive in nature [...]"

    With that in mind, I do not understand what you mean by the remark that a theory that uses measured parameters for its predictions is self-explanatory.


    @Giotis,

    I agree that a TOE that uses assumptions or unexplained parameters does not explain everything.

    I am very convinced that such a theory does not exist. Nevertheless, I do not negate that ongoing efforts (GUT, TOE) will produce results, and I see the search even as necessary work. But I am convinced that at the end of this current search we will see the need for the next GGUT or TOTOE to incorporate phenomena that we do not know of or expect as of now.

  44. Hi Giotis,

    Well, I think we're just fighting about the meaning of a word. If you don't want to call something that explains all observations a theory of everything, I don't see the point in arguing with you about that. String theory isn't self-explanatory either. Why strings? Why anything to begin with? To justify that, you have to draw upon observation already. And once you're drawing on observation you have measurements of parameters. Best,

    B.

  45. Hi Michael,

    "I do not understand what you mean with the remark that a theory that uses measured parameters for its predictions is self-explanatory."

    I didn't say that. What I said is that I strongly doubt a self-explanatory theory of everything exists. You always need reference to observation for your theories. As I said, science is all about describing nature. Best,

    B.

  46. Instead of a response I will quote some verses as a tribute to the Theory of Everything.

    The first pessimistic, the other optimistic :-)

    "There is no chance at all. We are all trapped by a singular fate. No one ever finds the one."

    Charles Bukowski


    "...and I will be able now to possess the truth within one body and one soul."

    'A Season in Hell' by Arthur Rimbaud

  47. "Einstein showed in SR that time can only be measured by events in relation to standard yardsticks (metersticks). It may be that events, like the absorbion of photons on the eyeballs of an observer that is counting out distance markers, occur many orders of magnitude FASTER in those earlier epochs when energy was denser than now."

    Change FASTER to MORE FREQUENTLY. I've learned that if someone can misinterpret something due to imprecision of language, then someone will. I'm not arguing for violation of Lorentz Invariance. Never have, and probably never will.

