Wednesday, February 07, 2018

Which problems make good research problems?

[Image: mini-problem: “What number fits into the last triangle?”]
Scientists solve problems; that’s their job. But which problems are promising topics of research? This is the question I set out to answer in Lost in Math, at least concerning the foundations of physics.

A first, rough, classification of research problems can be made using Thomas Kuhn’s cycle of scientific theories. Kuhn’s cycle consists of a phase of “normal science” followed by “crisis” leading to a paradigm change, after which a new phase of “normal science” begins. This grossly oversimplifies reality, but it will be good enough for what follows.

Normal Problems

During the phase of normal science, research questions usually can be phrased as “How do we measure this?” (for the experimentalists) or “How do we calculate this?” (for the theorists).

The Kuhn Cycle.
[Img Src: thwink.org]
In the foundations of physics, we have a lot of these “normal problems.” For the experimentalists it’s because the low-hanging fruits have been picked and measuring anything new becomes increasingly challenging. For the theorists it’s because in physics predictions don’t just fall out of hypotheses. We often need many steps of argumentation and lengthy calculations to derive quantitative consequences from a theory’s premises.

A good example of a normal problem in the foundations of physics is cold dark matter. The hypothesis is easy enough: There’s some cold, dark stuff in the cosmos that behaves like a fluid and interacts weakly both with itself and with other matter. But that by itself isn’t a useful prediction. A concrete research problem would instead be: “What is the effect of cold dark matter on the temperature fluctuations of the cosmic microwave background?” And then comes the experimental question: “How can we measure this?”
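
To give a flavor of what working on such a problem looks like on the theory side, here is a minimal sketch using the publicly available CAMB Boltzmann code (assuming its Python package is installed; the parameter values are illustrative, roughly Planck-like assumptions on my part):

```python
import camb

# Sketch: how the CMB temperature power spectrum responds to the amount
# of cold dark matter. All numerical values here are illustrative.
def tt_spectrum(omch2):
    pars = camb.CAMBparams()
    pars.set_cosmology(H0=67.5, ombh2=0.0224, omch2=omch2)
    pars.set_for_lmax(2500)
    results = camb.get_results(pars)
    # Column 0 of the 'total' spectra is TT, as l(l+1)C_l/(2 pi) in muK^2.
    return results.get_cmb_power_spectra(pars, CMB_unit='muK')['total'][:, 0]

fiducial = tt_spectrum(0.120)  # a standard cold-dark-matter density
less_cdm = tt_spectrum(0.060)  # halving it visibly shifts the acoustic peaks
```

Comparing the two curves is, in miniature, how one turns the hypothesis “there is cold dark matter” into a quantitative prediction that experiments can then test.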

Other problems of this type in the foundations of physics are “What is the gravitational contribution to the magnetic moment of the muon?,” or “What is the photon background for proton scattering at the Large Hadron Collider?”

Answering such normal problems expands our understanding of existing theories. These are calculations that can be done within the frameworks we have, but the calculations can still be challenging.

The examples in the previous paragraphs are solved problems, or at least problems that we know how to solve, though you can always ask for higher precision. But we also have unsolved problems in this category.

The quantum theory of the strong nuclear force, for example, should largely predict the masses of particles that are composed of several quarks, like neutrons, protons, and other similar (but unstable) composites. Such calculations, however, are hideously difficult. They are today made by use of sophisticated computer code – “lattice calculations” – and even so the predictions aren’t all that great. A related question is how nuclear matter behaves in the cores of neutron stars.

These are but some randomly picked examples of the many open questions in physics that are “normal problems,” believed to be answerable with the theories we already know, but I think they serve to illustrate the case.

Looking beyond the foundations, we have normal problems like predicting the solar cycle and solar weather – difficult because the system is highly nonlinear and partly turbulent, but nothing that we expect to be in conflict with existing theories. Then there is high-temperature superconductivity, a well-studied but theoretically not well-understood phenomenon, due to the lack of quasi-particles in such materials. And so on.

So these are the problems we study when business goes as normal. But then there are problems that can potentially change paradigms, problems that signal a “crisis” in the Kuhnian terminology.

Crisis Problems

The obvious crisis problems are observations that cannot be explained with the known theories.

I do not count most of the observations attributed to dark matter and dark energy as crisis problems. That’s because most of this data can be explained well enough by just adding two new contributions to the universe’s energy budget. You will undoubtedly complain that this does not give us a microscopic description, but there’s no data for the microscopic structure either, so no problem to pinpoint.
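
To make “adding two new contributions” concrete: in the Friedmann equation that governs the expansion rate of the universe, cold dark matter enters as just one more matter density and dark energy as a constant term, and for most of the data that is all the fits require:

```latex
H^2 = \frac{8\pi G}{3}\left(\rho_{\rm baryons} + \rho_{\rm radiation}
      + \rho_{\rm cdm}\right) + \frac{\Lambda c^2}{3} - \frac{k c^2}{a^2}
```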

But some dark matter observations really are “crisis problems.” These are unexplained correlations, regularities in galaxies that are hard to come by with cold dark matter, such as the Tully-Fisher relation or the strange ability of dark matter to seemingly track the distribution of matter. There is as yet no satisfactory explanation for these observations using the known theories. Modifying gravity successfully explains some of it, but that brings other problems. So here is a crisis! And it’s a good crisis, I dare say, because we have data, and that data is getting better by the day.
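
To name one such regularity explicitly: the baryonic Tully-Fisher relation ties the mass in normal (baryonic) matter of a disk galaxy to the asymptotic velocity of its rotation curve by a remarkably tight power law, roughly

```latex
M_{\rm baryonic} \propto v_{\rm flat}^{\,4}
```

The puzzle is that the rotation velocity is set by the total gravitating mass, dark matter included, and yet it correlates this cleanly with the baryons alone.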

This isn’t the only good observational crisis problem we presently have in the foundations of physics. One of the oldest ones, but still alive and kicking, is the magnetic moment of the muon. Here we have a long-standing mismatch between theoretical prediction and measurement that has still not been resolved. Many theorists take this as an indication that this cannot be explained with the standard model and a new, better, theory is needed.

A couple more such problems exist, or maybe I should say persist. The DAMA measurements, for example. DAMA is an experiment that searches for dark matter. They have been getting a signal of unknown origin with an annual modulation, and have kept track of it for more than a decade. The signal is clearly there, but if it were dark matter, that would conflict with other experimental results. So DAMA sees something, but no one knows what it is.

There is also the still-perplexing LSND data on neutrino oscillation that doesn’t want to agree with any other global parameter fit. Then there is the strange discrepancy in the measurement results for the proton radius using two different methods, and a similar story for the lifetime of the neutron. And there are the recent tensions in the measurement of the Hubble rate using different methods, which may or may not be something to worry about.

Of course each of these data anomalies might have a “normal” explanation in the end. It could be a systematic measurement error or a mistake in a calculation or an overlooked additional contribution. But maybe, just maybe, there’s more to it.

So that’s one type of “crisis problem” – a conflict between theory and observations. But besides these there is an utterly different type of crisis problem, which is entirely on the side of theory-development. These are problems of internal consistency.

A problem of internal consistency occurs if you have a theory that predicts conflicting, ambiguous, or just nonsensical observations. A typical example of this would be probabilities that become larger than one, which is inconsistent with a probabilistic interpretation. Indeed, this problem was the reason physicists were very certain the LHC would see some new physics. They couldn’t know it would be the Higgs, and it could have been something else – like an unexpected change to the weak nuclear force – but the Higgs it was. It was restoring internal consistency that led to this successful prediction.
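
This consistency argument can be made quantitative. Without the Higgs (or some substitute for it), the tree-level amplitude for the scattering of longitudinally polarized W-bosons grows with the center-of-mass energy,

```latex
\mathcal{A}\left(W_L W_L \to W_L W_L\right) \sim \frac{g^2 E^2}{m_W^2}
```

which violates perturbative unitarity (loosely speaking, probabilities exceed one) at energies around a TeV, squarely in the range the LHC was built to probe. Something had to appear there to cut off the growth, and that something turned out to be the Higgs.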

Historically, studying problems of consistency has led to many stunning breakthroughs.

The “UV catastrophe,” in which a thermal source emits an infinite amount of light at small wavelengths, is such a problem. Clearly that’s not consistent with a meaningful physical theory in which observable quantities should be finite. (Note, though, that this is a conflict with an assumption. Mathematically, there is nothing wrong with infinity.) Planck solved this problem, and the solution eventually led to the development of quantum mechanics.
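
In formulas: the classical Rayleigh-Jeans law predicts a spectral energy density that grows without bound at high frequencies, so the total energy density diverges, while Planck’s law removes the divergence:

```latex
u_{\rm RJ}(\nu,T) = \frac{8\pi\nu^2}{c^3}\,k_B T
\quad\longrightarrow\quad \int_0^\infty u_{\rm RJ}\,d\nu = \infty,
\qquad
u_{\rm Planck}(\nu,T) = \frac{8\pi h\nu^3}{c^3}\,
\frac{1}{e^{h\nu/k_B T}-1}
\quad\longrightarrow\quad \int_0^\infty u_{\rm Planck}\,d\nu
= \frac{8\pi^5 (k_B T)^4}{15\,(hc)^3}
```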

Another famous problem of consistency is that Newtonian mechanics was not compatible with the space-time symmetries of electrodynamics. Einstein resolved this disagreement, and got special relativity. Dirac later resolved the contradiction between quantum mechanics and special relativity which, eventually, gave rise to quantum field theory. Einstein further removed contradictions between special relativity and Newtonian gravity, getting general relativity.

All these have been well-defined, concrete, problems.

But most theoretical problems in the foundations of physics today are not of this sort. Yes, it would be nice if the three forces of the standard model could be unified into one. It would be nice, but it’s not necessary for consistency. Yes, it would be nice if the universe were supersymmetric. But it’s not necessary for consistency. Yes, it would be nice if we could explain why the Higgs mass is not technically natural. (A parameter is technically natural if setting it to zero increases the symmetry of the theory, which protects it from large quantum corrections.) But it’s not inconsistent if the Higgs mass is just what it is.

It is well documented that Einstein and even more so Dirac were guided by the beauty of their theories. Dirac in particular was fond of praising the use of mathematical elegance in theory-development. Their personal motivation, however, is only of secondary interest. In hindsight, the reason they succeeded was that they were working on good problems to begin with.

There are only a few real theory-problems in the foundations of physics today, but they exist. One is the missing quantization of gravity. Just lumping the standard model together with general relativity doesn’t work mathematically, and we don’t know how to do it properly.

Another serious problem with the standard model alone is the Landau pole in one of the coupling constants. That means that the strength of one of the forces becomes infinitely large. This is non-physical for the same reason the UV catastrophe was, so something must happen there. This problem has received little attention because most theorists presently believe that the standard model becomes unified long before the Landau pole is reached, making the extrapolation redundant.
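
To see where the pole comes from: at one loop, a coupling with a positive beta-function coefficient b runs with the energy scale μ as

```latex
\alpha(\mu) = \frac{\alpha(\mu_0)}{1 - \frac{b}{2\pi}\,\alpha(\mu_0)\ln(\mu/\mu_0)}
```

so it formally diverges at the finite scale μ = μ₀ exp[2π/(b α(μ₀))], no matter how small the coupling starts out. For the standard model’s hypercharge coupling, this scale lies far above the Planck energy.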

And then there are some cases in which it’s not clear what type of problem we’re dealing with. The non-convergence of the perturbative expansion is one of these. Maybe it’s just a question of developing better math, or maybe there’s something we get really wrong about quantum field theory. The case is similar for Haag’s theorem. Also the measurement problem in quantum mechanics I find hard to classify. Appealing to a macroscopic process in the theory’s axioms isn’t compatible with the reductionist ideal, but then again that is not a fundamental problem, but a conceptual worry. So I’m torn about this one.
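
To spell out the first of these cases: perturbative quantum field theory computes observables as power series in the coupling, but the coefficients typically grow factorially,

```latex
F(\alpha) \sim \sum_{n=0}^{\infty} c_n\,\alpha^n,
\qquad c_n \sim n! \quad \text{at large } n
```

so the series has zero radius of convergence and is at best asymptotic. Dyson’s classic argument makes this plausible for QED: if the series converged for some small positive coupling, it would also converge for a small negative one, where like charges attract and the vacuum is unstable, so no well-behaved expansion should exist there.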

But as far as crisis problems in theory development are concerned, the lesson from the history of physics is clear: Problems are promising research topics if they really are problems, which means you must be able to formulate a mathematical disagreement. If, in contrast, the supposed problem is that you simply do not like a particular aspect of a theory, chances are you will just waste your time.



Homework assignment: Convince yourself that the mini-problem shown in the top image is mathematically ill-posed unless you appeal to Occam’s razor.
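
For the numerically inclined, here is a small sketch of the ill-posedness argument in Python. The corner values are made up (the image itself is not reproduced here), but the point is generic: any three example triangles admit both a product rule and a purely linear rule that agree on all the given examples yet disagree on the next triangle.

```python
import numpy as np

# Hypothetical corner values (a, b, c) for the three example triangles.
corners = np.array([[2.0, 4.0, 3.0],
                    [3.0, 3.0, 4.0],
                    [1.0, 6.0, 1.0]])

# Suppose the "intended" rule for the central number is a*b - c:
centers = corners[:, 0] * corners[:, 1] - corners[:, 2]

# Any three examples also admit a linear rule alpha*a + beta*b + gamma*c,
# found by solving a 3x3 linear system:
alpha, beta, gamma = np.linalg.solve(corners, centers)

# Both rules reproduce all three examples, yet they predict different
# central numbers for a new, fourth triangle:
a, b, c = 5.0, 2.0, 3.0
print("product rule:", a * b - c)                         # 7.0
print("linear rule :", alpha * a + beta * b + gamma * c)  # about 3.57
```

Without an extra assumption, say Occam’s razor plus a taste for which rule counts as “simpler,” the data alone do not single out an answer.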

33 comments:

  1. Re the problem, I got 5. But the answer could also be 42.

  2. "experimentalists...low-hanging fruits have been picked" "theorists...physics predictions don’t just fall out of hypotheses" Grant funding limits theory to accepted postulates. Theory derives acceptable observations. Truth remains hidden where it is by where it should be.

    "cold dark matter" is quantifiably Milgrom acceleration: one day observing cryogenic enantiomers' "impossible" microwave rotational spectra divergence. Dr. Melanie Schnell plus D_3-trishomocuban-2-nitrile has a third cherry atop the self-calibrating whipped cream, DOI:10.1515/zna-1986-1107. Fifty years and a million+ pages of empirically sterile mathematics condemns itself.

    Truth is where it is, science. Where it should be is political science – ideas not facts.

  3. In my opinion the dark matter/dark energy problem is more than just observations of energies not accounted for in GR, and trying to solve it is equivalent to Einstein solving the incompatibility of Newtonian mechanics with electrodynamics. Just as Einstein couldn’t modify Newtonian mechanics for his solution, I think physicists working on the problem can’t modify GR, nor merely posit hypothetical energies. The latter sounds eerily similar to hypothesizing unseen dust or a planet as an explanation for Mercury’s orbital precession under Newtonian mechanics.

  4. The next number in the series
    1, 16, 81, 256, ...
    is obviously 601. Why?

    Hint: 625-601 = 4!

  5. Typo (?) alert: "Such calculations, however, are hideously difficult. They are today made by use of sophisticated computer cute – “lattice calculations” – and even so the predictions aren’t all that great."

    I assume that was supposed to be "computer code".

  6. How would you classify QCD's problem with CP violation, where the theta parameter is zero to some incredibly high precision?

  7. What do you think of decoherence as an answer to the quantum measurement problem?

  8. CW,

    Thanks for pointing out. Funny typo! I've fixed that.

  9. ppnl,

    Decoherence doesn't solve the measurement problem, it merely explains why pure states decohere to mixed states. It does not explain the update to an eigenstate upon measurement.

  10. Stephen,

    It's a fine-tuning problem, and hence it's not a well-defined problem unless you have a probability distribution for the parameter to show that the measured value is unlikely.

  11. Bee, I'm curious. Did you understand the comments that start "Re: the problem ..." and "the next number in the series..."? Those comments are there because your blog is showing a block at upper right that I think is advertising but your readers (including me, initially) thought was a problem posed by you. Are they confused or am I?

  12. Jeff,

    It's not an advertisement. I placed it there. I understand what the comments refer to. Please also see the PS to my blogpost for context.

  13. My dad always said: ask the material the question. I think what he understood is that the research to figure out the best way to test wastes a lot more energy than just testing everything. As always, this rule is right 80-90 percent of the time.

  14. Michael,

    This option is not available for the foundations of physics. Experiments are too costly. You can't do them all - some selection has to be made.

  15. A good read; and I agree.
    I've followed a different road - fractals - to the cosmology and quantum problems; actually it was the other way around, I found these problems and solutions in the fractal and got to know cosmology and quantum from them. I think the fractal attractor is being ignored, but it has been fun for me.
    B

  16. Re Michael Sarnowski and Bee: Baryogenesis, Tully-Fisher, QM versus GR have no "acceptable" observations remaining by kind. Physics' log-log plots never terminate, demanding "bigger." Minimum $1.4 billion DUNE dark matter detector is 68,000 tons of liquid argon 4850 feet underground.

    A ppb chiral anisotropic vacuum background (GR superset Einstein-Cartan) heals everything. It is mathematically hideous. Look, then theory or not.

    A small bottle of volatile white crystals plus academic overhead might be $(USD)50,000. Add another $50K to max out Dr. Schnell’s spectrometer. DUNE's donut budget pays for the new look.

  17. Convince yourself that the mini-problem shown in the top image is mathematically ill-posed unless you appeal to Occam’s razor.

    Yes, and even with Occam’s razor it may be ambiguous. Labeling the triangle vertices a,b,c counter-clockwise from top, one person might notice that the central number is always ab-c, but another person might notice that the central number is always (-59a–6b+94c)/5. The first pattern may seem simpler, but it treats the vertices asymmetrically (e.g., if the values had the same units, the expression ab-c wouldn’t even make sense), whereas the second pattern is a linear homogeneous expression. Of course, the second pattern involves three seemingly arbitrary constants, fit to the three examples, so we can’t have any confidence in the predictive power (unless we know that the pattern must be a linear combination of the vertices).

    …at least concerning the foundations of physics.

    I’m curious about your use of the term “foundations of physics” here. From the context, it seems you are talking about what’s usually called “fundamental physics”, i.e., the study of fundamental particles, etc. I think the term “foundations of physics” is most commonly used to refer to the study of interpretational and philosophical issues like the “measurement problem”. (I notice that you said later you find the measurement problem hard to classify.) Do you distinguish between “fundamental physics” and “foundations of physics”?

  18. Amos,

    Yes, even Occam's razor allows various "equally good" explanations.

    I use the term "foundations of physics" to mean "the fields concerned with what is presently most fundamental," meaning (parts of) cosmology, particle physics, quantum foundations, and quantum gravity. Best,

    B.

  19. Sabine,

    No, decoherence does not explain how one possibility is selected. But I doubt anything can. It is not a deterministic process so there cannot be a causal explanation. It isn't even clear what would count as an explanation.

    But what decoherence does is tell us exactly what a measurement is. A measurement is nothing more than an interaction that causes decoherence. That decoherence is what makes the large-scale universe look realistic and deterministic despite the fact that at the bottom it is neither realistic nor deterministic. So we can understand measurement and wave collapse as a process that includes an irreducible random component.

  20. Homework assignment: Convince yourself that the mini-problem ("What number fits into the last triangle?") shown in the top image is mathematically ill-posed unless you appeal to Occam’s razor.

    I took a crack at the so-called mini-problem and found that numbers such as 5, .5, e, as well as many more, fit out-of-the-box. I tried fitting additional numbers, by changing the base, by using various notations, by scaling the triangles up or the text size down (or both in various clever combinations), and I was successful in fitting so many more. Great. I came to the conclusion that the question shall be "What number doesn't fit into the last triangle?" Very interesting problem, as all numbers so far failed to fail, which indeed may be an indication that more work is needed to find the solution. That'll keep me busy for a while now, thank you!


  21. I think that you are missing out on a whole category of problems - 'unknown problems'.

    In today's world, it's likely Einstein would never get published and Newton would only be selling horoscopes. These people looked at what was going on in the world around them and struck out in directions that were essentially orthogonal to current thought, paying little heed to the 'good research problems' of the day. Ref: The Sleepwalkers by Arthur Koestler and works by Paul Feyerabend such as Against Method.

    While 'good research problems' are a fine path to follow they are not the only reasonable path to take.

  22. Bee,

    I know he comes up often in discussions of scientific methodology and the like, but Kuhn's theory really doesn't seem to hold up at the end of the day. And not for any reasons concerning misplaced relativism or social constructivism (though I do, as would most other scientists, disagree with either of those ideas of what science "is" as being remotely true).

    It's just that the central ideas of Kuhn's thesis concerning periods of "normal science" and then "paradigm shifts" and the like simply aren't how science seems to function.

    I may have mentioned it before, but two books you should seriously consider are David Wootton's "The Invention of Science" and David Deutsch's "The Beginning of Infinity." Those two cover a lot of what anyone up-to-date on philosophy of science and/or scientific methodology (and its history) should be exposed to.

  23. Thanks for the awesome reference list and cool classification!
    Personally I'd assign more meaning to DM and DE, but such subtleties will hopefully be easy to assess in a few years.
    I can't help saying that I'm very glad whenever people speak about QFT expansion issues. No need for me to expand on how real, tough, and long-standing problems are customarily dismissed as passé.


  24. Hello,

    "measurement problem in quantum mechanics" ??

    I never understood the "measurement problem" in quantum mechanics. I do not see any "problem" but just a "measurement".

    I always understood quantum states as information about physical objects and not as physical objects themselves. Just like a Gaussian distribution is information about the distribution of a particular variable and not the variable itself.

    When you measure an observable, you just update your knowledge about the system. Physicists tend to call this update a “collapse of the wave function” (why?). Why not call this a “measurement”?

    Not to be taken too seriously: it seems to me that physicists tend to confuse the mathematical tools with reality.

    Regards

  25. Convince yourself that the mini-problem shown in the top image is mathematically ill-posed unless you appeal to Occam’s razor.

    The problem is ill-posed because there are an infinite number of functions that can generate any finite sequence of numbers. The "real" solution depends on the implicit conditions that the writer of the puzzle added, i.e., it must be solvable by a certain type of reader.

    That approach to solving problems will not work in physics: the 'Occam's razor' solution to the movement of the planets is neither GR nor QM, but classical mechanics. So we need problems that are not solvable by any of the current theories to force us to take a step further.

    Personally, I like this problem as it forces theoretical physicists to think beyond GR while still having at least some real data to work with:
    "These are unexplained correlations, regularities in galaxies that are hard to come by with cold dark matter, such as the Tully-Fisher-relation or the strange ability of dark matter to seemingly track the distribution of matter."

    But that is just my personal taste.

  26. C.M.,

    The problem is that in QM it was unclear exactly what a measurement is. For example, Schrödinger's cat is presumed to be in a combination of alive and dead states until it is measured. But the cat can clearly see if the poison vial broke. Why isn't that a measurement? It wouldn't make any difference in classical physics, but in quantum physics it becomes very important.

    In QM a measurement changes the state of the measured thing in a strange way that makes it much more than an update of information.

    I think Professor Hossenfelder is conflating the measurement problem with the randomness of QM. I think these are two separate issues.

  27. ppnl,

    Concerning the "Schrödinger cat":
    For me, the quantum measurement is an update of the observer's knowledge. The observer’s knowledge about the cat is in a superposition until she/he makes a measurement. “Superposition” is on the observer’s side. It’s not a physical thing.
    And yes, in quantum mechanics there is a fundamental randomness which forces you to use probability distributions to talk about fundamental objects.
    Maybe I'm missing a point, but I don't see any “problem” with quantum measurement. It's just like any measurement, equipped with an unavoidable randomness (because a measurement cannot be infinitely precise and because of the Heisenberg principle).
    We tend to confuse the language and the 'things'.
    Regards

  28. ppnl,

    Concerning the "Schrödinger cat": For me, the quantum measurement is an update of the observer's knowledge. The observer’s knowledge about the cat is in a superposition until she/he makes a measurement. “Superposition” is on the observer’s side. It’s not a physical thing.
    And yes, in quantum mechanics there is a fundamental randomness which forces you to use probability distributions to talk about fundamental objects.
    Maybe I'm missing a point, but I don't see any “problem” with quantum measurement. It's just like any measurement with an unavoidable randomness on top of it.
    We tend to confuse the language and the 'things'.

  29. Schrödinger's cat and quantum entanglement are effects arising from the absence of a third reference frame/observer. It is the third observer who defines what is a 0 or 1 state.

  30. One interesting class of problems that doesn't seem to fit neatly into this schema is "why" problems, which in the area of fundamental physics I also sometimes call "within the Standard Model" problems.

    Imagine (possibly counterfactually) that the Standard Model is all that there is to non-gravitational physics and that we measure all of the constants in the Standard Model to much greater precision.

    In that situation, there will never be a time when the Standard Model conflicts with observation, nor a time when the Standard Model turns out to be internally inconsistent.

    But suppose that you come up with a theory that explains the values of all of the mass, CKM, and PMNS constants from the SM coupling constants and the Higgs vev, with about the same precision as current experimental values and with a well-motivated mechanism. Imagine that this theory doesn't actually give rise to any "New Physics" that the SM doesn't already predict. This would still, in my view, be a worthwhile field of pursuit that would give us a deeper understanding of fundamental physics, but I'm not sure where it would fit in this schema of problems.

  31. andrew - such a theory would be a massive breakthrough, and it would be followed by normal science at a higher level, because you could predict the next decimal place of the SM constants, and then test it.

  32. C.M.

    You seem to be falling victim to a hidden variables theory when you say "'Superposition' is on the observer’s side."
    Superposition is physical, and not a statement about the observer's knowledge. There is no true underlying state, latent and waiting to be measured. If you want confirmation of this, check out the Bell Inequalities.

    Unless I've misunderstood your point?

  33. Unknown

    I am not a proponent of "hidden variables". To me, QM is a complete, efficient and tested theory about "what we can say about nature".

    QM is an extension of classical physics taking into account physical objects AND the observer. All we can do is compute probabilities.

    To me, there is no such thing as a “measurement problem”, the quantum state is a state of knowledge (like a probability distribution used in the macroscopic world).

    The “collapse of the wave function” is an update of the observer's knowledge (like any measurement in the macroscopic world).

    I do not see any problem with this. Do you?

