Saturday, September 12, 2009

The Minimal Length in Quantum Gravity: An Outside View

I recently came across a paper by Amit Hagar
    “Minimal Length in Quantum Gravity and the Fate of Lorentz Invariance”
    Studies in History and Philosophy of Modern Physics 40(3): 259-267 (PDF)

Amit Hagar is an assistant professor at Indiana University's Department of History and Philosophy of Science, and he has taken an interest in the history of a minimal length and the current discussion about deformations of Lorentz invariance. And it is true indeed that the existence and implementation of a minimal length in quantum gravity is an intriguing open question. Hagar uses it as “a case study for highlighting the delicate balance between conservatism and innovation that characterizes the process of constructing new physics.”

I find his paper very refreshing, though in some respects misleading and incomplete in its argumentation.

To briefly summarize the basics,

there are many motivations, stemming from different approaches to quantum gravity and from various thought experiments, for the idea that there is a fundamental limit to how well we can resolve structures (for a summary see e.g. “Quantum gravity and minimum length” by Luis J. Garay and “On Gravity and the Uncertainty Principle” by Adler and Santiago). This limit is generally thought to be at or close to the Planck scale. This is far beyond what we can experimentally test today, hence the lack of experimental data.
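
The flavor of these thought experiments can be captured in a back-of-the-envelope estimate (this is only the rough spirit of, e.g., the Adler-Santiago argument, up to factors of order one): gravity adds a term to the usual uncertainty relation that grows with the momentum spread, so the position uncertainty can no longer be made arbitrarily small,

\[
\Delta x \;\gtrsim\; \frac{\hbar}{\Delta p} + \frac{G\,\Delta p}{c^3} \;=\; \frac{\hbar}{\Delta p} + \frac{l_{\rm Pl}^2\,\Delta p}{\hbar}\,, \qquad l_{\rm Pl} = \sqrt{\frac{\hbar G}{c^3}} \approx 1.6\times 10^{-35}\,{\rm m}\,,
\]

which is minimized at \(\Delta p \sim \hbar/l_{\rm Pl}\) and gives \(\Delta x \gtrsim l_{\rm Pl}\).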

However, such a finite minimal distance scale causes a problem with Lorentz invariance, since Special Relativity tells us a ruler in motion towards us appears shortened. A minimal length had better not appear shorter than minimal. This reasoning thus creates the need to modify Special Relativity, which is very hard to do in a self-consistent and observer-independent way. Attempts have become known under the name “Deformed Special Relativity.” Such modifications of Special Relativity can imply modified dispersion relations and an energy-dependent speed of light, though the theoretical basis for these theories is presently incomplete and partially inconsistent. Note that modified dispersion relations are quite easily obtained also from preferred-frame effects. The point of DSR has been that it does respect the relativity of inertial frames.
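
To spell out the alleged tension: a rod of rest length \(L\), boosted along its axis with velocity \(v\), appears contracted to \(L' = L\sqrt{1-v^2/c^2}\), which can be made arbitrarily small, seemingly in conflict with a shortest length. A typical leading-order ansatz for the modified dispersion relations (schematic only; signs and numerical factors differ from model to model) is

\[
E^2 \;\simeq\; m^2 c^4 + p^2 c^2 \left(1 + \eta\,\frac{E}{E_{\rm Pl}} + \dots \right),
\]

which for photons yields an energy-dependent group velocity \(v(E) \approx c\,(1 + \eta\,E/E_{\rm Pl})\) to first order in \(E/E_{\rm Pl}\).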

I have argued in this paper that the alleged problem isn't one, since there is no observation and thus no contradiction without an interaction. The only thing necessary for self-consistency is then that no interaction can ever resolve structures below the Planck scale, but there is no need to modify the Lorentz boosts for freely propagating particles. This is, in a nutshell, the main difference between my model and the standard DSR approach.

DSR is generally thought to be not a fundamental theory on its own, but an approximate description applicable to incorporate effects of quantum gravity in the particle/quantum field context. People differ on what approximation it is supposed to describe, but the point is there might not be an obvious way to find such a modification in the fundamental theory since it could only be an effective description. Take as an example friction. There's no friction inside the atom, and there's no friction in planetary orbits either. Yet on intermediate scales Cosmopolitan avidly advocates lubricants.

The point of view I've been taking (which of course isn't shared by everybody) is that quantum field theory with a minimal length and DSR is a way to incorporate still little-understood quantum gravitational effects, which would be described by a fully consistent yet-to-be-found theory, into the old-fashioned theories we already have, by adding a generalized uncertainty principle, a modification of the dispersion relations, and a deformation of momentum space. I like this approach because it bridges the gap towards phenomenology. It is however unsatisfactory that there is presently no derivation from fundamental principles.
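
A frequently used parametrization of such a generalized uncertainty principle (just an illustration of the generic form, not tied to any particular one of the above approaches) starts from a modified commutator,

\[
[\hat x, \hat p] = i\hbar\left(1 + \beta\,\hat p^{\,2}\right)
\;\;\Longrightarrow\;\;
\Delta x\,\Delta p \;\geq\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^2 + \beta\,\langle \hat p\rangle^2\right),
\]

with \(\beta\) of order \(1/(m_{\rm Pl} c)^2\), which bounds the position uncertainty from below by \(\Delta x_{\rm min} = \hbar\sqrt{\beta}\).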

But to come back to Hagar's paper,

he studies some arguments that have been raised against such deformations of Lorentz invariance and finds the criticism wanting. On the other hand, he also finds the theoretical motivation for having such a modification unconvincing, though the attempt to do so makes a nice object of study for the philosopher

“While so far there seems to be little physical motivation for deforming the standard energy-momentum dispersion relations (apart from the fact that there are good reasons to think that a fundamental QG theory will involve spatial discreteness), from the methodological perspective I am interested in here the attitude within the QG community towards DSR exemplifies nicely the aforementioned delicate balance between conservatism and innovation.”
I can basically picture the string theorists among the readers grinding their teeth. I'll leave it up to you whether you think reasons for spatial discreteness are “good,” since it is actually a different question than whether there is a finite resolution, and the matter of discreteness is thus not relevant to the topic under discussion. One can have a fundamentally finite resolution of structures without spatial discreteness, and one can also have spatial discreteness without violations of Lorentz invariance. Unfortunately, these issues are quite often confused. Hagar does mention these differences later on, but the introduction of his paper is somewhat misleading.

Hagar discusses an argument by Schützhold and Unruh according to which a position space description of DSR either involves large scale non-locality inconsistent with our current theories and observations, or it necessitates a preferred frame. Hagar concludes the argument is unconvincing since it makes use of unwarranted assumptions about the Fourier transformations in such a framework. While I agree with Hagar's criticism, I did a similar analysis in this paper without making use of Fourier transformations and came to essentially the same conclusion: If one has an energy dependent speed of light, one either needs a preferred frame, or one needs an external parameter to label Lorentz transformations. This parameter is commonly chosen to be an energy (don't ask the energy of what), but besides the ambiguous interpretation this is a non-local modification that seems to me as unnatural as implausible [1].

Anyway, despite my finding the argumentation in Hagar's paper rather incomplete, I very much like the attempt to disentangle the discussion and approach it from a logical and objective basis. You see, I have stakes in the issue, as has everybody else who has worked on the topic. If you read a random paper on DSR it will tell you how natural such a modification is, how plentiful the motivations, how great the prospects to experimentally test it - and be kinda brief on the “well-known” inconsistencies. Hagar's paper makes a nice contrast to this by telling the story as it really is.

I wrote an email to Amit Hagar,

and he kindly replied, letting me know he is “an avid reader of [my] blog and papers, and the truth is they have very much inspired [his] looking into this interesting debate.” I am very flattered. But what's even better is that he tells me he plans to write a book on the history and philosophy of the minimal length, starting from Heisenberg up to now. I think it is a great idea. The history of the topic is full of beautiful thought experiments and arguments about their implications, and the whole field would benefit from a clear summary.


[1] Such modifications run under the keyword “energy dependent metric.” Note that we are talking here about an energy dependent metric in position space, not momentum space.
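
Schematically, and only in the spirit of such proposals (sometimes called “rainbow metrics”) rather than of any specific model, the position-space metric is taken to depend on the energy E of the particle probing it,

\[
ds^2 \;=\; -\,\frac{c^2\,dt^2}{f(E)^2} + \frac{d\vec x^{\,2}}{g(E)^2}\,, \qquad f,\,g \to 1 \;\;\text{for}\;\; E \ll E_{\rm Pl}\,,
\]

so that a photon of energy E travels with coordinate speed \(c\,g(E)/f(E)\).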

50 comments:

  1. Special Relativity tells us a ruler in motion towards us appears shortened. Depends. If it is aimed right at your nose, yes. If it is passing by, you get Terrell rotation, not relativistic foreshortening.

    Consider perceived superluminal propagation in astronomical relativistic jets aimed at the observer. Plot the propagation of wavefronts in a spacetime diagram. Superluminal effects are artifacts of viewer perspective. Curve fitting works for interpolation of nice functions. Economics is a dire warning to anybody who tries extrapolation with his own money.

    re Cosmo: We have sucrose polyester, but what are we doing with it?

  2. While I am happy for the attention the paper has drawn, I would like to respond to the "incomplete" and "misleading" claims.

    1. "Incomplete": as the abstract states, the main purpose of the paper was to expose a specific methodological trend I believe is common to at least two arguments against DSR. That one can respond to these arguments in other ways, as Bee has done in her papers, is, of course, a welcome addition, but it adds little to my original goal. More specifically, I wanted to emphasize that the principle of the constancy of light is, on final account, an empirical statement, and that one cannot assume it a-priori in an argument against a theory like DSR (i.e., a theory in which the same principle is being questioned). This type of methodology is not uncommon in the history of physics - the most (in)famous example is the uncalled for exorcism of Maxwell's demon with arguments that presuppose the universal applicability of thermodynamics, and specifically of the 2nd law, whereas this universality was exactly what Maxwell was trying to question with his demon.

    2. "Misleading": again, true to my goal, and being neutral wrt DSR, I had to give some background for the motivation behind it. I explicitly mention that there is an important distinction between minimal length's being an epistemic feature (i.e., a limitation on our measurements, what Bee calls "finite resolution"), and its being an ontological feature (i.e., a result of spatial discreteness). As most philosophers know, while the latter does entail the former, it need not be the case vice versa, so I believe I haven't misled anybody here who wasn't already misled :)

    In parenthesis, it is interesting that in the 60s there appeared two papers on minimal length by Alden Mead, then a physical chemist from the University of Minnesota, that explicitly showed how quantum fluctuations of the gravitational field give rise to finite resolution of measurement. It took Mead, by the way, almost 5 years to persuade the editors of the Physical Review to publish his first paper...

  3. Rovelli's argument is used to rule out DSR or to prove that exact LI is true? I don't think so. As I understand it, this argument is used to show that LQG is not necessarily incompatible with LI as some people claim i.e. to show that the following 3 could hold simultaneously:

    1) LQG is true

    2) Lorentz invariance is true

    3) Minimal length as an eigenvalue of an operator is true


    So you don't have to modify anything. In this model LI is assumed to be true.

    So the title of chapter 4.1 in this paper is correct: "Violations of Lorentz invariance are not implied by LQG". But later the argumentation implies that according to Rovelli's argument LQG rules out DSR or proves exact LI (which is wrong), in contrast to the title of the chapter.

  4. Giotis, you say: "But later the argumentation implies that according to Rovelli's argument LQG rules out DSR or proves exact LI (which is wrong) in contrast to the title of the chapter."

    Pls read carefully. Nowhere did I say that Rovelli "proves exact LI".

    Some people motivate DSR with the intuitive tension between exact LI and minimal length, and Rovelli's argument is supposed to show that there is no such tension, hence no motivation for LI deformation. My only claim is that the argument, as it stands, is incomplete, and that if one attempts to complete it, then it appears to presuppose LI.

  5. Yes, ok, but you talked about ruling out DSR. What I'm saying is that Rovelli's argument was not constructed to rule out DSR, and so it correctly presupposes LI, since its purpose was to show that LQG is not incompatible with LI (the experimental fact), not that LQG rules out DSR.

  6. This comment has been removed by the author.

  7. This comment has been removed by the author.

  8. Giotis, again, pls don't misrepresent me. Nowhere did I say that "the purpose of Rovelli was to rule out DSR". If you read carefully the way I've reconstructed the debate, it is quite clear what intuition Rovelli is going after in his argument, and this intuition is also one that is often cited as the motivation for DSR, but this is not the same as the claims you insist on attributing to me.

    My point is just that Rovelli's argument is too weak to rule out DSR, so it cannot be used as an argument against it, regardless of what its original purpose actually was...

  9. Dear Amit,

    Thanks for your comments.

    Regarding the "incompleteness": just given the length of the paper it had to be incomplete. It is not a serious criticism, I could say that about almost all papers on any topic. You can understand this comment of mine as: please write the book!

    I understand that you could not address all pros and cons in that paper. In the section where you discuss Schützhold and Unruh's paper though, you give the impression the case is settled by this, but only so by ignoring my later paper. While I am sorry for advertising my own work, let me add that it is a great frustration to me that I wrote this paper just to have it widely ignored, while people are still claiming one can have DSR without a preferred frame, without even acknowledging I have shown you can't.

    Regarding misleading: It might indeed be you haven't misled anybody who wasn't already misled, but as I said above, since it is a very common confusion your introduction just contributes to it. It is a weak excuse to say I've been sloppy because if you know what I'm talking about you know I'm sloppy. On the other hand, I can relate to your conundrum of writing an accessible introduction versus accuracy. It is a tension I encounter quite often with blogging. I usually try to err on the side of accuracy; just a footnote or a referral to a later section could have done the trick.

    In any case, please let me emphasize I really like the paper, I am just trying to be constructive. I am aware that this sometimes comes off wrong, but I guess it's just how I am. If you read this blog frequently you'll have noticed I always have something to criticize about anything :-)

    Best,

    B.

  10. This comment has been removed by the author.

  11. Hi Bee,

    A most interesting post, reviewing a paper whose writing is long overdue. To be honest I am still taking in what Amit Hagar has brought forth in this paper, as well as your own comments. Even though I am merely a novice in such things, I also would be interested to see a book written which comprehensively deals with the subject more generally.

    As you are aware my prime interest rests with the foundations and philosophical aspects of physics and I find this subject cuts straight to its heart in many respects. That is, it touches on and relates to matters which have been a concern since before Zeno and which in one guise or another still baffle and confound the experts. If Dr. Hagar does ever come to write his book he can mark me down for buying a copy.

    Best,

    Phil

  12. In AWT Lorentz symmetry (LS) is simply a consequence of observational perspective. When we are observing a low dimensional space (like a 2D+1T water surface) from a strictly 2D+1T perspective, LS is indeed maintained. When we are observing it from a higher dimensional perspective, LS can be violated, and there is nothing very special about it. The AWT stance is that every space-time is completely homogeneous from its own perspective by definition, and its LS cannot be violated. The moment we are discussing some inhomogeneities in it, we are applying a higher dimensional perspective, which enables LS to become violated.

    In AWT the concept of a minimal length doesn't exist from a global perspective, because even the tiniest density fluctuations can be formed by still smaller ones. But there exists a limit on the observability of the smallest density fluctuations from the perspective of larger density fluctuations (like humans) or of the instrumentation used for their detection. If we use a more sensitive/larger apparatus, the limits of fluctuations on both sides of the dimensional scale will extend accordingly, and we would observe the Universe as larger and quantum fluctuations as smaller.

    The philosophical question is whether such a dimensional scale is real for people, because it is always interpreted through an apparatus. Science has answered such questions positively since the time of Galilei and van Leeuwenhoek.

  13. Hi Bee,

    for me it is confusing to compare the finite length of the Lorentz rod and the relativity principle. I mean the minimal rod should exist as a limit of Special Relativity. Then you say a version of DSR might be the solution. That's what I've not really understood, since it is not clear at first view. Although, I need to say that I haven't read your paper on DSR. So I will read it.

    Best

    Kay

  14. BTW "An Outside View" is always of higher dimensionality than the insider's view, so its LS can be violated by definition. If it weren't, we couldn't distinguish it from the inside view, after all.

    The subtle problem is that the outside perspective remains undetectable to insiders, so we are always talking about somewhat abstract phenomena, which can be proven only by a higher dimensional emergent approach, i.e. by the coincidence of two or more pieces of indirect evidence - but not by direct observation. The whole evidence for the emergent Aether concept is about this, after all.

    It should be pointed out that the existence of space-time at the subPlanck scale (i.e. the existence of a "subminimal length") lies outside the observational scope of insiders too, so we are relating the existence of one unprovable phenomenon (Lorentz symmetry violation) to the existence of another one (subPlanck length).

    People like Lubos Motl - who prefer to work with an intrinsic perspective - can easily say both ideas are BS, whereas people who know how emergent phenomena work can expect that the combination of two or more undetectable phenomena (assumptions) could still lead to new observable (i.e. testable) predictions. After all, the renormalization approach is quite similar and based on emergence, because it extrapolates a singular function by a pair of its derivatives from both sides of the divergence.

  15. Hi Kay,

    Sorry, I don't understand your question. The problem is, if you take any spatial distance (call it a rod) and apply a standard Lorentz boost to it (as Uncle points out, in the direction of the rod), then the result is a reduced length of that rod. For that, it doesn't matter what the length of the initial rod was, it will always appear shorter in a different rest frame. Thus, so the argument goes, the standard Lorentz transformations are incompatible with there being a minimal length. If that argument doesn't make sense to you, don't worry, as I mentioned above, it doesn't make sense to me either, but that's how the story is told. Best,

    B.

  16. Hi Bee,

    to clarify: I meant a theory that comes out of Special Relativity as a limit, so that the rod can have a minimal length.

    Best Kay

  17. Hi Kay,

    It should be the other way round, Special Relativity should be a limit of that new theory. Best,

    B.

  18. Hi Bee,

    In response to Kay's question, could one not say the Schwarzschild radius is a composite or many-particle minimum length that can exist in what would be called normal space-time? That's to say, the maximum length would be what this radius would represent for an irreducible (non-composite) quantum; that is, supposing there be such an entity.

    One thing that has always intrigued me is that if one imagines an event horizon being accelerated to near light speed relative to an observer, it would appear to flatten and thus suggest the black hole had its entropy content diminished, from what it would be as considered on the surface of a sphere to that allowed for on a corresponding disk in the limit of c. This would seem to imply that entropy is not a conserved quantity irrespective of reference frame. It's one thing to say that mass/energy is a quantity relative to an observer, yet as entropy is considered equivalent to information, does this not seem contradictory? Then again it's more likely I have something confused.

    Best,

    Phil

  19. Hi Phil,

    The Schwarzschild radius depends on the mass, it thus doesn't define a fixed length. If one ties the Schwarzschild radius to the Compton wavelength via the uncertainty principle, one obtains a length and a mass, which are exactly the Planck length and Planck mass (see mentioned earlier post).

    The Schwarzschild radius is further defined in the rest frame of the (in the simplest case spherical) geometry, such that the black hole doesn't move. Boosting it doesn't change anything about its entropy, for the same reason why throwing an ice cube doesn't increase its temperature (neglecting friction): the former is a directed motion, the latter an undirected one. Best,

    B.

  20. Hi Bee, hi Phil,

    thus putting it all together, the reason is that this theory would be a quantum theory that has Special Relativity as its limit.

    Best Kay

  21. Hi Bee,

    Yes, I should have mentioned the Compton wavelength, as particles are conventionally considered as points. Essentially what I meant to say is that the Schwarzschild radius is tied to this concept of minimum length as well as being a measure of a black hole's entropy content. I also then take it that the only way entropy is so considered is when the horizon is taken as the rest frame.

    The question being, what does it mean to have an energy density (mass) higher than what can exist within normally considered space-time limits? From my admittedly novice perspective it seems just as much to give reason against having a black hole form as to reinforce the reason for their existence. Of course theory and evidence stand more and more in support of their existence, yet personally I much preferred the former Soviet scientists' concept of them being "замороженные звезды" or frozen stars, and in some respects find it more consistent with the concept of there actually being a minimum length.


    Best,

    Phil

  22. This comment has been removed by the author.

  23. Hi Kay,

    Please don't take anything I might say too seriously when it comes to such matters, for I am just another person trying to find their way out of a dark room without having a flashlight :-)

    Best,

    Phil

  24. This may be too OTTOMH, but: given the formula delta x * delta p >= hbar/2, the delta x does not have to be defined by photons. A very massive particle going very fast could reduce delta x to below the Planck Length, at least formally. I see "particles" mentioned in thread but AFAICT not per this specific condition. How do we fit particles into these adjustments, DSR, etc?

    BTW, anyone notice the oddity of defining the Compton/de Broglie wavelength in terms of the momentum of a given "particle"? What if two particles are attached in various ways, do they count as "one" or "two" etc. - just like in Galileo's Socratic question challenging Aristotelian physics about the speed of a given falling body! I'm sure some complex interaction issue works it out, but it still seems odd to pretend we can take for granted the boundaries of "the particle" used to get the mass for the matter-wave-length. How affiliated must they be, to act more like "one particle" than two or more traveling together? Someone must have worked on it, but I never see it discussed, worked out, examples etc.

  25. Neil, the uncertainty relation you mention is for non-relativistic quantum mechanics. But yes, the point is exactly that in usual quantum mechanics you can get delta x to be arbitrarily small. The argument is that if you take into account gravity, you can't do so any more, and there is a lower limit to delta x, somewhere at l_p. I really do recommend you read the above mentioned earlier posts, or Garay's paper, which makes a nice introduction. We're talking here about free quantum mechanics, single non-interacting particles. You are talking about bound states, one can also talk about their wavelengths, generally though effects tend to be more pronounced for single particles, thus it suffices to lead arguments with them. I don't know what's "odd" about that. Best,

    B.

  26. Hi Bee,

    Thanks for this paper of Garay you pointed out to Neil, for it does appear to be a very good synopsis of the whole subject. Perhaps from here I might get a better handle on the concept and arguments in general.

    Best,

    Phil

  27. Hi Bee,

    so the theory (Quantum Gravity) favoring a finite length should be a 'quantum theory' without an uncertainty relation? It might be no problem that this theory has no uncertainty relation, but what is left of a quantum theory then?

    Best Kay

  28. Huh? I think you have completely misunderstood what I said. First, the uncertainty relation is not thought to be absent at high energies/small distances, but instead it should get larger, in such a way that there is no possibility to have an arbitrarily good resolution of structure, which, in usual QM, you could in principle achieve if you only had high enough energies. Second, as I said explicitly in my post, this "generalized uncertainty" is not a property of the fundamental theory but some emerging leftover effect. You find this explained in my above mentioned earlier posts and in the papers I link to, e.g. the mentioned paper by Garay is really nice and easily accessible.

    What the fundamental theory looks like, I don't know. You can speculate whether it's quantized or unquantized, I tend to believe it will turn out to be neither, but something entirely different. Best,

    B.

  29. Hi Bee,

    I will read the paper. Thanks for clarifying.

    Best Kay

  30. Have been thinking about your post Bee and some thoughts are being gathered here

  31. Thanks Bee for the reference and some explanations. By "odd", I meant the taking for granted of what makes a "single object." How connected or affiliated do m1, m2, ... have to be, for the quantum wavelengths to be referenced to lambda_i = h/(gamma*m_i*v) for each separately, versus lambda = h/(gamma*(m1 + m2 + m3 + ...)*v) as a sum?

    You refer to bound states which I have gathered is the essential answer: if the masses are "bound states" then they act as "one particle" for defining the DeBroglie lambda. But that doesn't tell me immediately what ensures or defines being "bound", what happens if they get less connected (what happens to their effective lambda), etc.

    I'm saying that to frame the issues, not expecting/needing anyone to hash it out here. I will look at the ref. and some things but put up some insights if you want.

    Finally, can Garay's supposition be compatible with there not being an upper energy for photons? After all, Garay implies no uncertainty can be less than the Planck length or similar. It seems to me, a photon wavelength less than that would allow finding lengths smaller in principle. So does that infamous super photon from GRB090510 cast doubt on the Garay thesis, or could it withstand no upper limit on photon energy?

    Finally, at the risk of making this too salady (is that a word?), I wonder if any insights about the gravity coming from light (there has to be for consistency) and relativistic transformation needs another look or can illuminate the issues. tx for your patience.

  32. Neil,

    within the confines of non-relativistic QM, I think the quantity you are looking for is the 'centre of mass', call it X.

    Given a group of particles this is defined just as in the classical case, and represents the average position of the group.

    Differentiate this once w.r.t. time and multiply by the total mass (sum of individual masses) and you get the momentum of X, call it P. P turns out to be just the sum of the individual momenta.

    Work out the commutator [X, P] and you get ihbar, just as in the individual cases. So, the centre of mass can be treated in the same way as an individual particle, irrespective of any forces between the particles.

    But again, this is all basic non-relativistic QM.
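
    Spelled out for two particles (a minimal check, using only \([x_i, p_j] = i\hbar\,\delta_{ij}\)):

    \[
    X = \frac{m_1 x_1 + m_2 x_2}{m_1 + m_2}, \qquad P = p_1 + p_2,
    \]
    \[
    [X, P] = \frac{m_1 [x_1, p_1] + m_2 [x_2, p_2]}{m_1 + m_2} = i\hbar,
    \]

    since the cross terms \([x_1, p_2]\) and \([x_2, p_1]\) vanish.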

  33. James: At the risk of misinterpreting Neil, I think his question was in which case the Compton wavelength of a collection of particles is related to the total mass rather than each particle having its own.

  34. Oh OK - I misunderstood the question.

    Unfortunately this other one is a bit trickier! I guess it depends on the circumstances. Presumably we have in mind some kind of scattering, and which picture is the better approximation is a question of scale, and the nature of the scattering interaction.

    Of course, if we have an n-particle system then the wavefunction doesn't exist in the "real world" but in R^3n, and if the particles are identical it is (anti)symmetrised across them. So the very idea of the “Compton wavelength” of one of the particles in “real” R^n becomes more delicate. So I suppose there is no fundamental answer or “cut off” here - it depends on the problem you are trying to solve.

  35. Too many glasses of wine... of course it should have been: 'in "real" R^3'

  36. James, Bee made a perfect little posing of my question - "in which case[s] the Compton wavelength of a collection of particles is related to the total mass rather than each particle having its own."
    I don't understand your answer re R^3n, I guess you mean some phase space but Compton wavelengths are "real" as per diffraction etc. BTW, look up "Schrodinger's virus".

  37. Correction - I mean De Broglie wavelength, since the particle collection could be moving at any velocity (not just the 0.707 c that gives CW.)

  38. Neil,

    I found this link which might be of interest concerning diffraction/scattering of atoms by gratings: http://arxiv.org/pdf/quant-ph/9905090v1

    Seems that it all depends on the situation. In general you have a quantum many-body problem. However, if the incident group of particles is bound sufficiently tightly so as to be much smaller than the grating spacing (such as is likely with an atom) then the group can be approximated as a single point particle at the centre of mass, to which we can associate a de Broglie wavelength.

    For larger more weakly bound systems this doesn't necessarily work and we have to deal with the individual particles themselves and consider other outcomes such as the molecule breaking up.

    As for Schroedinger's virus, I didn't see the point. It may be experimentally/morally feasible to perform (unlike with the poor cat), but how would it shed any light on the superposition problem?

  39. James, it looks like the point of the Schrödinger's Virus experiment is to show that quantum superposition can be applied to objects as large as viruses - with a bit of added "organic" mystique. It helps to clear up where the boundary between the quantum and the effectively macroscopic/classical world begins. (But "decoherence" arguments, IMHO, can't honestly do that - see my blog re.)

    http://news.softpedia.com/news/039-Schroedinger-039-s-Virus-039-Superposition-Experiment-Proposed-121444.shtml

  40. Hi Bee!
    After reading so many opinions on minimum length, I still believe it to exist.

    The emphasis is on belief since it is a belief, as well, for those who do not believe in minimum length.

    Even though you have gone across the pond, I hope to keep seeing this blog.

    good luck!
    jal

  41. Hi Neil,

    ... anyone notice the oddity of defining the Compton/de Broglie wavelength in terms of the momentum of a given "particle"? What if two particles are attached in various ways, do they count as "one" or "two" etc. - just like in Galileo's Socratic question challenging Aristotelian physics about the speed of a given falling body!

    If you connected two of Galileo's falling cannon balls, it wouldn't do much to their classical motion in a vacuum. But, in principle, it would affect their quantum-mechanical motion and where you may or may not find them.

    It's hard to do quantum interference experiments with cannon balls, but they have been done with beams of atoms and various not-so-small molecules, such as fullerenes (buckyballs). It's the inverse of the total mass of the connected particles, not of their individual masses, that determines the de Broglie wavelength. For example:

    http://arxiv.org/abs/quant-ph/0309016

    John Wheeler used to talk about the wavelengths of baseballs. The wavelength of the orbiting Moon would be short, to say the least, but it has one in principle, where only certain orbits would be allowed. Incidentally, quantum interference has also been shown to hold experimentally for gravitation -- at least for beams of falling neutrons. See:

    http://prola.aps.org/abstract/PRL/v34/i23/p1472_1

  42. Hi Neil, James & Chris,

    It appears superposition and entanglement are being looked at here as being one and the same. Superposition is when single quanta are considered as existing in more than one state simultaneously until observation, while entanglement is the interdependence of state between two or more quanta that exists regardless of spatial separation. The first is a question of our state of knowledge and the second the consequence of non-locality.

    With the standard interpretation these are often confused, where experiments such as Wheeler's delayed choice seem to demonstrate the non-localness of QM when they truly do not, or need not. That is one of many reasons why I prefer the deBroglie-Bohm pilot wave picture, since it avoids such confusion, having QM no more non-local than it need be or has been proven to be. In standard QM uncertainty represents a limitation of reality, whereas in Bohmian Mechanics it merely marks a limitation on our ability to have complete knowledge of reality.

    I think this common confusion in regards to QM is at the root of this proposed Schrödinger Virus experiment, as perhaps the researchers are considering the viruses themselves as being self-aware and therefore acting as their own observers. In standard QM this could be considered as having an effect on outcome, whereas in Bohmian Mechanics it would make no difference whether a virus is self-aware or not.

    Of course this is all headed off topic, since I don't think minimum length is dependent on the consequence of observation, nor is it related to what is considered as the measurement problem in standard QM.


    Best,

    Phil

  43. Hi Kris,

    Just to acknowledge my goof in referring to you as Chris instead of Kris as it should be; so sorry.

    Best,

    Phil

  44. Thanks Kris, and your site is interesting (good for people in neuroscience to dabble in physics but we on the edges of the clique must tread carefully...) But I find the "inverse mass" (reduced mass?) curious - a nucleus surely has the DB wavelength according to its total mass! I am suspicious of any process (eg increasing boundness) having a trend first in one direction, then in the opposite direction.

    BTW this is relevant to minimal length, because the DB lambda should decrease with increasing mass of "an object." So we have to wonder if we can make a macroscopic body to be going very fast, and if the simple DB formula applies to the whole mass, then Compton wavelength goes below Planck length for all sorts of things - also, lighter things going very fast! So it is relevant not just about photons, but eg protons and nuclei going at 0.9999999... c. How do we work that into DSR for example?

    Note also that "bodies" are self-gravitating, and how does that affect their QM. However, photons can't be ("in their own reference frame") - but still have to exert gravity on things around them, to be consistent!

    There was a contentious discussion, I was part of, at Cosmic Variance touching on what gravity field surrounds a light beam. (Hint: there has to be, else I could get free energy: convert m1 to light say in a mirror box, then move m2 away, then reconvert to mass m1 and bring back m2 to get the work, etc.)

    Phil, I consider the QM issues unresolved. You can imagine a pilot wave guiding an electron along, and somehow allowing for interference patterns w/o "actually" spreading the electron out - but what about photons? They can't be rightly modeled as little points to begin with. The universe just can't be represented in realistic ways, it is IMHO like "The Matrix" and a program-like process for getting relative observations and experiences.

    BTW superposition isn't equivalent to entanglement, the latter is a smaller subset in which superpositions must follow certain rules relative to other ones.

  45. Hi Neil,

    I would agree that the nature of the quanta still has a lot to be explained, for which new models are trying to be developed. However to say “the universe just can't be represented in realistic ways” is tantamount to saying there is no reality. I would also agree that to look at any particle as a dimensionless point in itself presents as being an impossibility, whether it be a photon or subluminal one.

    However, to insist that the reality of our world is observer dependent is something quite different when an interpretation exists that explains the results of such experiments in a straightforward way that doesn't require one to imagine a backward-through-time action, as Wheeler's delayed choice is often thought to confirm. Yes, nature has demonstrated an undeniable non-local nature as it relates to entanglement, yet this doesn't extend by necessity to the case of superposition, as if there were no other feasible option.

    So do I think the deBroglie-Bohm model represents the final answer? Not at all. Yet I do understand it will be found to be part of a future explanation, in part because it doesn't require as many ad hoc additions/assumptions or require that reality is dependent on observation. For me it is what J.S. Bell said by posing it as a question:

    “Why is the pilot wave picture ignored in textbooks? Should it not be taught, not as the only way, but as an antidote to the prevailing complacency? To show that vagueness, subjectivity, and indeterminism are not forced on us by experimental facts, but by deliberate theoretical choice?”


    Best,


    Phil

    P.S. I never suggested that superposition is equivalent to entanglement, but rather the opposite, which is why I pointed to the experiment by Aspect et al and the related comment paper by Travis Norsen, which points out their confusion, in that superposition as found in the Wheeler delayed choice experiment is not an example of entanglement. That is, it is not a result of nature being nonlocal, of a backward-through-time action, or of an observer-reliant reality.

  46. How would one push back perspective toward "minimal length?"

    Entanglement entropy sets up a thought process about how we see "geometrically" once one assumes one has seen in ways that a Q<->Q measures allows.

    You blanket "the measure of reality" by seeing "in resulting Lagrangian," as a function not only of "false vacuum to true," but of measures seen in relation to "moon measures" and laser light returns.

    Various string-motivated theories, quintessence, and other alternatives to General Relativity almost all predict a violation of the Equivalence Principle at some subtle level. Given the recent hints that there may be some new and mysterious modification to the laws of large-scale gravitational attraction (as indicated by supernovae and cosmic background anisotropies), it is important that we probe every available aspect of the basic nature of gravity.see:Apollo

    Liked the word verification- "toenut:)"

  47. Phil,

    Indeed you didn't confuse superpo and entanglement. I can be careless in responding to mixed players. I just put out my standard quick take on the difference because, as you noted, "It appears superposition and entanglement is being looked at here as being one in the same." Your explanation is more artful anyway.

    As for the Wheeler delayed choice experiment: I'm not sure what JAW thought of it. My take is: most physicists consider the particle/photon as spreading out in a wave function until one of the available detectors shows it to be "right here", and then it can't be spread out anymore. Interference is just to "show" that the WF was in both paths. The WF should be in both anyway, until a detector rolls it up.

    So the recombiner isn't that crucial to "what's there." Without it, the WF should still be spread out, there just isn't interference between parts of the WF to ensure a certain pattern at the detectors per multiple shots.

    Hence, I don't like the saying, "without the second splitter(/recombiner), we can see which way [implies single path] the photon went." No. With no BS2, the wave stayed in both paths, "illuminated" both detectors, and "picked" one or the other to ping. Seriously, think: Imagine a photon emitted from an atom at the center of a hollow sphere. The WF expands "as a spherical shell" (well, really, as any atomic physicist says, it's more of a donut) until it hits the sphere, then "ping" somewhere. That isn't imagined to show that, because no interference was had, the photon went like a bullet out of that atom!

    The first BS in the WDCE is a sort of misdirection (no pun intended); it just splits the WF but makes no difference in principle to being like the WF in the sphere. (I think all this confuses and misdirects decoherence enthusiasts.)

    Pilot wave still looks weird to me, and how can it carry entanglement? Bell admitted it still couldn't express local realism, so what kind of realistic wave theory is that? And atomic physics expresses the WF emitted by de-exciting atoms as being isomorphic to the antenna field of the oscillating electron clouds; let's attend to the creation process, not just the propagation of things.

    I add that we need the photon to be literally split by a BS. Why? In cases where we differentially polarize the legs, the recombination needs to show the superposition of both polarization states. That also supports my interpretation of the WDCE.

    Those who are curious about PW, can check http://en.wikipedia.org/wiki/Pilot_wave.

  48. Hi Neil,

    “That also supports my interpretation of the WDCE.”

    As you may or may not be aware, the concept of decoherence is actually an integral part of Bohmian Mechanics and was initially brought forth in the first of Bohm's two landmark 1952 papers, although it was not actually called that. However, it is considered here in a way that may satisfy some of your objections to the concept in general. As for other such questions, as we are both aware this is not the post or the forum for such matters. However, if you take your questions/objections to the Bell_Bohm forum I can almost assure you someone much better qualified than myself would be happy to address them.

    Best,

    Phil

    Hi Bee,

    I hope with the link I’ve provided to Neil I haven’t crossed a line. My intention is not to promote any theory of my own (as I don’t have one), rather simply give direction to Neil as to where a more appropriate place might be to address any questions he might have.

  49. One just had to know "where to begin" for minimum length to be considered.

    In a "condensed matter theorist's point of view" applicability can be used, while in the "thought experiment previously mentioned" the proposal had a sound basis to it, that "can be moved forward."

    By applying the correspondence to the situation where a black hole vibrates when an electron falls into it, they arrived at the description of electrons that move in and out of a quantum-critical state. See:Physical Reality Of String Theory Shown In Quantum-critical State Of Electrons

    Best,

  50. Thanks Phil, and yes we should focus (as it were) on the direct minimal length question. QM interpretation issues do matter though, because "it matters" (can't quit punning) for ML whether a particle is "really" characterized by a de Broglie wavelength, or whether that's just a pilot wave for a little "chunk" that stays the same size (I may be misinterpreting PW, but that's what I "come away with"). My concern may matter: what happens when aggregates are subject to the wave equation, etc.


COMMENTS ON THIS BLOG ARE PERMANENTLY CLOSED. You can join the discussion on Patreon.
