The typical experiment testing Bell's theorem makes use of a pair of photons (or electrons), entangled in polarization (or spin). The two particles are sent in different directions and their polarizations are measured along different axes. The correlations accumulated over many repeated measurements are subject to Bell's inequality, or the more general CHSH inequality.
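To have the quantity being tested in front of us, here is the standard CHSH form (a textbook statement, not quoted from the paper): with two possible settings a, a' on one side and b, b' on the other, any local realistic hidden-variables theory obeys

$$ S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2, $$

where E denotes the correlation of the measurement outcomes, while quantum mechanics allows values up to $2\sqrt{2} \approx 2.83$ for suitably chosen settings.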
Maybe the most obvious loophole, called the locality loophole, is that information about one measurement could reach the other without any violation of locality. Since information can be transmitted at most at the speed of light, this is possible if, for example, the second measurement is made with enough of a delay that it lies in the forward lightcone of the first. Another loophole is that the detector settings may be correlated with the prepared state, again without any violation of locality, if they are chosen in the forward lightcone of the preparation. Since in this case the experimenter cannot actually set the detector as he wishes, it's called the freedom-of-choice loophole.
A case where both loopholes are present is depicted in the space-time diagram below. The event marked "E" is the emission of the photons. The red lines are the worldlines of the entangled electrons or photons (in an optical fiber). "A" and "B" are the two measurements, and "a" and "b" are the events at which the detector settings are chosen. Also shown are the forward lightcones of the events "E" and "A".
So that's how you don't want to set up your experiment if you're aiming to disprove locally realistic hidden variables. Instead, what you want is an experiment as in the second figure below, where not only the measurement events "A" and "B" are spacelike separated (i.e. neither is in the other's lightcone), but also the events "a" and "b" at which the detector settings are chosen are spacelike separated from each other and from the emission of the photons.
Let us also recall that the lightcone is invariant under Lorentz transformations, so whether two events are spacelike, timelike or lightlike separated does not depend on the reference frame. If you manage to arrange it in one frame, it holds in all frames.
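In case you want to check such configurations numerically, here is a minimal sketch (my own illustration, not code from the experiment) that classifies the separation of two events via the invariant interval; the event coordinates in the example are made up:

```python
# Minimal sketch (illustration only): classify the separation of two spacetime
# events via the invariant interval s^2 = (c*dt)^2 - |dx|^2.
# Times are in seconds, positions in meters; the example numbers are made up.

C = 299_792_458.0  # speed of light in m/s

def separation(t1, x1, t2, x2):
    """Return 'timelike', 'lightlike' or 'spacelike' for events (t, (x, y, z)).
    The classification is Lorentz invariant, so it holds in every frame."""
    dt = t2 - t1
    dist2 = sum((b - a) ** 2 for a, b in zip(x1, x2))
    s2 = (C * dt) ** 2 - dist2
    if s2 > 0:
        return "timelike"    # one event lies inside the other's lightcone
    if s2 == 0:
        return "lightlike"   # on the lightcone
    return "spacelike"       # outside each other's lightcones

# Two measurements 144 km apart, performed within a tenth of a millisecond:
print(separation(0.0, (0.0, 0.0, 0.0), 1e-4, (144e3, 0.0, 0.0)))  # -> spacelike
```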
Looks simple enough in a diagram, but it is less simple to actually do: entanglement is a fragile state, and the speed of light, which is the maximum speed by which (hidden) information might travel, is really, really fast. It helps to let the entangled particles travel over long distances before you make the measurements, but then you have to be very careful to get the timing right.
And that's exactly what a group of experimentalists around Anton Zeilinger did and published in November in their paper "Violation of local realism with freedom of choice" (arXiv version here). For the first time, they closed both of the above-mentioned loopholes by choosing a setup that excluded communication between the measurement events as well as between the preparation of the photons and the choice of detector settings. The test was performed between two Canary Islands, La Palma and Tenerife.
[Image Source: Lonely Planet]
The polarization-entangled photon pairs were produced in La Palma. One photon of each pair was guided to a transmitter telescope and sent over a distance of 144 km to Tenerife, where it was received by another telescope. The other photon circulated through 6 km of coiled optical fibre in La Palma. The detector settings in La Palma were chosen by a quantum random number generator 1.2 km away from the source, and in Tenerife by another, similar but independent, random number generator. The measurements violated Bell's inequality by more than 16 standard deviations.
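To get a feeling for the timing margins this geometry requires, here is a rough back-of-the-envelope sketch using the distances quoted above; the fibre's refractive index is an assumed round number, not a value from the paper:

```python
# Back-of-the-envelope timing for the La Palma - Tenerife setup (illustration only).
# The distances are the ones quoted above; the refractive index of 1.5 is assumed.

C = 299_792_458.0   # vacuum speed of light in m/s
N_FIBRE = 1.5       # assumed refractive index of the coiled fibre

free_space_link = 144e3 / C          # flight time of the photon sent to Tenerife
fibre_delay     = 6e3 * N_FIBRE / C  # delay of the photon kept in La Palma
rng_light_time  = 1.2e3 / C          # light travel time source <-> random number generator

print(f"144 km free-space link: {free_space_link * 1e6:6.1f} microseconds")  # ~480
print(f"6 km coiled fibre:      {fibre_delay * 1e6:6.1f} microseconds")      # ~30
print(f"1.2 km to the RNG:      {rng_light_time * 1e6:6.1f} microseconds")   # ~4
```

So everything relevant, from choosing the settings to registering the photons, has to fit into windows of a few microseconds up to about half a millisecond if the events are to stay spacelike to each other.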
What a beautiful experiment!
But if you're a believer in local realistic hidden-variables theories, let me scratch your itch. You can't close the freedom-of-choice loophole in superdeterministic hidden-variables theories with this method, because there's no true randomness in that case. It doesn't matter where you locate your "random" generator; its outcome was determined arbitrarily long ago in the backward lightcone of the emission.
No Spooky action at a Distance?:)
I spoke of Penrose's Quanglement before. Cryptography? Spin in relation to another instantaneously? Seth Loylld?
Susskind's thought experiment about elephants' excursions as a way in which to describe what was going on inside the black hole? I mean it was a shot in the dark, right, in using entanglement?
If you've followed entanglement (John Clauser, Michael Horne, Abner Shimony and Richard Holt), you would most surely have met up with Zeilinger and his experiments.
I am sure Phil will be all over this:)
Best,
Itch duly scratched. I note that there doesn't have to be superdeterminism to retain a modicum of classicality; there just has to be "superrandomness". That is, probability distributions in the present, in particular those associated with the random number generators, have to be increasingly determined as we are given more information about the past. The classical side of Bell inequality arguments is statistical-probabilistic (and similarly for CHSH etc.). The dependence on initial conditions is such that a lot of information is required to determine useful correlations.
I can't remember from your past posts whether you take the violations to indicate a failure more of locality or more of particle property realism?
Hi Peter,
Yes, that's correct. I just didn't want to elaborate on the point for the sake of readability. The paper I mentioned above has a very nice explanation on that point too.
I'm not much invested in realism; I have no big problems with wavefunctions and their relatives. Best,
B.
/*..What a beautiful experiment!..*/
Yep, it made for a nice vacation for many of the people involved.
http://upload.wikimedia.org/wikipedia/commons/4/43/Caldera_de_Taburiente_La_Palma.jpg
I'm not sure it's the cheapest way of doing scientific experiments...
Hi Plato,
Nothing spooky here, no, though that depends on one's perspective, I guess. It is interesting from the historical perspective: I can imagine how weird all that must have seemed a century ago, and how used we now are to quantum reality thanks to the many amazing experiments that have been made. Best,
B.
Image of Bell Analyzer
What the teams at the University of Innsbruck and the US National Institute of Standards and Technology (Nist) did was teleport qubits from one atom to another with the help of a third auxiliary atom.
It relies on a strange behaviour that exists at the atomic scale known as "entanglement", whereby two particles can have related properties even when they are far apart. Einstein called it a "spooky action". See: Atomic Dance
Sorry to Seth Lloyd for spelling his name wrong.
Where Susskind leaves off, Seth Lloyd begins
Yeah, it's weird and I accept it as a mystery. Some of the MWI people say, no big deal because all the possible results happen anyway and somehow match up (the most gung-ho attempt to defend MWI I found is "Schroedinger's Rabbits", talking about tendrils matching up - sure, a metaphor but I can't see it through.) Then of course there's the problem that without tricks, the branching outcomes lead to 50/50 chances because both continue to be real. (As I say, the amplitudes of the states "go to waste.")
One of my gripes with the view in MWI and the associated DI (decoherence makes for effective collapse) is the loose and overextended application of "entangled." IMHO real entanglement is when particles are genuinely correlated ("biphotons") with a tandem wave function description, such that neither one has e.g. a specific polarization state by itself. Yet in DI, people loosely say that e.g. outcome #1 from a superposition (like *one* photon split by a BS) is "entangled" with the observer seeing that outcome, and outcome #2 is entangled with the observer seeing that click, etc.
But isn't that just ordinary unitary evolution applied to the WF component states? The two cases aren't even written up the same way and have different operator relations (I'll pass on trying to enter that here.) It's not "real entanglement", which is correlation between two actually realized events (well, as we imagine it in the normal "one real world.") Also in "real entanglement" one measurement just sets up a requirement about the state of the partner, not the issue of being observed there at all.
Well, you know I'm going to knock on DI but I did come up with a way to experimentally test "decoherence turns superpositions into mixtures" and "explains why we don't see macroscopic alive + dead cat." If their explanation fails (as I expect) then that knocks down MWI pretty hard - and nonlocal correlations continue to be a real paradox.
The measurements violated Bell's inequality by more than 16 standard deviations.
The US has a fission waste storage problem. Yucca Mountain is a giant pile of ion-exchange aluminosilicate sitting 1400 feet above the local water table. Nevada had 900+ underground nuclear explosions, offering some information on uncontained fission waste migration in random geology. The original specification was 1000 years of safe containment. That has now increased to 200,000 years of safe containment. Yucca Mountain is Officially unsafe and cannot be used!
What evidence does the hidden variable crowd offer that it happens, as opposed to 19 sigma that it doesn't?
Hi Bee,
Thanks for this great post explaining Zeilinger et al.'s latest confirmation of J.S. Bell's experimentally inspired hypothesis. The only problem I have with it, as well as your description, is that for the unfamiliar it might have them believe all "hidden variable" explanations are dismissed by such violations. That is, that intrinsically non-local hidden-variables theories are also excluded by such results, while actually it could be claimed they are reinforced, since quantum behaviour then demands that nature have an undeniable non-local character; only one whose existence is denied in the standard interpretational treatment.
Also, this issue of the added necessity that counterfactual definiteness must also be denied is something very subtle and still not fully fathomed, as some would argue that the non-local character of reality is all that's required, and ask what meaningfulness counterfactual definiteness even has regarding our notions of reality within the quantum mechanical backdrop. So while I would agree that the noose has been further tightened, I'm still left to wonder around whose neck, among those who believe they understand the implications of quantum mechanics for reality ;-)
“For me then this is the real problem with quantum theory: the apparently essential conflict between any sharp formulation and fundamental relativity. That is to say, we have an apparent incompatibility, at the deepest level, between the two fundamental pillars of contemporary theory... and of our meeting. I am glad therefore that in some of the sessions we will stand back from the impressive technical details of current progress to review this strange situation. It may be that a real synthesis of quantum mechanics and relativity requires not just technical developments but radical conceptual renewal.”
-J. S. Bell, “Speakable and Unspeakable in Quantum Mechanics” page 172, (First Edition), Cambridge University Press, 1993
“And hence it emerges that the CHSH inequality in particular follows from locality alone, such that its empirical violation can only be blamed on the non-local character of nature”
-Travis Norsen, “Counter-Factual Meaningfulness and the Bell and CHSH Inequalities”, arXiv:quant-ph/0606084
Best,
Phil
Hi Phil,
Well, yes, as I wrote in the 2nd sentence, it's for local realistic hidden-variables theories, and as I explained later, for the non-superdeterministic type in addition. Either way, the point is basically that something has to give. Best,
B.
Hi Bee,
ReplyDeleteI’m sorry and I hope you don’t think that I meant you to have such to be excluded. Rather it’s just I’ve found it the nature of many to draw conclusions where none should be made. The thing is I think the best physicists are those who would rather admit their uncertainties and then grapple with them, as to find ways by which they may be relieved. That is in such regard I’ve never found Anton Zeiliger to be such a physicist and thus find his motives should be carefully examined when in come to his conclusions respective of his findings.
“People who jump to conclusions rarely alight on them.”
-Philip Guedalla
Best,
Phil
Maybe you know Joy Christian from Perimeter who has some (unpublished) papers on similar topics. Comment?
I found Bell's inequalities not really that transparent. The way Mermin set up the experiment in this paper is much clearer: http://ajp.aapt.org/resource/1/ajpias/v58/i8/p731_s1 . It uses three detectors, but is much more direct in its claims. I wonder if it's experimentally possible to test this version.
ReplyDeleteBell's inequality does not trouble me much. But I don't understand the experiment that was done here:
http://www.fortunecity.com/emachines/e11/86/qphil.html
I'm talking about the one with two down converters. The problem is the interference pattern is a locally visible thing. That seems to suggest that you can transmit information between two points even if no particle travels between the points.
Phil, anyone: I suppose BM just takes what it already has for quantum phenomena in general - a sort of superluminal communication - and extends that to specific "entanglement." Note however (and note also what I said earlier about confusion over "entanglement") that the polarization states of entangled photon pairs are supposed to be indefinite in principle. It isn't just, one has a real value in itself that affects the real value of the other one. That's the whole point, it isn't local realism but a relationship between "measurements." So I don't see that something with real properties sending a message to something else would be good enough. I'm sure people thought about that, it just seems inadequate to the task.
Also, if we want photons to be "real particles" in some sense but just guided around slits and screens etc., note that in QFT we don't always have a definite photon number. How would BM handle that? I don't see how, it's a feature of field theories that the field manifests in various ways and no particular set of real particles, no matter how cleverly manipulated, can represent the situation.
Hi Phil,
No need to be sorry, good you pointed it out. Best,
B.
Hi Phillip,
Yes, I know Joy. I haven't come around to read his recent papers though, so have nothing to say at this point, sorry. Best,
B.
I think what Phil is trying to say is that just as Special Relativity shows that an aether needn't exist, and Bell's Inequality Tests show that hidden variables are not necessary, that is not to say that neither exists at a more fundamental level, that is to say at a much smaller length scale than we can currently probe, or than either the LHC or a VLHC can probe.
Longer sentences available on request. :-)
The setting for this test was quite beautiful. Neil Bates, you should include some money in your budget to run the experiment in a beautiful setting. Take your pick: Aruba, Jamaica, come on Neil hey let's go to Bermuda, Bahama, come on anti-Deco or Key Largo, Montego, what the heck let's take it straight to Kokomo, we'll run this thing and then we'll take it slow .... ok I'll stop that now. I got my daily Beach Boys fix, thnx.
Hi Steven,
Bell's inequality doesn't say that hidden variables are not necessary; it says hidden-variables theories (local, realistic) have properties not in agreement with experiment. Best,
B.
Srsly? How far has deBroglie-Bohm Pilot Wave Theory advanced since Bohm worked on it?
I don't think any of these theories are complete. I do think Bell's Inequalities Test experiments offer a huge clue that we're missing something fundamental.
At least, Peter Woit would say, I'm sure: "And there you go: quantum mechanics proven correct, yet again."
Given the new and quite remarkable "weak measurement" results regarding the two-slit experiments (Kocsis et al., Science, 6/3/11) and the wavefunction (Nature, 6/9/11, I think), one would want to warm up frozen assumptions and stay fluid ;) Plasmatic?
Then there is J. Christian's research that indicates fatal flaws in Bell's Theorem.
Then there is K. Wharton's research that indicates that discreteness only enters QM via measurements (boundary conditions?).
Don't discard your local realism bets just yet. Very much in flux.
And how about them BOSTON BRUINS!
What an amazing Stanley Cup Series!
Haven't the New Jersey Devils (and yes, we are ... little ones, like children) won the SC at least twice since the Bruins won last? I remember Bobby Orr and Derek Sanderson, yes I'm that old. :-)
Back on topic! It's back on topic day:
From page 172 of Tony Hey's and Patrick Walters' "The New Quantum Universe" (2009):
" ... Most physicists now accept that quantum mechanics has passed this test. What do these experiments tell us about the nature of reality? The observed violation of Bell's inequality means that no hidden variables theory - without some explicit or implicit unpleasant action-at-a-distance property - can agree with experiment. Whilst Einstein would probably have preferred some underlying, deterministic hidden variable explanation for quantum mechanics, he would certainly not have wanted to accept the existence of such 'spooky action-at-a-distance' effects."
Emphasis mine. To repeat:
- without some explicit or implicit unpleasant action-at-a-distance property -
M'kay. Well, "unpleasant" is subjective, and has no room in Math or Science IMO. So what explicit properties or implicit properties are left to ponder?
Hey was a protege of Feynman's.
Steve, AAAD is unpleasant to those who think the universe "ought not to misbehave" in ways that affront their stuffy classical-rationalist sensibilities. I'm completely the opposite, I think the U can be as it pleases, it pleases to outwit and baffle us, and I (and probably it) revel in its weirdness.
To me the most pitiful development is the attempt to maintain old-fashioned classical particles and fields in MWI. These people say, there really is a specific real superposition of amplitudes all over, and it evolves deterministically. How come the physical effects of the alternatives don't smash together, how to get the right probabilities (there really aren't any, if they are right), how to rematerialize the same mass-energy in more than one place, all an absurd contortion to keep the onerous clockwork universe alive (yeah, deterministic Schrodinger evolution is clockwork in principle) - but most people think that Rube Goldberg contraption is the hippest, cutting edge in thought.
Speaking of Stanley Cup: Phil, if you're out there, say it ain't so! I thought you guys were b-o-r-i-n-g... (You're not boring, but in an OK way to not be ...)
Hi All,
Somehow many are under the impression that I'm a die-hard Bohmian. The fact of the matter is I'm not a die-hard anything. What I am is someone who is fully aware that there is no standard interpretation of quantum mechanics which explains the observations while avoiding being vague, ambiguous and, yes, incomplete. As an example, decoherence is the positivists' attempt to have this solved for the standard treatment of QM.
That is, for instance, Richard Feynman famously proclaimed that no one understands quantum mechanics, which is something I've always taken issue with. That is, if quantum mechanics is limited strictly to being a non-relativistic theory, then, as Bell pointed out, the deBroglie-Bohm theory does just that. On the other hand, if he had said no one understands Quantum Field Theory, I would have to agree, as neither Bohmian Mechanics nor any other interpretation can be used to offer a complete description. From the Bohmian perspective this is because making it a completely Lorentz invariant theory remains a problem (one which continues to be worked on), and for the standard treatment it's because the interpretational foundation upon which it rests remains vague, ambiguous and, yes, incomplete.
The beauty of such experiments is that they narrow the options. So my first question to anyone who holds this result to be a vindication of the standard treatment is to explain, not to me, but rather to themselves, what the standard treatment is then actually telling them about the world which they claim it has them truly understand.
“There is no quantum world. There is only an abstract physical description. It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature”
-Niels Bohr
"The Heisenberg-Bohr tranquilizing philosophy--or religion?--is delicately contrived that, for the time being, it provides a gentle pillow for the true believer from which he cannot very easily be aroused. So let him lie there."
-Albert Einstein, letter to Erwin Schroedinger (May 31, 1938)
Best,
Phil
We're not talking MWI if Bryce DeWitt doesn't resurrect crazy Everett III's paper from the trash heap in the late 60's. Also be careful, Deutsch has a slightly different form of MWI where the number of worlds is not infinite but very very large.
Well, whatever form of MWI we talk about, I tend to think it's all garbage.
"The Collapse of the Wavefunction." <== What is that? Nobody knows.
Penrose thinks microgravity does it. Wojciech Zurek thinks the environment, the measuring apparatus, AND the observed interact and decohere. And there are about 4 other theories, all but one of which is wrong, or maybe all of them are wrong.
Copenhagen is the current best one IMO, but it's weak as hell, and I liked how J.S. Bell mocked Complementarity as "Contrariness", but, shrug, what can you do? In the future, we'll know more, said Lee Smolin. He also predicted a theory of Quantum Gravity in 2012. How's that coming along? Are we on track? Are we there yet? (says every little kid, about 5 minutes into a day trip)
Here's one I'm 99.9999999 % sure is wrong: von Neumann and Wigner's "The consciousness of the observer determines the outcome." Holy geez why do people put up with THAT nonsense?!
First, I remind readers that Leggett inequalities (see also the Leggett-Garg inequality) are important too, not just Bell inequalities (which got the most press). I note with pride that I trade up and down with Leggett in searches for "quantum measurement paradox" (since I haven't bragged of that for a while, you were due a checkup.)
Re MWI: always a problem, for them to get "statistics" out of continued many worlds. Many say it is a particularly bad problem to get the right probabilities for correlations, but supporters say it's easier because no superluminal communication is necessary: just get the right "worlds" to match up (I don't see that as much easier.) Just remember: in MWI every equivalent experiment really turns out the same each time because it consists of the continued (but "non-interacting") superposition of all amplitudes/outcomes.
They can't really derive correct chances from "either I'm in one branch, or the other" so sophistry must be invoked. How can they get more (especially, "infinite") "worlds" than the branches, which was bad enough already? ...
Heh, Everett: In Wikipedia we see the quote: Léon Rosenfeld, one of Bohr's devotees, talking about Everett's visit, described Everett as being "undescribably [sic] stupid and could not understand the simplest things in quantum mechanics". Well, none of us can really understand what happens there, and I say we have no right to expect we can (we might someday, no *right* to it.)
The consciousness thing, yeah I don't agree with that either, and MWI co-opted it, as the alternative brain states don't interact with each other. But when are those people going to realize that lack of ability to interfere (see my post on the confusion thereby) isn't going to keep e.g. two split versions of trains from smashing into each other?
PS: Steve, your fastidious attention to detail - revealed in my email fwds as correction from Everett II to Everett III - is a good trait for anyone helping to get an experiment done. (We're working on testing whether decoherence can turn a superposition into a mixture, see more at name link.) BTW, use preview function ...
Since the orthodoxy is that in a unified theory including gravity, there are no local observables - diffeomorphism invariance washes them away -
a. Does Bell's theorem translate into some theorem about the S-matrix?
b. Maybe Bell's theorem is an indication of some fundamental limitation on the construction of approximate local observables in the unified theory.
BTW Steve, I mean "fastidious" in the good sense. Speaking of experiments, anyone know of other experiments, done or proposed, to test decoherence of photons for status as mixture versus superposition? What I've seen is usually trapped atoms, atoms in superposed energy and/or location states, etc. AFAICT decoherence isn't part of the issue of nonlocal correlations; why such a big deal then about "measurement" in general?
ReplyDeleteHi Arun,
Excellent question. One of the things that itches in my back when I hear foundations of quantum mechanics is that we're actually doing QFT these days. I don't know the answer, but would really like to know. Best,
B.
There's a quantum gravity conference going on now, I believe in Zurich, or is it just over? Any revelations?
I was a bit surprised last summer when Marcelo Gleiser said there may be no T.O.E. It doesn't take much to extend that to no such thing as Quantum Gravity, and no such thing as a Grand Unified theory, so QFT may be the best we all get.
Kinda depressing if so, in which case ... let's all quit science and play the markets and get stinking rich!
:-)
Hi Steven,
Yes, I recall having seen a poster from the Zurich event somewhere at the Institute, but don't know anybody who went there. Best,
B.
Carlo Rovelli, for starters. You know him, yes? Or how do you say it - Jah ? :)
ReplyDeleteHi there,
A local real model of a photon being used in the Dehlinger and Mitchell setup to illustrate entangled photons.
http://www.animatedphysics.com/photons/bells_inequality.htm
The linear photon is modelled as not only having a specific "average" polarization direction (as is modelled by Dehlinger and Mitchell), but also as having a "wobble" or "instantaneous" polarization direction.
Lots of animations and pictures to illustrate the model.
Regards,
Ed