Fluid art by Vera de Gernier.

Hawking was notably the first to derive that black holes are not entirely black, but must emit what is now called “Hawking radiation”. The temperature of this radiation is inversely proportional to the mass of the black hole, a relation that has so far not been experimentally confirmed.
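To put a number on that inverse relation (my back-of-the-envelope illustration, not part of the original post), the Hawking temperature is T = ħc³/(8πGMk_B):

```python
# Hawking temperature T = hbar * c^3 / (8 * pi * G * M * k_B).
# Illustrative numbers; not part of the original post.
import math

hbar = 1.054571817e-34   # reduced Planck constant, J s
c = 2.99792458e8         # speed of light, m/s
G = 6.67430e-11          # Newton's constant, m^3 kg^-1 s^-2
k_B = 1.380649e-23       # Boltzmann constant, J/K
M_sun = 1.989e30         # solar mass, kg

def hawking_temperature(mass_kg):
    """Temperature (K) of the Hawking radiation of a black hole of the given mass."""
    return hbar * c**3 / (8 * math.pi * G * mass_kg * k_B)

T_sun = hawking_temperature(M_sun)
print(f"Solar-mass black hole: T = {T_sun:.2e} K")  # ~6e-8 K, i.e. tens of nanokelvin
```

Even a solar-mass black hole radiates at only tens of nanokelvin, far below the 2.7 K of the cosmic microwave background, which is why a direct detection is hopeless.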

Since the known black holes out there in the universe are very massive, their temperature is too small to be measurable. For this reason, physicists have begun to test Hawking’s predictions by simulating black holes in the laboratory using superfluids, which are fluids at a few degrees above absolute zero that have almost no viscosity. If a superfluid has regions where it flows faster than the speed of sound in the fluid, then sound waves cannot escape from the fast-flowing part of the fluid. This is similar to how light cannot escape from a black hole.

The resemblance between the two cases is more than just a verbal analogy, as Bill Unruh first showed in the 1980s: The mathematics of the two situations is identical. Therefore, physicists should be able to use the superfluid to measure the properties of the radiation predicted by Hawking, because his calculation applies to these fluids too.

Checking Hawking’s predictions is what Jeff Steinhauer and his group at the Technion in Israel are doing. They use a cloud of about 8000 rubidium atoms at a temperature so low that the atoms form a Bose-Einstein condensate and become superfluid. They then use lasers to confine the cloud and to change the number density in some part of it. Changing the number density also changes the speed of sound, and hence creates a “sonic horizon”.
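As a rough sketch of why tuning the density creates a horizon (my illustration; the formula is the standard Bogoliubov speed of sound, and the density and scattering-length values are generic textbook numbers, not the parameters of the experiment):

```python
# Bogoliubov speed of sound in a dilute BEC: c_s = sqrt(g * n / m),
# with interaction strength g = 4 * pi * hbar^2 * a / m.
# Density and scattering length are generic textbook values for Rb-87,
# NOT the parameters of the actual experiment.
import math

hbar = 1.054571817e-34     # J s
m = 87 * 1.66053907e-27    # mass of a Rb-87 atom, kg
a = 5.3e-9                 # s-wave scattering length, ~100 Bohr radii (assumed)

def sound_speed(n):
    """Speed of sound (m/s) in a condensate of number density n (m^-3)."""
    g = 4 * math.pi * hbar**2 * a / m
    return math.sqrt(g * n / m)

c_out = sound_speed(1.0e20)   # higher-density region
c_in = sound_speed(2.5e19)    # region where the lasers lowered the density
print(f"sound speed outside: {c_out * 1e3:.2f} mm/s, inside: {c_in * 1e3:.2f} mm/s")
# Since c_s scales as sqrt(n), lowering the density lowers the local sound
# speed: a flow that is subsonic in the dense region can be supersonic in
# the dilute one, and that crossover is the sonic horizon.
```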

Number density (top) and velocity (bottom) of the superfluid. The drop in the middle simulates the sonic horizon. Figure 2 from arXiv:1809.00913.

Using this method, Steinhauer’s group already showed some years ago that, yes, the fluid black hole emits radiation and this radiation is entangled across the horizon, as Hawking predicted. They measured this by recording density fluctuations in the cloud and then demonstrated that these fluctuations on opposite sides of the horizon are correlated.
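To see what “correlated fluctuations across the horizon” means statistically, here is a toy sketch (mine, not the group’s actual analysis): in each run, the density fluctuation at a point outside and at the mirror point inside share a common component from the emitted phonon pair.

```python
# Toy model of the correlation measurement (illustration only, not the
# group's analysis): density fluctuations at mirror points across the
# horizon share a common Hawking-pair component plus independent noise.
import random

random.seed(0)

def correlated_fluctuations(n_shots, pair_strength=0.5):
    """Simulate density fluctuations outside/inside the horizon over many runs."""
    outside, inside = [], []
    for _ in range(n_shots):
        pair = random.gauss(0, 1)  # shared component from the phonon pair
        outside.append(pair_strength * pair + random.gauss(0, 1))
        inside.append(pair_strength * pair + random.gauss(0, 1))
    return outside, inside

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

out, ins = correlated_fluctuations(5000)
print(f"cross-horizon correlation: {pearson(out, ins):.2f}")
# Analytically ~0.2 for these parameters; with no shared pair component
# the correlation would average to zero.
```

The published analysis correlates fluctuations at all pairs of positions rather than a single pair of points; the toy above only captures the statistical idea.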

Three weeks ago, Steinhauer’s group reported results from a new experiment in which they have now measured the temperature of the fluid black hole:

**Observation of thermal Hawking radiation at the Hawking temperature in an analogue black hole**

Juan Ramón Muñoz de Nova, Katrine Golubkov, Victor I. Kolobov, Jeff Steinhauer

arXiv:1809.00913 [gr-qc]

The authors also point out in the paper that they see no evidence of a black hole firewall. A black hole firewall would have been in conflict with Hawking’s prediction, according to which the radiation from the black hole does not carry information.

In 2012, a group of researchers from UCSB argued that preserving information would necessitate a barrier of highly energetic particles – the “firewall” – at the black hole horizon. Their argument is wrong: as I demonstrated, it is entirely possible to preserve information without creating a firewall; the original proof contains a mistake. Nevertheless, the firewall issue has attracted a lot of attention. The new experiment shows that the fluid black holes obey Hawking’s prediction, and no firewall appears.

Of course the fluid black hole does not reproduce the mathematics of a real black hole entirely. Most importantly, the emission of radiation does not reduce the mass of the black hole, as it should if the radiation carried away energy. This is the lack of “backreaction” (which this blog is named after). Note, however, that Hawking’s calculation also neglects backreaction. So as far as the premises of Hawking’s calculation are concerned, fluid analogies should work fine.

The fluid analogies for black holes also differ from real black holes because they have a different symmetry (it’s a linear system, basically a line, rather than a sphere) and because they have a finite size. You may complain that’s a rather unrealistic case, and I would agree. But I think that makes them more, not less, interesting. That’s because these fluids really simulate lower-dimensional black holes in a box. And this is exactly the case for which string theorists claim they can calculate what happens using what’s known as the AdS/CFT correspondence.

Now, if the string theory calculations were correct then the information should leak out of the black hole. If you want to avoid a black hole firewall – because that hasn’t been observed – you need to break the entanglement across the horizon. But this isn’t compatible with the earlier results of Steinhauer’s group.

So, this result documents that black holes in a box do not behave like string theorists think they should. Of course the current measurement results have large uncertainties and will have to be independently reproduced before the case can be considered settled. But I have little doubt the results of the Steinhauer group will hold up. And I’ll be curious to hear what string theorists say about this.

## 67 comments:

Unruh showed the Hawking effect also with "just" water.

One of his postdocs (S. Weinfurtner, like you a very good German woman physicist)

continued the experiments in UK. What do you think about them?

Or is there a basic reason why superfluids are better than water?

Or they can just "simulate" different spacetimes/situations?

Thanks,

K.

String theory is a Cauchy horizon. Its unlimited number of "acceptable" vacua have no empirical connection. Everything is explained, nothing is solved, publish. Information?

Blockchained. My consulting fee is to be denominated in bayesian-inferred h-index. "8^>)

Thermodynamics proposes, kinetics imposes. Physical theory, like economics, assumes equilibrium. The world is driven by rate, affording metastable conclusions.

Thank you for the clarification!

Kay,

They looked at the classical case, i.e., another layer of analogy. The Hawking effect is about quantum fields. So the Steinhauer et al experiment brings you closer to the real thing.

Sabine,

you write

"… this radiation is entangled across the horizon, as Hawking predicted. They measured this by recording density fluctuations in the cloud and then demonstrated that these fluctuations on opposite sides of the horizon are correlated."

"If you want to avoid a black hole firewall – because that hasn’t been observed – you need to break the entanglement across the horizon."

Is the following analogy with an EPR measurement correct?

The density fluctuations on opposite sides of the horizon are correlated just like the measurement results in an EPR experiment. Thus, as in an EPR measurement the unitary evolution of an entangled state was interrupted by a measurement, i.e. the entangled state collapsed and the “information” (about this entangled state and its unitary evolution) is lost.

Info:

There is a typo in your link arXiv:1809.00913 [gr-qc].

Interesting stuff.

sean s.

A cynical part of me thinks that this experiment does nothing other than check some people's math. Is there more to the result than this? What would a wrong-temperature result have implied?

I do not agree with this experiment. I think what they measured are phonons, the quantized unit of heat. Of course there’s a corresponding temperature for phonons. But Hawking radiation involves photons. The Schwarzschild radius is a gravitational barrier. Photons cannot cross the barrier but it is possible by quantum tunnelling. A better analogue is alpha decay since this involves quantum tunnelling of alpha particles to escape the Coulomb barrier of atomic nuclei. Black hole temperature is just a consequence of radiation flux. The main physical process is quantum tunnelling.

Topher,

Physics isn't math. Even if you have no doubt that the calculation is correct, you still have to check that it actually applies to nature. That's how it works in science. A wrong temperature would have implied that something is going on which our theories don't properly describe.

Enrico,

Of course, as I wrote, what's trapped in the fluid analogy is not light but sound. Or phonons, if you wish, same thing. Yes, you can describe that by tunneling if you want, but the result is the same.

Reimond,

It's almost correct. You don't need to make a measurement to lose the information because (so the story goes) the black hole evaporates and its inside is eventually gone.

As I understand, the equations of Hawking radiation are derived from Stefan-Boltzmann law and Planck black body radiation. They are testing the correlations of temperature, radiation and entropy which are already known in classical thermodynamics. What is unique about Hawking radiation is how the radiation escapes from an insurmountable barrier. This is quantum tunnelling that classical thermodynamics cannot explain. So experiments to prove Hawking radiation should focus on this unique process. Otherwise, they are just proving the already proven laws of classical thermodynamics.

I may be stupid but I don't understand how you can lose information if what's inside and outside the black hole is entangled? Wouldn't the information be precisely carried away by the Hawking radiation? I get that, being thermal, it looks like it carries no information whatsoever, but I don't see how this goes against more subtle entanglement arguments.

Those interested in connections between GR and other branches of physics might like to read this contribution by Gary Gibbons in the excellent proceedings of a conference in Prague a few years ago. (The proceedings are some of the best I've come across, something Lars Andersson agrees with.)

Enrico,

No, this is wrong. Please look at Hawking's papers.

Deus,

Because the black hole loses mass by evaporating, so the inside eventually disappears. This means you have to get out the information before that happens.

Somehow the link to the PDF file of Gibbons's contribution got garbled. Hopefully at least one of these is correct:

ae100prg.mff.cuni.cz/pdf_proceedings/Gibbons.pdf

http://ae100prg.mff.cuni.cz/pdf_proceedings/Gibbons.pdf

Dear Sabine,

Back in March you wrote an article for Quanta Magazine about how some groups looked for evidence of firewalls in the data from LIGO. As far as I can understand these things - which isn't very far - the LIGO data indicate that black hole firewalls do exist but these latest fluid analogue data indicate that they don't. OK, both sets of data seem to have quite a lot of uncertainty but is it plausible (or likely?) that the two processes should give such different answers, or is it more likely that one of the experiments is just plain wrong?

Thanks,

Adrian

Adrian,

I don't think either of the experiments is "wrong" but the statistical significance of the LIGO echoes is not very high. Therefore, the most likely development seems to me that the LIGO echoes will go away with further data. If they don't, and the results of the black hole analogue experiment are also confirmed, we'll have to think about why the analogy doesn't hold.

Let me also add a word of caution, which is that even if the LIGO echoes are confirmed the reason may not be a black hole firewall. In contrast to Steinhauer's lab measurements, which really get a hold on the correlations in the Hawking radiation, echoes from the horizon are an extremely indirect way of inferring what's going on with the Hawking radiation.

So there is finally a prediction of String Theory that can be and has been tested? And the theory found wanting? How much of String Theory does that negate?

String theory gives rise to the AdS/CFT correspondence. This is not in and by itself "stringy" mathematics -- it's a limit in which the string effects are negligibly small. Nevertheless, string theorists use it to calculate what happens in the evaporation of black holes. As to current status, they claim that black holes must be able to leak information. This cannot be done if the radiation is entangled across the horizon.

Most ways to remove the entanglement result in a firewall which is (now) incompatible with experiment. It is possible to remove the entanglement without creating a firewall, but the earlier experiment shows that there is an entanglement across the horizon.

This means either way you turn it, it doesn't fit with the AdS/CFT prediction.

There are various ways to get out of the conundrum, for example you can claim that you cannot describe what goes on in a box in the asymptotically flat case by using asymptotically anti-de Sitter space. But string theorists have claimed that it is possible. Besides this, it is also possible to simulate the AdS space, though that hasn't yet been done. You could also just claim that for unknown reasons AdS/CFT doesn't apply to analog gravity, but there is not presently any justification for this claim.

Basically, the only coherent answer I can think of at this point would be that string theory doesn't say anything about real black holes.

Sabine and Topher,

I'm afraid Topher is right and Sabine wrong.

These analog experiments are the "analog" (in the technical computer-science sense) counterpart of digital computer simulations. They just reveal some consequences of a mathematical theory. These consequences may be nontrivial, but they still descend from the theory, and therefore tell us nothing about nature that is not already in the theory.

This fact is easily understood in the case of a computer simulation: the simulation tells us something about nature only as far as that something is already in the equations that underpin the simulations.

And for the analog experiments it's just the same. Imagine that physical systems A and B obey the same set of equations to some approximation (they are never exactly identical of course). Using these equations, I have deduced mathematically some consequence about system B. You perform experiments on system A and confirm my mathematical findings. What does this mean? It means that I have done my maths right, that's all!

And why is that? Because imagine that your experiment on A does not agree with my calculations on B. Would it mean that you have discovered some new physical property about system B? Not at all! It can only mean one of the following three things: (i) my calculations on system B are wrong; (ii) your experiment on system A is inaccurate; or (iii) the discrepancy is due to some higher-order effects for which the analogy between A and B does not hold.

Sabine (or others):

Forgive a basic question, but if I have the question other (non experts) will have it also, I have found.

The superfluid exists only near absolute zero.

The analog black hole forms only when areas are moving faster than the speed of sound.

I haven't calculated what the average velocity of a particle is near absolute zero, but my guess would be well below the speed of sound.

SO - if the speed of an area is increased to above the speed of sound, I would expect the temperature to rise and the superfluid to be destroyed.

Where is the error?

I would suggest the reason the firewall issue does not come up here is that these are open systems. A black hole emitting Hawking radiation is modeled as a closed system. The experimenter has to input energy, such as maintaining the fluid flow, which is a departure from the black hole.

Where does the firewall come from? It is due to the theoretical prediction that entanglements of Hawking radiation with a black hole change from bipartite to tripartite entanglement. The reason is not that hard to see. A hot cavity will emit photons that are entangled with states of the atoms in the cavity. Once about half the energy is emitted, the states emitted by the atoms are entangled with previously emitted radiation. So the entanglement entropy will increase to some maximum and then decrease. What about the black hole? A particle is emitted when its entanglement pair falls into the black hole. This entangles the Hawking radiation with the black hole. Eventually, Hawking radiation that is emitted has a higher probability of being entangled with previously emitted Hawking radiation. However, this old Hawking radiation that has been in a bipartite entanglement with the black hole is now in a tripartite entanglement with a second quantum of Hawking radiation and the black hole. This is a violation of the so-called monogamy principle, which in general says that any type of entanglement symmetry is a constant of the quantum evolution. The firewall was proposed to prevent this by removing the equivalence principle at the horizon. In this way the horizon turns into a sort of singularity that demolishes the particle that falls into a black hole, thus preventing this entanglement symmetry violation.
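The rise and fall of entanglement entropy described above is the so-called Page curve; a crude toy version (my simplification, treating each emitted quantum as one qubit of a random pure state) looks like this:

```python
# Crude "Page curve": entanglement entropy (in bits) of k emitted qubits
# out of N total, approximated by min(k, N - k). Toy illustration only.
N = 20  # total number of qubits (assumed)
page_curve = [min(k, N - k) for k in range(N + 1)]
print(page_curve)
# Entropy grows while the first half is emitted, peaks at N/2, then falls
# back to zero: late radiation must be entangled with early radiation.
```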

What probably does prevent the firewall, though, is an open world. It also might be that the equivalence and unitarity principles are dual to each other. After all, gravitation is Lorentzian with a noncompact group structure SO(3,1), and quantum mechanics works with compact symmetry groups. So maybe quantum information is more generally conserved through gravitation. The moduli space for gravitation is not Hausdorff in general and has sequences or orbits of gauge-like fields that are not bounded. There are connections between entanglement symmetries and moduli spaces as yet unknown.

Amazing that the mathematics of two such diverse phenomena is “identical!” Does this, as well as the existence of quantum fluid analogues, suggest that there may be an underlying, as yet unrecognized, organizing principle at work? Otherwise, why would this be the case?

There are two versions of Hawking radiation. Version 1 involves real photons with wavelengths greater than the Schwarzschild radius. By the Uncertainty Principle, the photons are in superposition inside and outside the event horizon. This enables them to quantum tunnel and cross the gravitational barrier of the black hole. Version 2 involves pairs of virtual particles at the event horizon. One of the two virtual particles has negative energy and falls inside the event horizon. When it meets a real particle inside the black hole, they mutually annihilate and the virtual particle outside becomes a real particle, obeying the conservation of energy.

IMO Version 1 is correct. Version 2 has unresolved problems. In pair production, both particles have positive energy but one is an antiparticle. Assuming negative energy exists, what if the negative energy particle is outside and the positive energy particle is inside the event horizon? Then the black hole gains mass. Since the probability of negative energy particle being inside or outside is equal, the black hole neither evaporates nor grows. Surprisingly, Hawking himself invokes Version 2 in his book.

The experiment does not seem to test either Version 1 or Version 2. It is a classical analogue that demonstrates the classical laws of thermodynamics.

OpaManfred,

If you think I am wrong pointing out that in science it is necessary to check whether predictions are actually correct you're in the wrong corner of the internet. The flat earthers are down the hall on the left.

Phil,

You are confusing the average velocity of the particles with the overall velocity of the fluid. Forget about the superfluid, think of a bucket of water. It has a temperature even if the water is perfectly still. If the bucket moves at constant velocity, this doesn't change the temperature.

Lawrence,

The original derivation of the firewall issue makes no reference to a boundary condition.

Enrico,

You clearly have strong opinions, but very few arguments.

Sabine,

Instead of making snappy remarks, why don't you point out what is wrong in my (rather clear, I think) argument?

If it can help, I spell it out for the present case. A "wrong" temperature result here would have meant one of the following: (i) S Hawking (may he rest in peace) got his maths wrong; (ii) the experiment is wrong; or (iii) the discrepancy is due to some higher-order effects for which the analogy black hole / BEC breaks down.

Of course, the experiment on the BECs would tell us something about BEC physics. In the above case of "wrong temperature", it would tell us that the BEC does not obey exactly the same equations as the BHs.

But the experiment would tell us NOTHING about BHs, for the very reason that the discrepancy can only come from effects that are beyond the analogy. To claim that it tells us something about BHs, you would need to assume that the two systems (BEC and BH) are absolutely identical to all orders, which is a rather preposterous claim of course.

Seriously, Sabine, are you saying that you can learn something about BHs by performing experiments on a drop of fluid on Earth? You can only test and check what is already in the mathematical model of BHs, which constitutes the basis of the analogy.

I hope I'm being clearer now.

@ Enrico: The Hawking radiation has a wavelength on the order of the Schwarzschild radius according to an asymptotic observer in distant flat spacetime. This radiation is generated on average around 4GM, as I recall. It has more kinetic energy near the horizon, but that is lost by redshift as measured by the external observer. The wavelength is stretched out to about the Schwarzschild radius.

@ Sabine: The firewall is a bit of an ad hoc assumption. There are variations on this theme. The idea is that if the unitarity principle is upheld, the equivalence principle fails. This means that by some means nothing approaching a black hole can enter. The firewall is the gadget that is this ultimate sentry.

Some people seem to think this is not an appropriate experiment because it does not involve an actual black hole. Since this is an analogue, it is thought to be more of a simulation. In part that is the case, but it is also close enough of an analogue to be considered seriously. There are certain solids that have an index of refraction n → ∞ such that the speed of light in the medium, c′ = c/n, goes to zero. Now assume there is a material with a dependency n = n(x) such that this diverges at x = horizon distance. Further assume that I have a quantum state that is entangled with a quantum state approaching this horizon distance. The state approaching the black hole becomes trapped and it is not possible to perform an entanglement swap, AKA teleportation of states, and so the entanglement is shifted to an entanglement with the black hole. This is the absorption of a photon, which is the complex-conjugate dual to the creation of a photon. The creation is simply an analogue of Hawking radiation.

Experiments of this sort with quantum optics and the rest have been performed. Any sort of limit imposed on the speed of phonons or other quanta by even just the flow of water carries similar physics. So these experiments are steps in the right direction. I think more direct measurements will be contained in gravitational wave signatures in colliding black holes. The tidal force on a system between two black holes just before collision is huge and should have quantum fingerprints.

OpaManfred,

As I wrote in my, I think pretty clear response to Topher: "Physics isn't math. Even if you have no doubt that the calculation is correct, you still have to check that it actually applies to nature. That's how it works in science. A wrong temperature would have implied that something is going on which our theories don't properly describe."

Please let me know which part of this you think is "wrong".

Sabine,

As a complement to what I just wrote. You say "A wrong temperature would have implied that something is going on which our theories don't properly describe."

The correct version would be: "A wrong temperature would have implied that something is going on in BECs which our theories of BECs don't properly describe."

You certainly could not deduce that our theories for BHs are wrong on the basis of this experimental result on BECs. You may only deduce that either the calculations based on those BH theories are wrong or that the analogy is imperfect.

“Seriously, Sabine, are you saying that you can learn something about BHs by performing experiments on a drop of fluid on Earth? You can only test and check what is already in the mathematical model of BHs, which constitutes the basis of the analogy.”

Well, it is a bit of a stretch, but if you have two very diverse phenomena evidencing the same mathematics, you could imagine that there is a common organizing principle operating on a deeper, proto-physical level that is shaping the dynamics of them both, and hence what was true for one would also be true for the other.

OpaManfred,

You write

"A negative result on the BEC can only mean that either the calculation on the BH was wrong or that the analogy has broken down somewhere. (I'm repeating this for the third time; perhaps YOU can tell me which part of this you think is "wrong")."Your statement "They just reveal some consequences of a mathematical theory" is wrong. You have a mathematical theory that makes a prediction for a physical system, then you go and test it. Otherwise you will not know that the mathematics actually applies to the system under question.

I already wrote in my response to Adrian that, yeah, maybe the analogy is wrong, but then you have to come up with an explanation for that. Look, if a prediction of a calculation doesn't fit the data, you can't just shrug and mumble "well, too bad, ought to be some reason for that."

OpaManfred

"As a complement to what I just wrote. You say "A wrong temperature would have implied that something is going on which our theories don't properly describe."

The correct version would be: "A wrong temperature would have implied that something is going on in BECs which our theories of BECs don't properly describe." "

What you call the "correct version" is, needless to say, what I am referring to. What else could you think I might possibly have meant? Also, please stop assigning your misunderstandings to me.

OpaManfred,

Again,

"Seriously, Sabine, are you saying that you can learn something about BHs by performing experiments on a drop of fluid on Earth? You can only test and check what is already in the mathematical model of BHs, which constitutes the basis of the analogy."As I wrote, you test the result of Hawking's calculation. This calculation works the same for the black hole as for the superfluid. If you understood something else, you misunderstood.

Sabine

So you agree with me that these experiments test BEC physics, but not BH physics.

They only test BH maths, namely the derivation of Hawking radiation from the equations of QFT on curved spacetime. In no way can these experiments tell you whether the equations are right or wrong for BHs.

Don Foster,

Yes you could, but that would be a VERY big stretch.

Nice to see that at least one person can see my point though...

OpaManfred,

"They only test BH maths, namely the derivation of hawking radiation from the eqs of QFD on curved spacetime. In no way can these experiments tell you whether the equations are right or wrong for BHs."Of course not, as I said, you have to go and test whether the prediction of a theory are actually correct. My point is that the string theory predictions for black holes should apply for this system for the same reason Hawking's calculation should apply. Unfortunately they can't both apply, and the data speaks for Hawking.

I'm reluctant to step back into this discussion. After reading through this I feel that the experiment was certainly interesting and worthwhile. But I'm still skeptical that this has much bearing on firewall theories or string theory predictions or any other theories of black holes. The analogy is only instructive as long as GR contains the full description of the (1-dimensional) black hole and the fluid system is a faithful analogy to all/enough (I'm not sure here) orders of approximation. Assuming all the condensed matter stuff is right, this experiment shows that indeed the *GR description* of a black hole correctly *implies* radiation with the Hawking temperature.

So I don't see why this analogy should work in the other direction. GR does not imply string theory, for example, so the analogy's success cannot have any implications for string theory.

Unproven arguments are opinions. Mathematical proofs are for math. In physics, we need empirical evidence. There's no solid empirical evidence for Hawking radiation but a lot of opinions.

@ Lawrence: Let's test your opinion. What is the gravitational or Coulomb barrier and quantum tunneling in the experiment?

@Sabine

We're both making some progress then! I perfectly agree with your sentence: "As I wrote, you test the result of Hawking's calculation."

Then again, I'm confused. Why did you object to Topher's first statement that "this experiment does nothing other than check some people's math". If you replace "some people" with "Hawking", you are saying the same thing.

@Sabine, @Topher

Here again, I agree with Topher's comment of Sept 26. The experiment on BECs can be informative on BHs only as long as the mathematical analogy holds. And this holds for a BH description based on GR+QFT. But probably the analogy does not hold if you start adding other bits to the theory, like firewalls or strings.

That's why I disagree with Sabine's statement that "this result documents that black holes in a box do not behave like string theorists think they should". It's incorrect because string theory is a theory of real gravitational BHs, not of their fluid analogues. (Actually your statement is a bit ambiguous: if by "BHs in a box" you mean the fluid analogue, then the statement is correct, but irrelevant to string theory; if you mean a real BH, then it's wrong).

So the experiment can not prove or disprove anything about string theory as a theory of quantum gravity.

" But I'm still skeptical that this has much bearing on firewall theories or string theory predictions or anything other theories of black holes."Let us take another example, the two slit experiment. The mathematics of the Schrödinger equation lets us believe that particles behave like waves. So, we do an experiment using water waves passing through two slits. This experiment shows us an interference pattern in the waves emanating from the slits.

This experiment does not prove that particles passing through two slits will show an interference pattern. But it does show us that if they don't, the Schrödinger equation probably is not the correct equation describing the particles.

In this case things were more complicated. We did not know before whether Hawking radiation was actually an observable part of systems described by the mathematics Hawking used. It also shows that this system does not develop the equivalent of a firewall. Now we know that a system that is described by Hawking's formulas does indeed show this radiation. If BHs do not, then there must be something wrong with the mathematics.

OpaManfred,

An experiment does not check that the math is correct, it checks that the theory applies to the system.

@sabine,

Look, for the sake of the discussion, could you stop repeating very general and ill-defined statements like "an experiment ... checks that the theory applies to the system". Can we get specific about these analogue experiments? Otherwise it's pointless.

As I already said, these analogue experiments are different. You have TWO systems, not one: so you have to specify which system you are referring to. Also, please specify what is "the theory" you refer to in that sentence.

I repeat, at the risk of boredom. The following two statements are correct:

- An experiment on cold atoms checks that the theory of cold atoms (basically, QM) applies to a system of cold atoms.

- An experiment on BHs checks that the theory (QFT+GR) applies to a BH.

The problem is when you mix these two statements and try to claim that an experiment on cold atoms can tell you something new about the theory of BHs (e.g., that one has to modify GR by adding firewalls or going to string theory). This claim is not correct.

Or maybe we have a semantic disagreement on what we mean by "theory". For me, it is the underlying basic theory, namely QM for BECs or GR for BHs. This is distinct from what I call a "calculation", which means deriving some consequences from the postulates of the theory. Hawking's is a calculation, not a theory, at least in my language.

OpaManfred,

How often do I have to repeat that I do not claim and have never claimed, not here and not elsewhere, that an experiment on cold atoms tells you what goes on with black holes?

Topher,

The limit in which you do the string theory calculations does not actually contain string effects. It's quantum field theory in a curved (non-quantum) background space. That is exactly the limit in which Hawking's calculation works. If anything, the string theory calculation has better chances of applying to the superfluid than to actual black holes, because the latter don't normally sit in a box.

@ Enrico: The potential well problem seen in a basic undergraduate quantum mechanics course is maybe not the best model. I think it better to think of the quantum wave as distributed in space, with the position of a particle just a probability for its occurrence, evaluated as ⟨x⟩ = ∫d³x ψ*xψ. This means a particle in the black hole has a wave function that is distributed through the black hole and has some amplitude beyond the event horizon.

What the quantum tunneling “fights against” is less a huge potential than it is the flow of space. In a space-plus-time model of relativity we can think of space as evolving so that points flow into the black hole, which in turn frame-drags particles with it into the black hole. Take a look at the JILA website on black holes http://jila.colorado.edu/~ajsh/insidebh/waterfall.html where this page discusses this in the context of the waterfall as a flow of space. The wave function of a particle may extend beyond the horizon, but because the flow of space inside the black hole is faster than the speed of light, there is no causal connection between this part of the wave function and the interior. This is why Hawking radiation comes out in a manner that appears highly random. This is entanglement entropy that can only be removed if a signal is communicated from the interior to the outside, concerning the state of the entangled pair. This would be the “entanglement swap” or teleportation of the particle state in the interior. Think of how, in normal space, a classical signal is needed to orient a Stern-Gerlach apparatus appropriately so the receiver can receive a teleported state. However, the horizon makes that impossible. The entropy here is then an entanglement entropy that in a coarse-grained setting translates into a thermal entropy.

A lot of these arguments against this experimental science are too harsh. In a biomedical setting the experiments are conducted on mice or other “critters” before they are done on humans. The efficacy of some therapy or drug is measured on the animal model before going to human subjects. Folks, this is a physics analogue of just that.

The firewall paradox isn't specific to string theory, right? It's a potential problem for any theory describing the late stages of black hole evaporation?

I believe the actual situation in string theory is that there isn't any clear description of black hole evaporation proceeding to its end. The black holes that *are* understood microscopically are eternal black holes that are in equilibrium with a surrounding gas.

So within string theory, the "firewall debate" consists of a clash of ideas regarding how the theory *might* describe the late stages of evaporation. But only a minority of string theorists believe in an actual firewall. That view got promoted because Joseph Polchinski advocated it. Others think the paradox is avoided differently, e.g. through subtleties of the fuzzball description, or through an algebraic subtlety regarding operators behind the horizon.

@Mitchell: The firewall is just a signature of a theoretical obstruction, and it is not specific to string theory. If you do not have a firewall, then unitarity is violated at the Page time, when the entanglement entropy of the black hole has increased to the entire Bekenstein entropy. If the black hole continues to emit Hawking radiation, then entanglement symmetry is violated and the Bekenstein bound is broken. You can look at the problem either way, and it suggests there is some relationship between gravitation and quantum mechanics. The two obstructions are equivalent. If you impose the firewall you can recover unitarity, but you violate the equivalence principle by making the horizon into some sort of singularity. So the firewall removes an obstruction to quantum mechanics by imposing one in spacetime.

In fact the Ryu-Takayanagi (RT) formula might come to bear on this, and it might suggest the firewall is not hot, but maybe very cold! In the BTZ black hole it is not hard to show that the extremal condition, where the angular momentum parameter equals the mass, a = m, is zero temperature. The same holds in four dimensions. So the Page time might be the endpoint of a phase in the quantum evolution of a black hole where the BPS charge on the stretched horizon is extremal and the black hole becomes something equivalent to a Bose-Einstein condensate. So rather than continually heating up as the horizon area decreases, the temperature rises only up to the point where the black hole's entropy approaches equality with the entanglement entropy of the Hawking radiation, and then it goes cold. This would be a sort of phase transition for the black hole. Just as Bose-Einstein condensates have a finite temperature, so too would be the case here, so Hawking radiation would continue to be emitted, but the quantum or stat-mech phase or state of the black hole has changed.
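The claim that the extremal limit is cold is easy to check numerically. Below is a minimal sketch (my own illustration, not part of the comment) of the standard Kerr Hawking temperature in geometric units G = c = ħ = k_B = 1; it vanishes exactly when the spin parameter a reaches the mass m:

```python
import math

def kerr_temperature(m, a):
    """Hawking temperature T = kappa/(2*pi) of a Kerr black hole,
    in geometric units (G = c = hbar = k_B = 1)."""
    if a > m:
        raise ValueError("a > m has no horizon (naked singularity)")
    r_plus = m + math.sqrt(m**2 - a**2)   # outer horizon radius
    r_minus = m - math.sqrt(m**2 - a**2)  # inner horizon radius
    kappa = (r_plus - r_minus) / (2 * (r_plus**2 + a**2))  # surface gravity
    return kappa / (2 * math.pi)

# The temperature drops smoothly to zero at the extremal point a = m:
for a in (0.0, 0.5, 0.9, 0.999, 1.0):
    print(f"a/m = {a:5.3f}   T = {kerr_temperature(1.0, a):.6f}")
```

For a = 0 this reduces to the Schwarzschild value T = 1/(8πm), and at a = m it is exactly zero, which is the "goes cold" endpoint described above.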

I take long walks in the woods with my dogs thinking about stuff like this, such as how wormholes and chronology violation in spacetime are equivalent to violating the no-cloning theorem of quantum mechanics. Hey, at least it keeps my mind off the rubbish politics of the day! This all points to deep equivalences between spacetime and quantum mechanics, and if there is some complementarity or duality principle the two above obstructions might be removed and some general understanding made in their stead.

Lawrence,

The proof for the firewall is just wrong. Look, I wrote a paper about this: it really is not all that hard to see. What makes the four assumptions incompatible is a fifth assumption that the original paper uses (it's stated in the appendix). That assumption is that the state is "typical". It is this assumption which requires that energy conservation is violated at the horizon, leading to a "firewall". Drop this assumption, and everything works fine. An explicit example is in the paper.

I read through your paper, though not in complete detail. I would say as a general comment that it is not usually the case that a theory is disproven by another theory.

Sections 1 through 3 read similarly to Wald's book on black hole thermodynamics and quantum mechanics. Section 4 is where you make your assumption of a reflecting boundary condition. This permits an observer at I^+ to have information from I^- as well. In the treatment of qubits it appears this acts as a sort of beam splitter, and quantum information is then not lost. I have no major fuss with this, though it seems to me this assumption of a reflecting surface is a way of replacing a firewall with a permeable semi-reflecting wall. If the stretched horizon is reflecting, then this is a bit of a “bump” in spacetime that adulterates the equivalence principle. This should reflect all quantum states. In effect a massive particle in the zitterbewegung picture is a massless particle scattered into a small region, where the scattering might be done by the Goldstone boson the particle has absorbed. So this too should partially reflect, and ultimately the same goes for all things. So everything that enters becomes a superposed state of infalling and reflected components. This does have the advantage of replacing a hot violation of the equivalence principle with a less violent one.

A black hole evaporates in a time proportional to the cube of its mass. The Page time, when this entanglement problem becomes apparent, sets in at around half the mass of the black hole. So for a solar mass black hole the evaporation time is 10^{67} years, and one with half that mass would evaporate in about 10^{66} years. So at about 0.9 of the evaporation time for a solar mass black hole, the “new physics” should kick in. Whether this is a firewall, a partial mirror or beam splitter, or maybe a BEC “ice wall”, it should be apparent around this time. A newly formed black hole should not require any new physics, for there is no problem with entanglements. So as you admit, there is some unknown physics with respect to this mirror. This is really where the questions lie.
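Since the lifetime scales as the cube of the mass, the "about 0.9" fraction follows from arithmetic alone. A quick sketch (the 10^67-year figure for a solar mass is the one quoted above; the function name is my own):

```python
def evaporation_time(mass, t_solar=1e67):
    """Evaporation time in years for a black hole of the given mass
    (in solar masses), using the t ~ M^3 scaling."""
    return t_solar * mass**3

t_full = evaporation_time(1.0)   # full lifetime of a solar-mass hole
t_half = evaporation_time(0.5)   # remaining lifetime once half the mass is gone
page_fraction = (t_full - t_half) / t_full
print(page_fraction)  # 0.875, i.e. roughly 0.9 of the way through the evaporation
```

So the hole spends seven eighths of its life shedding the first half of its mass, and any Page-time physics would only show up in the final eighth.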

In pondering this mirror, it does appear there must be some sort of nonlocal physics at work. This might be similar to Hanbury Brown and Twiss physics, or something related such as Wheeler's delayed choice experiment. The mirror is most important with the influx of matter, such as in the grey area. The region ν > 0 pertains to after the caustic that generates the stretched horizon, and ν < ν_0 to the last influx of quantum states. For the implosion of a star this is a lot of hot and dense material imploding inwards. If there is some nonlocality that makes the partially reflected photon or other particle appear at I^+, this might then avoid this problem. It would be similar to the HBT effect, where two different photons are entangled at the detector; the detection is an ex post facto determinant of entanglement. In this case the stretched horizon and this “reflection” are a sort of quantum error correction based on some entanglement or nonlocal physics with states at I^+.

Lawrence,

Thanks for looking at the paper, but I am afraid you misunderstand this. It is emphatically *not* a mirror. As I explain in the paper, the disentanglement swap does nothing to incident particles that don't have exactly the same entanglement that you find in the vacuum. As a consequence, the disentanglement acts on the vacuum only. It has no observable consequences besides changing the vacuum structure. Infalling particles are entirely unaffected.

Indeed, it's a rather philosophical construction since the vacuum state is chosen by assumption to begin with. So you could equally well say I am merely assuming a different vacuum (and showing how to explicitly construct it).

This construction does maintain the equivalence principle in the commonly used sense: Deviations from flat space become relevant only at the curvature radius. There are various variants of the equivalence principle and actually it's not all that clear exactly what it means for a qft to obey it, but you will find that the construction in the paper obeys it as well as Hawking's original calculation.

The swap itself is entirely local. What I suspect requires non-local conditions is if you want to also encode information in the radiation. Best,

B.

The trans-Planckian problem of the Hawking derivation does not go away, and cannot:

In the dumb hole analog, you have a continuous flow of what is the analog of the "ether". Thus, you have a changing background, which is what you need to get a non-trivial Bogoliubov transformation.

To have such a continuous flow, you need a continuation of the collapse toward the Schwarzschild radius plus epsilon, with epsilon becoming exponentially arbitrarily small with time. If you stop the collapse at r_S + l_Planck, or at r_S + 10^{-1000}l_Planck, the Hawking radiation stops after a few seconds or so. The result is robust and unavoidable: stable stars do not Hawking-radiate.

If one thinks that semiclassical theory (we have nothing better) is sufficient to predict what happens with the collapsing star when the surface time dilation is so large that a Planck time on the surface equals 10^{1000} ages of the universe, so be it. I think Hawking radiation is plausible only during the collapse itself, and about what happens a second later nothing is known.
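As a rough order-of-magnitude check of the claim that the radiation shuts off quickly: because the redshift grows exponentially during collapse, freezing the surface at r_S + ε cuts the emission off after a time of order (r_S/c) ln(r_S/ε). Both this logarithmic scaling and the code below are a hedged sketch of my own, not part of the comment; the logarithm is taken directly to avoid floating-point underflow for ε = 10^{-1000} l_Planck:

```python
import math

c   = 2.998e8      # speed of light, m/s
l_P = 1.616e-35    # Planck length, m
r_S = 2.95e3       # Schwarzschild radius of a solar-mass hole, m

def cutoff_time(log10_eps_over_lP):
    """Time ~ (r_S/c) * ln(r_S/eps) after which the emission shuts off,
    for eps = 10**k * l_P; k is passed as a log10 to dodge underflow."""
    ln_eps = math.log(l_P) + log10_eps_over_lP * math.log(10.0)
    return (r_S / c) * (math.log(r_S) - ln_eps)

print(f"freeze at r_S + l_P:           t ~ {cutoff_time(0):.2e} s")
print(f"freeze at r_S + 1e-1000 l_P:   t ~ {cutoff_time(-1000):.2e} s")
```

Even the absurdly small 10^{-1000} l_Planck offset only stretches the emission by the ratio of the logarithms, keeping it at the seconds-or-less scale the comment describes.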

I have read your paper through again, but time limitations have prevented hard study; really working through it intensively might have to wait for Saturday. I have some observations, and your comment at the end of your last post is in part what I am driving at:

"The swap itself is entirely local. What I suspect requires non-local conditions is if you want to also encode information in the radiation."

The C matrix or operator prepares the quantum state into the vacuum you “want,” and this gives this sort of vacuum beam splitter.

My speculation then is that there is information encoded in the vacuum at I^+. This may be gravitational radiation or BMS symmetry. This would then save us from the problem of entanglement loss. This C operator, or a form of C-NOT operation that demolishes entanglement, is a macroscopic process. In effect entanglement is buried away in a complex macroscopic environment or a reservoir of states. Van Raamsdonk has proposed that spacetime can in fact be just this. As such, this entanglement problem with the firewall just means we fail to account for an entanglement with I^+ that is there all along. Then, when the Page time is reached, we conclude there is a change in entanglement with the old Hawking radiation. Your C operation demolishes this entanglement, but in fact it is just buried away.

This would be required if we think that quantum phases are conserved. Unitary evolution conserves quantum phase, or equivalently superpositions and entanglements, and decoherence is a process thought to involve the loss of such phases to a reservoir of states. If we think of quantum mechanics as being the ultimate bedrock, everything is ultimately unitary, then entanglement demolition just means entanglement is transferred elsewhere and “hidden away.” If so then it makes some reasonable sense that I^+ is entangled nonlocally with the stretched horizon of black holes. The information that reaches I^+ is the vacuum configuration of a gravitational wave. This would amount to a quantum mechanical treatment of Hawking radiation back reaction.

Lawrence,

I think what you say is a possible way to interpret it. I am phrasing this so carefully because operationally it's a strange thing to say that information is encoded at I^+. I guess I'd say instead that the vacuum state isn't entirely determined by Hawking's original requirements which boil down to no particles at I^- and no firewall at the horizon. There is instead a huge number of vacua which fulfill these properties, and you can use those (at least in principle) to encode information. The operation that I construct in the paper is one example for how to do that.

The reason you can do this is that the requirement that there's no firewall at the horizon is a requirement about the spectral energy density and not a requirement about the state. There are ways to change the entanglement in the state that don't change the spectral energy density, hence the degeneracy.

Yes, it may have something to do with the BMS charges. I have considered this at some point but then I couldn't see any obvious connection. But tbh I didn't think about it all that much. If you see any link there, I'd be interested to hear. Best,

B.

You wrote:

"Yes, it may have something to do with the BMS charges. I have considered this at some point but then I couldn't see any obvious connection. But tbh I didn't think about it all that much. If you see any link there, I'd be interested to hear."

This is music to my ears. This is entirely along the lines I have been thinking.

I have to break this up due to limitations. I hope this is not too inconvenient.

What I have been thinking along these lines involves an aspect of vacua and black holes. There is the Boulware vacuum and other oddities, such as the indeterminacy between vacuum and particles. An accelerated observer observes themselves to be in a spacetime that appears to be an anti-de Sitter spacetime, with energy expectation value ≤ 0 in the Boulware vacuum |B⟩. The spacetime near a black hole is similar to an AdS spacetime, and the curvature of this is negative with Λ < 0, so the vacuum is Boulware. Carroll and Randall showed the occurrence of AdS_2×S^2 with extremal black holes in the region r_- < r < r_+ in the limit r_+ - r_- → 0 is a case where the Boulware vacuum exists, and is extended into the interior.

Black holes define for Rindler wedges an infinite number of possible vacua. A struggle has been to understand their relationship. This may even be a problem with string theory, for the bosonic string has a negative energy vacuum and a tachyon state. While the superstring removes the tachyon, the theory is still formulated preferentially in a background with either a flat Minkowski or AdS metric. For supersymmetry this does make sense, since the Hamiltonian H = ½{Q, Q̄} vanishes for unbroken symmetry and is positive, H > 0, when the symmetry is broken. There are also the various string types, the Type I and Type IIA and IIB superstrings and the heterotic strings, with different vacua. M-theory argues that these are all related to each other by S, T, and U dualities, but an explicit transformation principle is not known.

This matter with the horizon, and by extension the vacuum, might be seen with the two-slit experiment and the Heisenberg microscope argument. The two-slit experiment wave function is a superposition of states through the two slits. An ensemble of experiments produces a wave pattern on the detecting plate. Now consider the resolution of a screen and the Planck length. Suppose a wave emerges from a spherical screen of radius R with pixel size ∆x. The information reaches the center in T = R/c. The total number of pixels on this screen is 4πR^2/∆x^2; requiring that these are transmitted in a time 2R/c at the Planck rate 1/T_p gives ∆x = sqrt{2πRcT_p}, which is very small but much larger than the Planck scale. For the two-slit experiment the resolution of the two slits is set by a limit

∆x = sqrt{DcT_p}

for D = 2πR. For smaller slit separations the double-slit experiment is indistinguishable from a single-slit experiment.

The angular uncertainty in the wave is ∆θ = ∆x/D = sqrt{cT_p/D}, and this uncertainty decreases at larger distance; direction has a clearer physical meaning at larger scales. The mean square uncertainty in distance becomes

⟨∆x^2⟩ = DcT_p/sqrt{2π},

which is considerably larger than the Planck area.
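Plugging numbers into these formulas makes the scales concrete. A small sketch (the constants and the one-meter screen radius are my own choices for illustration):

```python
import math

c   = 2.998e8     # speed of light, m/s
T_p = 5.391e-44   # Planck time, s
l_P = 1.616e-35   # Planck length, m (for comparison)

def pixel_size(R):
    """Delta-x = sqrt(2*pi*R*c*T_p): smallest pixel on a holographic
    screen of radius R, per the estimate above."""
    return math.sqrt(2 * math.pi * R * c * T_p)

def angular_uncertainty(R):
    """Delta-theta = Delta-x / D with D = 2*pi*R."""
    return pixel_size(R) / (2 * math.pi * R)

R = 1.0  # a one-meter screen
dx = pixel_size(R)
print(f"pixel size:   {dx:.2e} m  ({dx / l_P:.1e} Planck lengths)")
print(f"angular unc.: {angular_uncertainty(R):.2e} rad")
```

For a one-meter screen the pixel size comes out around 10^{-17} m: tiny, but some eighteen orders of magnitude above the Planck length, just as the argument above requires.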

Distance emerges at large scales with an increased number of Planck units of information on a screen. If physics could be measured on the Planck scale, the angular uncertainty would be huge. This is the Heisenberg microscope, which illustrates how spatial directions on the Planck scale are not well defined. What is important is the topology induced by the slits: a 2D holographic system. The screen and slits define “topological logic gate” information about spacetime. This is a double-slit version of how entanglement gives rise to geometry.

In this approach distances or even paths are not relevant, but rather topologically distinct settings of a logic gate. The fundamentally important information is not the position in space or time of a particle, but rather the “processing” of qubits according to logic gates with topological features. The wave function spreads across the horizon, so it is impossible to determine which gate does the processing, but each gate is entangled with a qubit for the particle escaping to infinity. Every particle interacting with the black hole has an amplitude for escape to infinity, which can include the emission of Hawking radiation. A larger horizon area means the holographic screen has a greater resolution: the angular uncertainty ∆θ ≈ sqrt{cT_p/D} decreases. The black hole becomes more classical and the geometry of gravitation assumes more meaning.

The process of looking ever closer at the horizon is a matter of choosing a different vacuum. In the two-slit experiment, working in different frames changes the resolution, but it also means a different vacuum structure. This scaling associated with classicality, or coarser resolution, means there is a scale D >> l_p, the Planck scale, that involves many qubits. Each Planck unit on the stretched horizon contains a qubit associated with a piece of information the black hole has absorbed. The black hole as a quantum system is built from a large number of entanglement swaps from EPR pairs to the black hole. The black hole is then composed of massive entanglement; observing closer to the horizon means changing the vacuum associated with a frame, and doing so means the observer is performing more fine-grained observations with greater resolution.

Entanglement symmetries are of the form C^2×C^2× ... ×C^2/SL(2,R)×SL(2,R)×...×SL(2,R) or SU(n)/SU(n-m)×SU(m) = G_m(C^n), with the numerator forming the isometry group. A large entanglement has an isometry of the form SU(N) for N → ∞. This has connections to the AdS/CFT correspondence, which equates a large N for a conformal field theory ~ SO(N) or SU(N). The case N → ∞ defines the Bott periodicity with π_{n+8}SU(∞) = π_nSU(∞). The process of building up entanglements also follows a similar pattern, with bipartite entanglements pairing up into GHZ entangled states, those into exceptional E8, and so forth. This is an 8-fold cyclic homotopy equivalence. Connes proposed a similar 8-fold cyclic symmetry for general gamma matrices, and this may be connected.

BMS symmetry describes the displacement of test masses after the passage of a gravitational wave. The BMS group is a semidirect product of the Lorentz group with a group of displacements or rotations. The simplest case is an abelian set of translations. So the BMS group is

BMS = SO(3,1)⋉ G

for G defined by the quotient G = BMS/SO(3,1). BMS symmetry, most often thought of as an abelian group, is in general a quotient space. For the isometry group SO(3,2), AdS_4 = SO(3,2)/SO(3,1). This in general connects the CFT symmetry on the boundary of AdS with the bulk gravity given by AdS_n, for more general dimensions n. This provides a connection between the gravitational bulk, made from entanglement of a large number of states, and the conformal field theory on the boundary as SU(N) for N → ∞.

This then connects with black holes, in particular with the black hole-AdS equivalency, and with the I^+ of a dS spacetime. It means there is a correspondence between spacetime built from large-N states in entanglement and the quantum states on I^+. There is then an identification between the two, or some form of entanglement. In this way, the entanglement that would occur with the third state with a black hole is identified with that on I^+. This is in some ways a form of the ER = EPR that Susskind advocates.

There is a lot here I have worked out, where in part this involves Hermitian symmetric spaces. There are gaps as well, and this is not complete. However, I think the basis you identify is one where this identification is clear. In part I think that physics has had a prejudice against accelerated frames, and I think this means there is a fundamental equivalence between accelerated frames. This is a part of the Bott periodicity; N scales the level of resolution, but the physics is the same.

Anyway this is a sketch. This format is not entirely convenient.

Sabine,

Your paper is an excellent proof that BH _can_ preserve information without a firewall!

I would think that this point is better made by Samir Mathur, who used (causal) string theory, rather than standard particle theory, to eliminate the firewall. So we also have a theory which explains _how_ to avoid the firewall!

I would think, therefore, that you would have commented on his work, especially his essay in the recent FQXi contest?

He is still frustrated by the inability of a (*) string to model information, and thus preserve it. I point out that this problem has a solution (* which you don't want me to mention here?). I can, however, point out where Green, Schwarz and Witten went astray...

Anyway, I'd like to help you be less lost in the math... by shedding some light on fundamental combinatorial geometry issues.

Wayne Lundberg

@WRL,

Indeed this is the case. The only issue I can see is that it depends on a particular basis or coordinate condition. That is one point of what I wrote above. Bee indicated there might be some nonlocal connection to BMS symmetry weights or charges. This is something I have been pursuing for a couple of years. This should hold for more general vacua.

This is an old thread now, but the salient statement in the NYT piece on Hawking's last paper https://www.nytimes.com/2018/10/23/science/stephen-hawking-final-paper.html is:

"Recent years have brought a glimmer of hope. Andrew Strominger of Harvard discovered that, when viewed from the right mathematical perspective — that of a light ray headed toward the infinite future — black holes are more complicated than we thought. They have what Dr. Strominger has called “soft hair,” in the form of those imaginary light rays, which can be ruffled, stroked, twisted and otherwise arranged by material coming into the black hole. In principle, this hair could encode information on the surface of the black hole, recording all those details that Einstein’s equations supposedly leave out."

The relevant paper is below. This soft hair is the black hole horizon dual of BMS charges at I^+, or future null infinity.

LC

https://arxiv.org/abs/1810.01847

Black Hole Entropy and Soft Hair

Sasha Haco, Stephen W. Hawking, Malcolm J. Perry, Andrew Strominger

(Submitted on 3 Oct 2018 (v1), last revised 16 Oct 2018 (this version, v3))

A set of infinitesimal Virasoro_L ⊗ Virasoro_R diffeomorphisms are presented which act non-trivially on the horizon of a generic Kerr black hole with spin J. The covariant phase space formalism provides a formula for the Virasoro charges as surface integrals on the horizon. Integrability and associativity of the charge algebra are shown to require the inclusion of `Wald-Zoupas' counterterms. A counterterm satisfying the known consistency requirement is constructed and yields central charges c_L = c_R = 12J. Assuming the existence of a quantum Hilbert space on which these charges generate the symmetries, as well as the applicability of the Cardy formula, the central charges reproduce the macroscopic area-entropy law for generic Kerr black holes.

Subjects: High Energy Physics - Theory (hep-th)

@Lawrence Crowell,

Sorry for the long delay, my work had intervened a bit.

I had to look up the BMS supertranslations (direction-dependent time offsets). The direction-dependence implies some geometry, as does the use of the BH surface. So it seems that the BH surface is at a time-stopped conformal interface with the 'info-preserved' portion of the BH, its 'interior'. But "preservation" should be, as the supertranslations suggest, a mechanism which includes time-retarded information - at dimensions less than alpha-prime.

So it is obvious to wonder about that interface...

Consider: what is the simplest geometric solid that is space-filling in 3D?

The area of one face of a tetrahedron is 1/4 of its total surface area, which is, conveniently, the '1/4 area' factor in the Bekenstein-Hawking area entropy law, btw.
