Tuesday, February 07, 2012

Hogan's holographic noise doesn't exist

In recent years, the name of Craig Hogan – the proponent of the "holographic noise" – has appeared several times on TRF. A year ago, I discussed a BBC program that showed Craig Hogan's Holometer or Hoganmeter, a rather large and fancy apparatus given that the effect clearly didn't exist.

Interestingly enough, in 2005, Craig Hogan et al. also took photographs of CSL-1, a celestial object that was conjectured to be a cosmic string, an interpretation that turned out to be invalid. Many of us were kind of excited; I liked to reproduce Joe Polchinski's estimate that the probability of finding a cosmic string in the near future was around 10 percent. Well, we remained sensible and the object wasn't what people hoped it would be.

But let me return to the Hogan noise. Backreaction just told us that some noise from the GEO600 gravitational wave interferometer – the noise which placed Hogan on the BBC show etc. – has disappeared. Not a surprise.

Several readers have asked me to write about this conjecture over the years and I always resisted. Hogan's noise is non-existent, as I will discuss below. However, he likes the holographic principle which is a noble idea. Moreover, almost no one paid attention to Hogan's papers and I wanted the situation to stay that way. Every invalid paper that keeps the status it deserves – to be overlooked – is a relief in a world where we're drowning in an ocean of crackpottery.

A majority of the theoretical physics community has gradually been morphing into an amalgam of cranks vigorously promoting their own pet crackpot theories and paying no attention to the insights that have actually been achieved by the scientific method and that form the bulk of the reason why physics is the Empress of the Natural Sciences. These cranks spend much of their time contacting the mainstream media, other tabloids, and sources of dumb propaganda in general; silly and dishonest P.R. battles are really what their existence boils down to. This process is known as the Smolinization of theoretical physics.

At any rate, one may look e.g. at this 2009 paper by Hogan. Much like the authors of various other papers, Hogan wants to find and sell his own version of an uncertainty principle, one that is supposedly implied by quantum gravity. Again, he may say that certain observables don't commute with each other and imply the existence of a new fog or a new noise. The only problem is that unlike the Heisenberg uncertainty, the Hogan uncertainty doesn't exist.

In quantum mechanics, we have commutators like
\[ [x,p ] = i\hbar \] This implies the uncertainty principle i.e. the inequality
\[ \Delta x \cdot \Delta p \geq \frac{\hbar}{2} \] In any quantum system, we find complementary observables whose commutators are proportional to \(\hbar\) multiplied by a quantity that was considered "finite" in the classical theory (or classical limit). In all these cases, we know what this implies for the inequalities.
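
To get a feel for the saturated case – a minimal numerical sketch of my own, not anything from Hogan's paper – one may check that a Gaussian wave packet saturates the Heisenberg bound:

```python
# Sanity check (mine, not from the post): a Gaussian wave packet
# saturates the Heisenberg bound  dx * dp = hbar/2  (hbar = 1 here).
import numpy as np

hbar = 1.0
sigma = 0.7                                # arbitrary position-space width
x = np.linspace(-40, 40, 2**14)
dx_grid = x[1] - x[0]

psi = np.exp(-x**2 / (4 * sigma**2))       # Gaussian centered at x = 0
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx_grid)        # normalize

# position uncertainty
var_x = np.sum(x**2 * np.abs(psi)**2) * dx_grid

# momentum-space wave function via FFT; p = hbar * k
k = 2 * np.pi * np.fft.fftfreq(x.size, d=dx_grid)
dk = k[1] - k[0]
phi = np.fft.fft(psi)
phi /= np.sqrt(np.sum(np.abs(phi)**2) * dk)             # normalize
var_p = np.sum((hbar * k)**2 * np.abs(phi)**2) * dk

print(np.sqrt(var_x * var_p))              # ~ 0.5 = hbar/2
```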

Stringy inequalities

Are there additional inequalities that don't have this form? Does string theory or quantum gravity add a completely new class of similar inequalities? This question has been studied – with some obvious "positive" prejudices – by various smart folks such as Yoneya. In perturbative string theory, people would end up with inequalities like
\[ \Delta x \cdot \Delta t \geq \alpha' \] where \(\alpha'\) is \(1/(2\pi T)\), the inverse string tension (up to the numerical constant). This inequality was viewed as an important one – but only until the Second Superstring Revolution which showed that the fundamental strings are just one type of object in string/M-theory and there are many others that may become equally or more "fundamental" in various other limits; the "fundamental" strings are only superior in a particular weakly coupled limit.
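
To see what the stringy inequality means in practice, here is a toy illustration (my own numbers in string units, not Yoneya's): improving the time resolution below the string time forces the spatial resolution above the string length \(\sqrt{\alpha'}\), and vice versa.

```python
# Toy illustration (my own, in c = hbar = 1 string units where the string
# length l_s = sqrt(alpha') = 1): the tradeoff implied by dx * dt >= alpha'.
alpha_prime = 1.0
for dt in [10.0, 1.0, 0.1]:
    dx_min = alpha_prime / dt
    print(f"dt = {dt:5.1f}  =>  dx >= {dx_min:5.1f}  (string units)")
# Shrinking the temporal resolution below the string time pushes the
# minimal spatial resolution above the string length, and vice versa.
```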

Despite these lessons we have learned from the dualities, the inequality above still holds in some sense – although you have to be careful about it. It even applies to some questions of D-brane physics even though D-brane physics apparently transcends the limitations of weakly coupled string theory. For example, D0-branes may resolve distances that are shorter than \(L_{\rm string}=\sqrt{\alpha'}\) but only if they move very slowly. The slow motion implies that one needs some time for them to move from one place to another. It follows that they can't measure too short periods of time.

Note that we still have \(\alpha'\) in the formula which is the inverse tension of the fundamental strings: so this constant and all inequalities in which the constant plays a privileged role depend on the choice of the particular objects, the fundamental strings. These inequalities inevitably violate the brane democracy, the qualitatively equal human rights of all kinds of strings and branes we find in string/M-theory. Brane democracy isn't a useful concept for weakly coupled calculations of any sort (in every limit, some objects are way more "fundamental" than others – usually the lightest objects are the elementary ones while all the heavier ones look like composites) but the brane democracy still seems to be a legitimate qualitative "moral" conclusion of the dualities.

Quantum gravity inequalities

Are there other inequalities where \(\alpha'\) is replaced by \(G\), i.e. Newton's constant? They could be relevant for effects of quantum gravity. Note that in the \(c=\hbar=1\) units, \(G\) has the dimension of \({\rm Length}^{D-2}\) where \(D\) is the spacetime dimension: recall that the Einstein-Hilbert action \(\int d^D x\,\sqrt{-g}\,R/16\pi G\) has to be dimensionless in these units while \(R\) scales like \(1/{\rm Length}^2\).

Well, there exist such effects but they're very different from the simple uncertainty principle due to Heisenberg. All of them ultimately boil down to the fact that the entropy of event horizons (e.g. those of black holes)
\[ S = \frac{A}{4G} \] is proportional to the area of surfaces in the Planck units, up to the coefficient of one quarter. Note that \(A\) has the right dimension \({\rm Length}^{D-2}\), too: one dimension disappears because it's the time (and we're talking about surfaces at one instant of time) and another one because we're talking about surfaces.
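
To appreciate the magnitudes, here is a quick numerical check of my own (standard textbook constants, nothing taken from Hogan's papers): the Bekenstein-Hawking entropy of a solar-mass black hole in \(D=4\).

```python
# Quick check (mine): S = A / (4 G), i.e. A / (4 l_P^2) in hbar = c = k_B = 1
# units, evaluated for a solar-mass black hole in D = 4.
import math

G    = 6.674e-11        # m^3 kg^-1 s^-2
c    = 2.998e8          # m/s
hbar = 1.055e-34        # J s
M    = 1.989e30         # kg, one solar mass

l_P = math.sqrt(hbar * G / c**3)      # Planck length ~ 1.6e-35 m
R   = 2 * G * M / c**2                # Schwarzschild radius ~ 3 km
A   = 4 * math.pi * R**2              # horizon area

S = A / (4 * l_P**2)                  # entropy in nats
print(f"R = {R:.3e} m, S ~ {S:.2e} nats")    # S ~ 1e77
```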

A cute heuristic application of this basic Hawking-Bekenstein formula may be used, following an argument due to your humble correspondent, to estimate the radius of a black hole. What the entropy formula really says is that \((D-2)\)-dimensional surfaces shouldn't be smaller – let alone much smaller – than \(G\) because entropy can't be smaller than one (or one nat, not far from one bit). To say the least, things become "strongly non-macroscopic and different" when we talk about situations in which the entropy is much smaller than one nat.

In quantum gravity, it is probably always misguided to talk about limits on surfaces whose dimension isn't \(D-2\) simply because Newton's constant only has this particular dimension. I believe that this simple observation is enough to de facto kill most of the "new uncertainty relationships" in the literature. Their authors use wrong eyeglasses, they see a fuzzy world, and they declare the fuzziness to be a new paramount law of physics: that's very silly, indeed! Constants such as \(\alpha'\) and \(G\) may be important in string theory and quantum gravity but that doesn't mean that every inequality – or "fuzzy" statement – that incorporates these constants is valid! ;-)

However, I may offer you a correct argument which may superficially look analogous to such nonsensical ideas but isn't.

Because of special relativity and/or the Wick rotation, the limitation on \((D-2)\)-dimensional surfaces should apply to timelike surfaces, too. So if you can resolve surfaces with \(D-3\) spatial dimensions and \(1\) temporal dimension – which is \(D-2\) spacetime dimensions in total – their area should never be smaller than one Planck area of the appropriate dimension. One may exploit this observation to estimate the radius of black holes.

Black holes of mass \(M\) have a wave function that oscillates with the periodicity \(1/M\) (in \(\hbar=c=1\) units): recall what Schrödinger's equation tells us in combination with \(E=Mc^2\). In the temporal dimension, the interference pattern of the wave function for the black hole's center-of-mass degrees of freedom may have (as a function of time) an extent as small as \(\Delta t\sim 1/M\). However, the \((D-2)\)-dimensional surfaces can't be smaller than one in Planck units. So we have
\[ \Delta t\cdot \Delta R^{D-3} \geq G \] where we approximated the \((D-3)\)-dimensional areas by an estimate which is the appropriate power of the linear dimensions \(\Delta R\). If we combine the inequality above with \(\Delta t \sim 1/M\), we get
\[ \Delta R^{D-3} \geq GM, \qquad \frac{GM}{\Delta R^{D-3}} \leq 1 \] Up to unknown numerical constants, we learn the mass-radius relationship for neutral black holes. The last inequality really tells us that the black hole is the most massive object that may be squeezed into a given volume. It works in any dimension. For example, in \(D=4\), we have reproduced \(R=2GM/c^2\) up to the numerical constant. The internal size of the black hole has to be this increasing function of the mass; otherwise we would effectively deal with sub-Planckian surfaces or entropies smaller than one – with a regime where physics is qualitatively different from the macroscopic and classical physics and geometry that we know.
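
A quick check of my own that the heuristic bound lands in the right ballpark in \(D=4\): restoring the factors of \(c\), the bound \(\Delta R \geq GM/c^2\) differs from the exact Schwarzschild radius \(R=2GM/c^2\) only by the expected \(O(1)\) factor.

```python
# Sketch (my own check, not in the post): in D = 4 the heuristic bound
# dR >= G*M/c^2 should reproduce the Schwarzschild radius up to O(1).
G = 6.674e-11           # m^3 kg^-1 s^-2
c = 2.998e8             # m/s
M = 1.989e30            # kg, one solar mass

dR_heuristic    = G * M / c**2          # ~ 1.5 km
R_schwarzschild = 2 * G * M / c**2      # ~ 3.0 km
print(dR_heuristic, R_schwarzschild, R_schwarzschild / dR_heuristic)
```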

Hogan and Matrix theory

My argument above was valid albeit very heuristic in character. After all, we have other methods to show that black holes of a given mass can't be made "even smaller". However, it apparently doesn't imply any new effects that could be observed. For masses and energies we can concentrate into a region, the corresponding Schwarzschild radius is so tiny that it is unobservable. Is there a way to make such effects observable?

Craig Hogan's strategy is to try to scale the "holographic effects" with a new macroscopic distance \(L\) which may really be astronomical. Effectively, he thinks that there are also similar "effects accumulated over large distances or time" that become observable.

He uses the BFSS Matrix Theory. You know, I know quite a lot about that theory and I can assure you that the nonzero commutators and inequalities that Hogan talks about don't hold in Matrix theory. For example, in his paper he talks about commutators such as the "noncommutative geometry-like"
\[ [x_1,x_2] = -ic t_{12} \lambda_P. \] Consequently, he ends up with inequalities of the type
\[ \Delta x_{\rm transverse}^2 \geq L \lambda_P \] which are meant to create an inevitable transverse noise for waves that have propagated over a distance \(L\) (here \(\lambda_P\) is the Planck length), among many other things.
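
Taking Hogan's claimed inequality at face value – and I emphasize it is his claim, which this post argues is wrong – one may plug in the GEO600 scale to see what noise amplitude it would predict; the arm length of roughly 600 meters is my own assumed input for this estimate.

```python
# Plugging numbers into Hogan's *claimed* inequality (his claim, not mine;
# the post argues it is invalid): dx_transverse^2 >= L * lambda_P, with
# L ~ 600 m (assumed GEO600 arm length) and lambda_P the Planck length.
import math

lambda_P = 1.616e-35    # Planck length, m
L = 600.0               # propagation distance, m (assumption)

dx = math.sqrt(L * lambda_P)
print(f"claimed transverse noise ~ {dx:.1e} m")     # ~ 1e-16 m
```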

Except that one may look at the actual equations of Matrix theory and see that none of these commutators is nonzero. The positions commute with each other and supersymmetry guarantees that all the "classically allowed positions of D0-branes and their bound states" are allowed at the quantum level, too. The last displayed inequality above obviously can't be a consequence of quantum gravity because it doesn't depend on \(G\) at all! However, in the \(G\to 0\) limit, one must reproduce non-gravitational physics in the flat Euclidean background spacetime. Hogan's rules don't have the right limit so they can't be right.

Like Mr Ng, Mr Smolin, Mr Amelino-Camelia, Ms Hossenfelder, and tons of other deeply confused people, Hogan is really trying to promote the fuzziness of his own understanding of physics into a universal law of physics. But Nature doesn't work in this way. Nature respects well-defined rules – and usually produces unambiguous (albeit probabilistic) predictions – even if people are unable to do so. And things like momentum conservation hold exactly – even in quantum gravity applied to asymptotically flat spacetimes.

Hogan may also be a victim of an incorrect interpretation of commutators of observables at different times. Because the evolution of each observable depends on others (in the Heisenberg picture), these commutators are generally nonzero. But these non-vanishing commutators don't imply any "unpredictability". These observables can't be measured simultaneously but we don't need quantum mechanics to see that they can't: they can't be measured simultaneously because the observable at a later time is, by assumption, measured at a later time than the observable at the earlier time. ;-)

What the nonzero commutator means is that the exact knowledge of the "earlier time" observable (i.e. the choice of its eigenstate as the initial state) isn't enough to unambiguously predict the observed value of the "later time" observable. But that's not shocking. In these "large separation" experiments, the transverse position of the wave obviously depends primarily on the initial transverse momentum, not the initial transverse position. And when we know the initial momentum or velocity, we may predict the final position (after a long delay) almost precisely. These comments boil down to the ordinary physics of propagating wave packets; it is not quantum gravity. It is not even rocket science.
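
This can be made quantitative with the textbook formula for the spreading of a free Gaussian packet – ordinary quantum mechanics, no quantum gravity anywhere: at late times, the spread is dominated entirely by the initial momentum uncertainty, which is why knowing the initial momentum allows an almost precise prediction of the late position.

```python
# Textbook wave-packet mechanics (my illustration, no quantum gravity):
# a free Gaussian packet has dx(t) = dx0 * sqrt(1 + (hbar*t/(2*m*dx0^2))^2),
# which approaches (dp0/m)*t for large t -- the initial *position*
# uncertainty becomes irrelevant to the late-time position.
import math

hbar, m = 1.0, 1.0
dx0 = 1e-3                      # tiny initial position uncertainty
dp0 = hbar / (2 * dx0)          # conjugate momentum uncertainty
for t in [0.0, 1.0, 100.0]:
    dx_t = dx0 * math.sqrt(1 + (hbar * t / (2 * m * dx0**2))**2)
    print(f"t = {t:6.1f}: dx(t) = {dx_t:9.3e}, (dp0/m)*t = {dp0 * t / m:9.3e}")
# For large t the two columns agree: the initial momentum, not the initial
# position, is what determines the position after a long flight.
```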

So all the tricks that Hogan tries to add in order to convince himself (and/or others) that quantum gravity is adding some noise that grows with the separation are obviously invalid. One may suffer from the wishful thinking that quantum gravity effects could be directly observable "in the kitchen". Except that Nature doesn't care about such wishful thinking. They're not observable "in the kitchen". And this is also a very good thing because the world would collapse if such new effects influenced common events around us. Quantum gravity simply isn't a part of the cooking arts; it isn't a part of common laser interferometry, either. It is a very abstract discipline dealing with nearly inaccessible, extreme phenomena. It has always depended on meticulous maths, it still does, and it arguably always will.

That's also true for the black hole information puzzle and other issues. All the effects of quantum gravity – e.g. the ability of black holes to remember the information about the initial state, despite the causal diagram that makes it seemingly impossible to imprint this information into the Hawking radiation – are extremely small and invisible, hidden in an unmanageable entanglement of a very large number of degrees of freedom.

When we talk about exotic miracles attributed to the holographic principle, I find it much more plausible – although still more likely No than Yes – that the holographic principle could modify the motion of celestial bodies at very low accelerations because some interference effects start to work very differently. Such effects would occur because of astronomically long interference patterns seen in the holographic description of celestial bodies. But note that the effect only has a chance to exist here because the small value of \(G\) is incorporated into the very low accelerations of the celestial bodies (galaxies etc.).

If you perform experiments that are described by parameters of order one in the macroscopic (SI...) units, there can't simply exist any effects of quantum gravity. I think it makes no sense to try to explain the technical implications of Matrix theory in a blog because the number of people on this planet who understand Matrix theory at some technical level is of order one hundred and the vast majority of them don't read blogs.

But it may make sense to convey a broader point. And the broader point is that a vast majority of claims in the literature that quantum gravity may be observed "right behind the corner" are based on a rudimentary misunderstanding of some physical phenomena.


snail feedback (1) :


reader Plato said...

Hi Lubos,

If you perform experiments that are described by parameters of order one in the macroscopic (SI...) units, there can't simply exist any effects of quantum gravity.

Okay! Thanks.

Best,