Nature | News Feature

Theoretical physics: Complexity on the horizon

A concept developed for computer science could have a key role in fundamental physics — and point the way to a new understanding of space and time.

When physicist Leonard Susskind gives talks these days, he often wears a black T-shirt proclaiming “I ♥ Complexity”. In place of the heart is a Mandelbrot set, a fractal pattern widely recognized as a symbol for complexity at its most beautiful.

That pretty much sums up his message. The 74-year-old Susskind, a theorist at Stanford University in California, has long been a leader in efforts to unify quantum mechanics with the general theory of relativity — Albert Einstein's framework for gravity. The quest for the elusive unified theory has led him to advocate counter-intuitive ideas, such as superstring theory or the concept that our three-dimensional Universe is actually a two-dimensional hologram. But now he is part of a small group of researchers arguing for a new and equally odd idea: that the key to this mysterious theory of everything is to be found in the branch of computer science known as computational complexity.

This is not a subfield to which physicists have tended to look for fundamental insight. Computational complexity is grounded in practical matters, such as how many logical steps are required to execute an algorithm. But if the approach works, says Susskind, it could resolve one of the most baffling theoretical conundrums to hit his field in recent years: the black-hole firewall paradox, which seems to imply that either quantum mechanics or general relativity must be wrong. And more than that, he says, computational complexity could give theorists a whole new way to unify the two branches of their science — using ideas based fundamentally on information.

Behind a firewall

It all began 40 years ago, when physicist Stephen Hawking at the University of Cambridge, UK, realized that quantum effects would cause a black hole to radiate photons and other particles until it completely evaporates away.

As other researchers were quick to point out, this revelation brings a troubling contradiction. According to the rules of quantum mechanics, the outgoing stream of radiation has to retain information about everything that ever fell into the black hole, even as the matter falling in carries exactly the same information through the black hole's event horizon, the boundary inside which the black hole's gravity gets so strong that not even light can escape. Yet this two-way flow could violate a key law of quantum mechanics known as the no-cloning theorem, which dictates that making a perfect copy of quantum information is impossible.
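The no-cloning theorem itself takes only a few lines of linear algebra. As a sketch (not part of the original article): suppose some unitary operation $U$ could copy any two unknown states $|\psi\rangle$ and $|\phi\rangle$ onto a blank register $|0\rangle$,

```latex
U\,|\psi\rangle|0\rangle = |\psi\rangle|\psi\rangle
\qquad\text{and}\qquad
U\,|\phi\rangle|0\rangle = |\phi\rangle|\phi\rangle .
```

Taking the inner product of the two equations and using the fact that $U$ preserves inner products ($U^{\dagger}U = 1$) gives

```latex
\langle\phi|\psi\rangle \;=\; \langle\phi|\psi\rangle^{2},
```

so $\langle\phi|\psi\rangle$ must equal 0 or 1: only states that are identical or perfectly distinguishable can both be copied. No machine can clone an arbitrary unknown quantum state.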

Happily, as Susskind and his colleagues observed [1] in 1995, nature seemed to sidestep any such violation by making it impossible to see both copies at once: an observer who remains outside the horizon cannot communicate with one who has fallen in. But in 2012, four physicists at the University of California, Santa Barbara — Ahmed Almheiri, Donald Marolf, Joseph Polchinski and James Sully, known collectively as AMPS — spotted a dangerous exception to this rule [2]. They found a scenario in which an observer could decode the information in the radiation, jump into the black hole and then compare that information with its forbidden duplicate on the way down.

AMPS concluded that nature prevents this abomination by creating a blazing firewall just inside the horizon that will incinerate any observer — or indeed, any particle — trying to pass through. In effect, space would abruptly end at the horizon, even though Einstein's gravitational theory says that space must be perfectly continuous there. If AMPS's theory is true, says Raphael Bousso, a theoretical physicist at the University of California, Berkeley, “this is a terrible blow to general relativity”.

Does not compute

Fundamental physics has been in an uproar ever since, as practitioners have struggled to find a resolution to this paradox. The first people to bring computational complexity into the debate were Stanford’s Patrick Hayden, a physicist who also happens to be a computer scientist, and Daniel Harlow, a physicist at Princeton University in New Jersey. If the firewall argument hinges on an observer's ability to decode the outgoing radiation, they wondered, just how hard is that to do?

Impossibly hard, they discovered. A computational-complexity analysis showed that the number of steps required to decode the outgoing information would rise exponentially with the number of radiation particles that carry it. No conceivable computer could finish the calculations until long after the black hole had radiated all of its energy and vanished, along with the forbidden information clones. So the firewall has no reason to exist: the decoding scenario that demands it cannot happen, and the paradox disappears.
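The force of that argument comes from the gap between exponential and polynomial growth: whatever (polynomial) number of steps is available before evaporation, an exponential cost eventually outruns it. As a toy numerical illustration only — the specific cost and budget functions below are invented for this sketch, not taken from the Harlow–Hayden analysis:

```python
# Toy illustration: an exponentially growing decoding cost quickly
# outruns any polynomially growing budget of computational steps.
# Both functions below are invented for this sketch.

def decoding_steps(n):
    """Assumed cost of decoding n radiation qubits: ~2^n steps."""
    return 2 ** n

def evaporation_budget(n):
    """Assumed steps any computer could take before a black hole
    emitting n quanta has evaporated: ~1000 * n^3 (illustrative)."""
    return 1000 * n ** 3

# First size at which the decoder can no longer finish before the
# black hole (and the forbidden information clone) is gone.
crossover = next(n for n in range(1, 100)
                 if decoding_steps(n) > evaporation_budget(n))
print(crossover)  # prints 24
```

Past that crossover the decoder always loses the race, and it loses it more decisively with every additional radiation particle; a realistic black hole emits far more quanta than any such threshold.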

“The black hole's interior is protected by an armour of computational complexity.”

Hayden was sceptical of the result at first. But then he and Harlow found much the same answer for many types of black hole [3]. “It did seem to be a robust principle,” says Hayden: “a conspiracy of nature preventing you from performing this decoding before the black hole had disappeared on you.”

The Harlow–Hayden argument made a big impression on Scott Aaronson, who works on computational complexity and the limits of quantum computation at the Massachusetts Institute of Technology in Cambridge. “I regard what they did as one of the more remarkable syntheses of physics and computer science that I've seen in my career,” he says.

It also resonated strongly among theoretical physicists. But not everyone is convinced. Even if the calculation is correct, says Polchinski, “it is hard to see how one would build a fundamental theory on this framework”. Nevertheless, some physicists are trying to do just that. There is a widespread belief in the field that the laws of nature must somehow be based on information. And the idea that the laws might actually be upheld by computational complexity — which is defined entirely in terms of information — offers a fresh perspective.

It certainly inspired Susskind to dig deeper into the role of complexity. For mathematical clarity, he chose to make his calculations in a theoretical realm known as anti-de Sitter space (AdS). This describes a cosmos that is like our own Universe in the sense that everything in it, including black holes, is governed by gravity. Unlike our Universe, however, it has a boundary — a domain where there is no gravity, just elementary particles and fields governed by quantum physics. Despite this difference, studying physics in AdS has led to many insights, because every object and physical process inside the space can be mathematically mapped to an equivalent object or process on its boundary. A black hole in AdS, for example, is equivalent to a hot gas of ordinary quantum particles on the boundary. Better still, calculations that are complicated in one domain often turn out to be simple in the other. And after the calculations are complete, the insights gained in AdS can generally be translated back into our own Universe.

Increasing complexity

Susskind decided to look at a black hole sitting at the centre of an AdS universe, and to use the boundary description to explore what happens inside a black hole's event horizon. Others had attempted this and failed, and Susskind could see why after he viewed the problem through the lens of computational complexity. Translating from the boundary of the AdS universe to the interior of a black hole requires an enormous number of computational steps, and that number increases exponentially as one moves closer to the event horizon [4]. As Aaronson puts it, “the black hole's interior is protected by an armour of computational complexity”.

Furthermore, Susskind noticed, the computational complexity tends to grow with time. This is not the increase of disorder, or entropy, that is familiar from everyday physics. Rather, it is a pure quantum effect arising from the way that interactions between the boundary particles cause an explosive growth in the complexity of their collective quantum state.

If nothing else, Susskind argued, this growth means that complexity behaves much like a gravitational field. Imagine an object floating somewhere outside the black hole. Because this is AdS, he said, the object can be described by some configuration of particles and fields on the boundary. And because the complexity of that boundary description tends to increase over time, the effect is to make the object move towards regions of higher complexity in the interior of the space. But that, said Susskind, is just another way of saying that the object will be pulled down towards the black hole. He captured that idea in a slogan [4]: “Things fall because there is a tendency toward complexity.”

Another implication of increasing complexity turns out to be closely related to an argument [5] that Susskind made last year in collaboration with Juan Maldacena, a physicist at the Institute for Advanced Study in Princeton, New Jersey, and the first researcher to propose that physics inside an AdS universe is equivalent to quantum physics on its boundary. According to general relativity, Susskind and Maldacena noted, two black holes can be many light years apart yet still have their interiors connected by a space-time tunnel known as a wormhole. But according to quantum theory, these widely separated black holes can also be connected by having their states 'entangled', meaning that information about their quantum states is shared between them in a way that is independent of distance.

After exploring the many similarities between these connections, Susskind and Maldacena concluded that they were two aspects of the same thing — that the black hole's degree of entanglement, a purely quantum phenomenon, will determine the wormhole's width, a matter of pure geometry.

With his latest work, Susskind says, it turns out that the growth of complexity on the boundary of AdS shows up as an increase in the wormhole's length. So putting it all together, it seems that entanglement is somehow related to space, and that computational complexity is somehow related to time.

Susskind is the first to admit that such ideas by themselves are only provocative suggestions; they do not make up a fully fledged theory. But he and his allies are confident that the ideas transcend the firewall paradox.

“I don't know where all of this will lead,” says Susskind. “But I believe these complexity–geometry connections are the tip of an iceberg.”

Journal name: Nature
Volume: 509
Pages: 552–553
DOI: 10.1038/509552a

Corrections

This article inadvertently underplayed the role of Daniel Harlow in bringing computational complexity to fundamental physics — he worked with Patrick Hayden from the start of their project. The text has been corrected to reflect this.

References

  1. Lowe, D. A., Polchinski, J., Susskind, L., Thorlacius, L. & Uglum, J. Phys. Rev. D 52, 6997 (1995).

  2. Almheiri, A., Marolf, D., Polchinski, J. & Sully, J. J. High Energy Phys. 2013, 62 (2013).

  3. Harlow, D. & Hayden, P. J. High Energy Phys. 2013, 85 (2013).

  4. Susskind, L. Preprint available at http://arxiv.org/abs/1402.5674 (2014).

  5. Maldacena, J. & Susskind, L. Fortschr. Phys. 61, 781–811 (2013).

Author information

Affiliations

  1. Amanda Gefter is a freelance writer based in Cambridge, Massachusetts.
