
If the truth be told, few physicists have ever really felt comfortable with quantum theory. Having lived with it now for more than a century, they have managed to forge a good working relationship; physicists now routinely use the mathematics of quantum behaviour to make stunningly accurate calculations about molecular structure, high-energy particle collisions, semiconductor behaviour, spectral emissions and much more.

But the interactions tend to be strictly formal. As soon as researchers try to get behind the mask and ask what the mathematics mean, they run straight into a seemingly impenetrable wall of paradoxes. Can something really be a particle and a wave at the same time? Is Schrödinger's cat really both alive and dead? Is it true that even the gentlest conceivable measurement can somehow have an effect on particles halfway across the Universe?

Many physicists respond to this inner weirdness by retreating into the 'Copenhagen interpretation' articulated by Niels Bohr, Werner Heisenberg and their colleagues as they were putting quantum theory into its modern form in the 1920s. The interpretation says that the weirdness reflects fundamental limits on what can be known about the world, and just has to be accepted as the way things are — or, as famously phrased by physicist David Mermin of Cornell University in Ithaca, New York, “shut up and calculate!”1

But there have always been some who are not content to shut up — who are determined to get behind the mask and fathom quantum theory's meaning. “What is it about this world that forces us to navigate it with the help of such an abstract entity?” wonders physicist Maximilian Schlosshauer of the University of Portland in Oregon, referring to the uncertainty principle; the wave function that describes the probability of finding a system in various states; and all the other mathematical paraphernalia found in textbooks on quantum theory.

Over the past decade or so, a small community of these questioners have begun to argue that the only way forward is to demolish the abstract entity and start again. They are a diverse bunch, each with a different idea of how such a 'quantum reconstruction' should proceed. But they share a conviction that physicists have spent the past century looking at quantum theory from the wrong angle, making its shadow odd, spiky and hard to decode. If they could only find the right perspective, they believe, all would become clear, and long-standing mysteries such as the quantum nature of gravity might resolve themselves in some natural, obvious way — perhaps as an aspect of some generalized theory of probability.

“The very best quantum-foundational effort,” says Christopher Fuchs of the Perimeter Institute for Theoretical Physics in Waterloo, Canada, “will be the one that can write a story — literally a story, all in plain words — so compelling and so masterful in its imagery that the mathematics of quantum mechanics in all its exact technical detail will fall out as a matter of course”.

A very reasonable proposal

One of the earliest attempts to tell such a story came in 2001, when Lucien Hardy, then at the University of Oxford, UK, proposed that quantum theory might be derived from a small set of “very reasonable” axioms about how probabilities can be measured in any system2, such as a coin tossed into the air.

Hardy began by noting that a classical system can be specified completely by measuring a certain number of 'pure' states, which he denoted N. For a coin toss, in which the result is always either heads or tails, N equals two. For the roll of a die, in which one of the cube's six faces must end up uppermost, N equals six.

Probability works differently in the quantum world, however. Measuring the spin of an electron, for example, can distinguish two pure states, which can be crudely pictured as a rotation clockwise or anticlockwise around, say, a vertical axis. But, unlike in the classical world, the electron's spin is a mixture of the two quantum states before a measurement is made, and that mixture varies along a continuum. Hardy accounted for that through a 'continuity axiom', which demands that pure states transform from one to another in a smooth way. This axiom turns out to imply that at least N² measurements are required to completely specify a system — a relationship that corresponds to the standard quantum picture.
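To make the counting concrete, here is a minimal sketch of the parameter bookkeeping behind that N² relationship (my illustration, not Hardy's own derivation): a classical system with N distinguishable states needs just N probabilities, whereas the quantum state of a comparable system is an N × N density matrix, whose Hermitian entries carry N² real parameters.

```python
# A minimal sketch of the parameter counting behind Hardy's N^2 rule.
# Assumption (a standard textbook fact, not taken from Hardy's paper): a
# quantum system with N distinguishable states is described by an N x N
# Hermitian density matrix.

def classical_parameters(N: int) -> int:
    """Probabilities needed to specify a classical N-state system."""
    return N

def quantum_parameters(N: int) -> int:
    """Real parameters in an N x N Hermitian density matrix."""
    diagonal = N                           # diagonal entries are real
    off_diagonal = 2 * (N * (N - 1) // 2)  # entries above the diagonal are complex
    return diagonal + off_diagonal         # = N**2

for N in (2, 6):  # the coin-like and die-like systems from the text
    print(N, classical_parameters(N), quantum_parameters(N))
# Prints: 2 2 4, then 6 6 36 -- the classical count is N, the quantum count N^2.
```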


But, in principle, said Hardy, the continuity axiom also allows for higher-order theories in which a complete definition of the system requires N³, N⁴ or more measurements3, resulting in subtle deviations from standard quantum behaviour that might be observable in the lab. He did not attempt to analyse such possibilities in any detail, however; his larger goal was to show how quantum physics might be reframed as a general theory of probability. Conceivably, he says, such a theory could have been derived by nineteenth-century mathematicians without any knowledge of the empirical motivations that led Max Planck and Albert Einstein to initiate quantum mechanics at the start of the twentieth century.

Fuchs, for one, found Hardy's paper electrifying. “It hit me over the head like a hammer and has shaped my thinking ever since,” he says, convincing him to pursue the probability approach wholeheartedly.

Fuchs was especially eager to reinterpret the troubling concept of entanglement: a situation in which the quantum states of two or more particles are interdependent, meaning that a measurement of one of them will instantaneously allow the measurer to determine the state of the other. For example, two photons emitted from an atomic nucleus in opposite directions might be entangled so that one is polarized horizontally and the other vertically. Before any measurement is made, the polarizations of the photons are correlated but not fixed. Once a measurement is made on one photon, however, the polarization of the other is instantaneously fixed as well — even if it is already light years away.

As Einstein and his co-workers pointed out in 1935, such an instantaneous action over arbitrarily large distances seems to violate the theory of relativity, which holds that nothing can travel faster than light. They argued that this paradox was proof that quantum theory was incomplete.

But the other pioneers stood fast. According to Erwin Schrödinger, who coined the term 'entanglement', this feature is the essential trait of quantum mechanics, “the one that enforces its entire departure from classical lines of thought”. Subsequent analysis has resolved the paradox by showing that measurements of an entangled system cannot actually be used to transmit information faster than light. And experiments on photons in the 1980s showed that entanglement really does work this way.
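A toy simulation makes the resolution easy to see. The sketch below simply hard-codes the same-basis statistics of the photon pair described above (an illustration of the logic, not a simulation of the full quantum state): the two outcomes are always opposite, yet each observer's own record is coin-flip random, so the correlation by itself carries no usable signal.

```python
import random

# Toy model of the entangled photon pair described above: one photon ends up
# polarized horizontally (H) and the other vertically (V), with the joint
# outcome random. This hard-codes the same-basis quantum statistics only.

def measure_pair():
    alice = random.choice(["H", "V"])    # Born rule: each result is 50/50
    bob = "V" if alice == "H" else "H"   # the partner photon is always opposite
    return alice, bob

trials = [measure_pair() for _ in range(100_000)]
opposite = sum(a != b for a, b in trials) / len(trials)
bob_h = sum(b == "H" for _, b in trials) / len(trials)
print(f"pairs with opposite polarizations: {opposite:.3f}")  # 1.000: perfect correlation
print(f"fraction of H at the far detector: {bob_h:.3f}")     # ~0.500: no signal on its own
```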

Still, this does seem an odd way for the Universe to behave. And this is what prompted Fuchs to call for a fresh approach to quantum foundations4. He rejected the idea, held by many in the field, that wave functions, entanglement and all the rest represent something real out in the world (see Nature 485, 157–158; 2012). Instead, extending a line of argument that dates back to the Copenhagen interpretation, he insisted that these mathematical constructs are just a way to quantify “observers' personal information, expectations, degrees of belief”5.

He is encouraged in this view by the work of his Perimeter Institute colleague Robert Spekkens, who carried out a thought experiment asking what physics would look like if nature somehow limited what any observer could know about a system by imposing a “knowledge balance principle”: no observer's information about the system, as measured in bits, can ever exceed the amount of information he or she lacks. Spekkens' calculations show that this principle, arbitrary as it seems, is sufficient to reproduce many of the characteristics of quantum theory, including entanglement6. Other kinds of restriction on what can be known about a suite of states have also been shown to produce quantum-like behaviours7,8.
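The simplest system in Spekkens' toy theory shows how the principle bites: pinning down the system's true state takes two bits of information (it is one of four possibilities), but the knowledge balance principle caps any observer at one bit. A short sketch (my rendering of that counting, not code from Spekkens' paper) enumerates the resulting states of maximal knowledge:

```python
from itertools import combinations

# The elementary system in Spekkens' toy model has 4 underlying (ontic)
# states -- two bits of information. The knowledge balance principle lets an
# observer know at most one bit, so a state of maximal knowledge narrows the
# possibilities down to 2 of the 4 ontic states.

ontic_states = [1, 2, 3, 4]
max_knowledge = list(combinations(ontic_states, 2))  # every one-bit state of knowledge

print(len(max_knowledge), max_knowledge)
# Prints 6 such states -- the same count as the six 'pure' spin states of a
# qubit (up/down along x, y and z), one of the quantum-like features that
# the restriction reproduces.
```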

Knowledge gap

The lesson, says Fuchs, isn't that Spekkens' model is realistic — it was never meant to be — but that entanglement and all the other strange phenomena of quantum theory are not a completely new form of physics. They could just as easily arise from a theory of knowledge and its limits.

To get a better sense of how, Fuchs has rewritten standard quantum theory into a form that closely resembles a branch of classical probability theory known as Bayesian inference, which has its roots in the eighteenth century. In the Bayesian view, probabilities aren't intrinsic quantities 'attached' to objects. Rather, they quantify an observer's personal degree of belief about what might happen to the object. Fuchs' quantum Bayesian view, or QBism (pronounced 'cubism')9,10, is a framework that allows known quantum phenomena to be recovered from new axioms that do not require mathematical constructs such as wave functions. QBism is already motivating experimental proposals, he says. Such experiments might reveal, for example, new, deep structures within quantum mechanics that would allow quantum probability laws to be re-expressed as minor variations of standard probability theory11.
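For readers who have not met it, the classical Bayesian update that QBism builds on fits in a few lines. The numbers below are arbitrary, chosen purely for illustration: a prior degree of belief in a hypothesis is revised into a posterior one when new data arrive.

```python
# Bayes' rule, P(H|D) = P(D|H) * P(H) / P(D), treating probability as a
# degree of belief that gets updated by evidence. All numbers are arbitrary
# illustrative choices, not drawn from any experiment.

prior = 0.5            # initial degree of belief in hypothesis H
likelihood = 0.9       # P(data | H)
likelihood_alt = 0.2   # P(data | not H)

evidence = likelihood * prior + likelihood_alt * (1 - prior)  # P(data)
posterior = likelihood * prior / evidence                     # P(H | data)
print(f"belief revised from {prior:.2f} to {posterior:.2f}")  # 0.50 -> 0.82
```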

“That new view, if it proves valid, could change our understanding of how to build quantum computers and other quantum-information kits,” he says, noting that all such applications are critically dependent on the behaviour of quantum probability.

Knowledge — which is typically measured in terms of how many bits of information an observer has about a system — is the focus of many other approaches to reconstruction, too. As physicists Časlav Brukner and Anton Zeilinger of the University of Vienna put it, “quantum physics is an elementary theory of information”12. Meanwhile, physicist Marcin Pawłowski at the University of Gdańsk in Poland and his colleagues are exploring a principle they call 'information causality'13. This postulate says that if one experimenter (call her Alice) sends m bits of information about her data to another observer (Bob), then Bob can gain no more than m classical bits of information about that data — no matter how much he may know about Alice's experiment.

Pawłowski and his colleagues have found that this postulate is respected by classical physics and by standard quantum mechanics, but not by alternative theories that allow for stronger forms of entanglement-like correlations between information-carrying particles. For that reason, the group writes in their paper, “information causality might be one of the foundational properties of nature” — in other words, an axiom of some future, reconstructed quantum theory.
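One standard yardstick for such 'stronger forms of entanglement-like correlations' is the CHSH correlation value S, and a short sketch can show where the boundary falls (my illustration, using textbook values rather than anything specific to Pawłowski's paper): classical strategies reach at most S = 2, quantum mechanics at most 2√2 (Tsirelson's bound), while a hypothetical super-quantum 'PR box' reaches 4. It is correlations beyond the quantum bound that turn out to violate information causality.

```python
import math

# CHSH combination S = E(0,0) + E(0,1) + E(1,0) - E(1,1) of the correlators
# E(x, y) between two parties' outcomes for measurement settings x, y in {0, 1}.

def chsh(E):
    return E(0, 0) + E(0, 1) + E(1, 0) - E(1, 1)

# Classical: outcomes fixed in advance (here, both parties always output +1).
classical = chsh(lambda x, y: 1.0)

# Quantum: correlators cos(a_x - b_y) at the standard optimal measurement angles.
a = [0.0, math.pi / 2]
b = [math.pi / 4, -math.pi / 4]
quantum = chsh(lambda x, y: math.cos(a[x] - b[y]))

# PR box: perfectly correlated except when both settings are 1.
pr_box = chsh(lambda x, y: -1.0 if x == 1 and y == 1 else 1.0)

print(classical, quantum, pr_box)  # 2.0, ~2.828 (= 2*sqrt(2)), 4.0
```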

What is striking about several of these attempts at quantum reconstruction is that they suggest that the set of laws governing our Universe is just one of many mathematical possibilities. “It turns out that many principles lead to a whole class of probabilistic theories, and not specifically quantum theory,” says Schlosshauer. This is in itself a valuable insight. “A lot of the features we think of as uniquely quantum,” he says, “are actually generic to many probabilistic theories. This allows us to focus on the question of what makes quantum theory unique.”

Poised for success?

Hardy says that the pace of quantum-reconstruction efforts has really picked up during the past few years as investigators begin to sense they are getting some good handles on the issue. “We're now poised for some really significant breakthroughs,” he says.

But how can anyone judge the success of these efforts? Hardy notes that some investigators are looking for experimental signs of the higher-order quantum correlations allowed in his theory. “However, I would say that the real criterion for success is more theoretical,” he says. “Do we have a better understanding of quantum theory, and do the axioms give us new ideas as to how to go beyond current-day physics?” He is hopeful that some of these principles might eventually assist in the development of a theory of quantum gravity.

There is plenty of room for scepticism. “Reconstructing quantum theory from a set of basic principles seems like an idea with the odds greatly against it,” says Daniel Greenberger, a physicist who works on quantum foundations at the City College of New York5. Yet Schlosshauer argues that “even if no single reconstruction program can actually find a universally accepted set of principles that works, it's not a wasted effort, because we will have learned so much along the way”.

He is cautiously optimistic. “Once we have a set of simple and physically intuitive principles, and a convincing story to go with them, quantum mechanics will look a lot less mysterious”, he says. “I think a lot of the outstanding questions will then go away. I'm probably not the only one who would love to be around to witness the discovery of these principles.”