A new way to manipulate quantum states resolves a longstanding conundrum about who knows what, and when and how, in the quantum world. The result is, as one has come to expect, startling and counterintuitive.
Claude Shannon's landmark 1948 theory of communication^{1} tackles a nuts-and-bolts question: how do we find the best way to communicate using a given resource, such as a telegraph line or a satellite antenna? To answer that question, Shannon first took a detour into more philosophical territory by working out how to quantify the elusive concepts ‘uncertainty’ and ‘information’. More than half a century on, quantum-information theorists have in many ways taken the opposite approach. Inspired by Shannon, but working with the notoriously counterintuitive theory of quantum mechanics, they seek to understand uncertainty and information in the quantum world by analysing the practical questions first, in the hope that the answers might then illuminate more fundamental conceptual issues.
On page 673 of this issue^{2}, Horodecki, Oppenheim and Winter demonstrate how effective this approach can be by justifying, in operational terms, a definition of conditional uncertainty that had previously been widely rejected owing to its strange and apparently nonsensical properties. In their formulation, what had been pathological becomes profound: with quantum information, it is possible not just to be certain, but to be more than certain.
To understand what such a statement could mean requires first absorbing how information theorists think about uncertainty. Most readers will be able to decipher the following English sentence:
D r _ p / e _ e r _ / t h _ r d / _ e t _ e r.
As the omitted letters can be inferred with near-certainty from the others, they don't contribute to the uncertainty about the sentence and can therefore be compressed away. In general, the ‘uncertainty’ of a data source is the amount of space in bits required to transmit its output reliably.
Now consider another sentence, this time with more than three out of every four letters deleted:
T_ _ _ / _ _ / _a_ _ _r / _ _ / _ _ a _.
It is no longer possible to decipher the sentence uniquely because there are many grammatically correct options. If some extra letters are provided, however, the task becomes feasible:
T_ _ s / i _ / _ a _ d _ r / t_ / r _ a d.
The gap between what was provided at first (four letters) and what allowed us to decipher the sentence (no more than ten letters) illustrates the notion of ‘conditional uncertainty’ — the amount of extra information required to decipher a message.
One of Shannon's seminal results^{1} was to find a simple formula for the uncertainty of a data source X. This function, usually written H(X), is known as the Shannon entropy of X. Conditional uncertainty can be represented in similarly simple terms. If Y is used to represent the information already given to the receiver — the analogue of the indecipherable four letters in our example — the amount of extra information that must be provided is H(X,Y)–H(Y), a quantity known as the conditional entropy of X given Y (ref. 3).
This second formula is easy to interpret: the extra information required is equal to the uncertainty in the total message, consisting of both X and Y, minus the uncertainty owing to Y alone, which should be subtracted, as Y is already known.
Among its many intuitive features, the conditional-entropy function is always greater than or equal to zero. That's because there is potentially more to be ignorant of in two messages X and Y together than in Y alone, so the inequality H(X,Y)⩾H(Y) holds. For example, X and Y could represent future issues of the Financial Times and The Wall Street Journal, respectively: readers who take the time to follow both newspapers will be intimately familiar with the practical meaning of the inequality! In the context of conditional uncertainty, the interpretation is again highly intuitive: the amount of extra information required to decipher a message cannot be less than zero bits or, equivalently, it is impossible to be more than certain about the outcome of an event.
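These classical formulas are easy to check numerically. The following sketch (the joint distribution over X and Y is an invented toy example, not drawn from the paper) computes H(X,Y), H(Y) and the conditional entropy H(X,Y)−H(Y), which is guaranteed to be non-negative for any classical distribution:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum p log2 p, ignoring zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Toy joint distribution over pairs (x, y); X and Y are correlated,
# so knowing Y removes part of the uncertainty about X.
joint = {('a', 0): 0.4, ('b', 0): 0.1, ('a', 1): 0.1, ('b', 1): 0.4}

# Uncertainty of the total message (X, Y).
h_xy = shannon_entropy(joint.values())

# Marginal distribution of Y, obtained by summing over X.
p_y = {}
for (x, y), p in joint.items():
    p_y[y] = p_y.get(y, 0.0) + p
h_y = shannon_entropy(p_y.values())

# Conditional entropy H(X|Y) = H(X,Y) - H(Y):
# the extra bits needed to pin down X once Y is already known.
h_x_given_y = h_xy - h_y
```

For this distribution H(X,Y) ≈ 1.72 bits and H(Y) = 1 bit, leaving about 0.72 bits of residual uncertainty about X — a positive number, as it must be classically.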
Intuitive indeed, but alas no longer true in quantum information theory. Horodecki, Oppenheim and Winter analyse^{2} a quantum-mechanical version of the message-completion problem and find that the amount of extra information required can sometimes be less than zero qubits (a qubit is simply the quantum version of a bit). In their version of the problem, there are three participants: call them the sender, the receiver and the referee. The referee prepares a quantity of quantum information consisting of many particles, some of which he distributes to the sender and the receiver, and the rest he keeps for himself. The sender's job is to find an encoding that allows her to transfer her share of the information to the receiver using as few qubits as possible. To further isolate the quantum-mechanical features of the problem, the sender is also allowed to send old-fashioned messages consisting of bits at zero cost.
The authors show^{2} that the number of qubits that the sender needs to transmit is precisely S(A,B)–S(B), where A now refers to the sender's particles, B to the receiver's particles and S is the von Neumann entropy, a direct quantum-mechanical generalization of Shannon's entropy. This formula is identical in form to the solution of the non-quantum version. With quantum particles, however, it is possible for A and B to be correlated in ways that are impossible in the classical situation^{4}. In such cases, the systems A and B are said to be entangled (for a popular account of this phenomenon, see ref. 5). One consequence of entanglement is that conditional uncertainty, S(A,B)–S(B), can sometimes be less than zero. In other words, the receiver can be more than certain!
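The maximally entangled Bell state is the standard illustration of this effect (the example is textbook material, not taken from the paper itself). For a Bell pair shared between A and B, the joint state is pure, so S(A,B) = 0, while B's half on its own is maximally mixed, so S(B) = 1 qubit — giving S(A,B)−S(B) = −1:

```python
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy in qubits: S(rho) = -Tr(rho log2 rho)."""
    eigvals = np.linalg.eigvalsh(rho)
    return float(-sum(p * np.log2(p) for p in eigvals if p > 1e-12))

# Bell state |phi+> = (|00> + |11>)/sqrt(2), shared between A (sender) and B (receiver).
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho_ab = np.outer(phi, phi)        # joint density matrix; pure, so S(A,B) = 0

# Partial trace over A: reshape to indices [a, b, a', b'] and sum the diagonal in a.
rho_b = rho_ab.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

s_ab = von_neumann_entropy(rho_ab)  # 0: no uncertainty about the joint state
s_b = von_neumann_entropy(rho_b)    # 1: B alone is maximally mixed
conditional = s_ab - s_b            # -1: negative conditional entropy
```

The negative value is exactly the "more than certain" situation described above: the receiver already knows the joint state perfectly, yet his own share looks completely random until entanglement is taken into account.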
In practice, if the receiver is more than certain, the sender doesn't need to transmit any qubits at all for the receiver to be able to decipher the message. So the receiver can put some certainty in the bank for a rainy day, in the form of extra entanglement with the sender that could be used to reduce the receiver's uncertainty about future messages. Entanglement is such a strong form of correlation that it can actually be used to send qubits from the sender to the receiver using a procedure known as quantum teleportation^{6}. On the accounting ledger, therefore, having stored entanglement is almost as good as being able to communicate.
This neat and satisfyingly bizarre resolution disposes of a longstanding puzzle in quantum information theory: put simply, how to quantify who knows what. In more technical language, the puzzle was how to quantify conditional uncertainty. The formula S(A,B)–S(B) had been proposed^{7}, but was widely rejected because of its pathological tendency to become negative. Until now, no one had succeeded in finding a setting in which the formula's full range of positive and negative values would have a meaningful interpretation. (It was a quantuminformation theorist's version of the famous conundrum from The Hitchhiker's Guide to the Galaxy: if 42 is the answer to Life, the Universe and Everything, what is the question?)
In addition to finally placing the quantification of uncertainty in quantum mechanics on a solid footing, the new result^{2} opens the door to solving many previously intractable problems in quantum information theory. The authors provide a sampling of these applications in their paper, including an easy solution to a quantum version of the problem of many cellphones trying to communicate simultaneously to a single base station^{8}. This astonishing solution shows that one sender's quantum information can, despite its fragility, be used to help decode the other senders' transmissions at higher rates than would otherwise be possible. Once again, quantum information has proved to be more versatile and more surprising than anyone expected.
References
1. Shannon, C. E. Bell Syst. Tech. J. 27, 379–423, 623–656 (1948).
2. Horodecki, M., Oppenheim, J. & Winter, A. Nature 436, 673–676 (2005).
3. Slepian, D. & Wolf, J. K. IEEE Trans. Inform. Theory 19, 471–480 (1973).
4. Bell, J. S. Physics 1, 195–200 (1964); reprinted in Bell, J. S. Speakable and Unspeakable in Quantum Mechanics (Cambridge Univ. Press, 1987).
5. Aczel, A. D. Entanglement: The Greatest Mystery in Physics (Wiley, London, 2002).
6. Bennett, C. H. et al. Phys. Rev. Lett. 70, 1895–1899 (1993).
7. Cerf, N. J. & Adami, C. Phys. Rev. Lett. 79, 5194–5197 (1997).
8. Yard, J., Devetak, I. & Hayden, P. Preprint at http://arxiv.org/abs/quant-ph/0501045 (2005).
Hayden, P. Putting certainty in the bank. Nature 436, 633–634 (2005). https://doi.org/10.1038/436633a