Quantum Information

Putting certainty in the bank

A new way to manipulate quantum states resolves a long-standing conundrum about who knows what, and when and how, in the quantum world. The result is, as one has come to expect, startling and counterintuitive.

Claude Shannon's landmark 1948 theory of communication (ref. 1) tackles a nuts-and-bolts question: how do we find the best way to communicate using a given resource, such as a telegraph line or a satellite antenna? To answer that question, Shannon first took a detour into more philosophical territory by working out how to quantify the elusive concepts ‘uncertainty’ and ‘information’. More than half a century on, quantum-information theorists have in many ways taken the opposite approach. Inspired by Shannon, but working with the notoriously counterintuitive theory of quantum mechanics, they seek to understand uncertainty and information in the quantum world by analysing the practical questions first, in the hope that the answers might then illuminate more fundamental conceptual issues.

On page 673 of this issue (ref. 2), Horodecki, Oppenheim and Winter demonstrate how effective this approach can be by justifying, in operational terms, a definition of conditional uncertainty that had previously been widely rejected owing to its strange and apparently nonsensical properties. In their formulation, what had been pathological becomes profound: with quantum information, it is possible not just to be certain, but to be more than certain.

To understand what such a statement could mean requires first absorbing how information theorists think about uncertainty. Most readers will be able to decipher the following English sentence:

D r _ p / e _ e r _ / t h _ r d / _ e t _ e r.

As the omitted letters can be inferred with near-certainty from the others, they don't contribute to the uncertainty about the sentence and can therefore be compressed away. In general, the ‘uncertainty’ of a data source is the amount of space in bits required to transmit its output reliably.

Now consider another sentence, this time with more than three out of every four letters deleted:

T _ _ _ / _ _ / _ a _ _ _ r / _ _ / _ _ a _.

It is no longer possible to decipher the sentence uniquely because there are many grammatically correct options. If some extra letters are provided, however, the task becomes feasible:

T _ _ s / i _ / _ a _ d _ r / t _ / r _ a d.

The gap between what was provided at first (four letters) and what finally allowed us to decipher the sentence (no more than ten letters) illustrates the notion of ‘conditional uncertainty’: the amount of extra information required to decipher a message.

One of Shannon's seminal results (ref. 1) was to find a simple formula for the uncertainty of a data source X. This function, usually written H(X), is known as the Shannon entropy of X. Conditional uncertainty can be represented in similarly simple terms. If Y is used to represent the information already given to the receiver — the analogue of the indecipherable four letters in our example — the amount of extra information that must be provided is H(X,Y) − H(Y), a quantity known as the conditional entropy of X given Y (ref. 3).
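To make these formulas concrete, the Shannon entropy and the conditional entropy can be computed directly for a small joint distribution. The distribution below is purely illustrative (it is not from the articles under discussion): it is chosen so that Y almost, but not quite, determines X, much as the surrounding letters almost determined the missing ones.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum p * log2(p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Toy joint distribution p(x, y), chosen so that knowing Y leaves
# only a little residual uncertainty about X.
joint = {("a", 0): 0.45, ("b", 0): 0.05,
         ("a", 1): 0.05, ("b", 1): 0.45}

H_XY = shannon_entropy(joint.values())   # uncertainty of the whole message

# Marginal distribution of Y: sum p(x, y) over x.
p_y = {}
for (x, y), p in joint.items():
    p_y[y] = p_y.get(y, 0.0) + p
H_Y = shannon_entropy(p_y.values())      # uncertainty owing to Y alone

# Conditional entropy H(X,Y) - H(Y): the extra bits needed to learn X
# once Y is already known.
H_X_given_Y = H_XY - H_Y
print(round(H_XY, 3), round(H_Y, 3), round(H_X_given_Y, 3))
```

Here H(X,Y) ≈ 1.469 bits and H(Y) = 1 bit, so only about 0.469 extra bits are needed to pin down X — classically, always a non-negative amount.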

This second formula is easy to interpret: the extra information required is equal to the uncertainty in the total message, consisting of both X and Y, minus the uncertainty owing to Y alone, which is subtracted because Y is already known.

Among its many intuitive features, the conditional-entropy function is always greater than or equal to zero. That's because there is potentially more to be ignorant of in two messages X and Y together than in Y alone, so the inequality H(X,Y) ≥ H(Y) holds. For example, X and Y could represent future issues of the Financial Times and The Wall Street Journal, respectively: readers who take the time to follow both newspapers will be intimately familiar with the practical meaning of the inequality! In the context of conditional uncertainty, the interpretation is again highly intuitive: the amount of extra information required to decipher a message cannot be less than zero bits or, equivalently, it is impossible to be more than certain about the outcome of an event.

Intuitive indeed, but alas no longer true in quantum information theory. Horodecki, Oppenheim and Winter analyse (ref. 2) a quantum-mechanical version of the message-completion problem and find that the amount of extra information required can sometimes be less than zero qubits (a qubit is simply the quantum version of a bit). In their version of the problem, there are three participants: call them the sender, the receiver and the referee. The referee prepares a quantity of quantum information consisting of many particles, some of which he distributes to the sender and the receiver, and the rest he keeps for himself. The sender's job is to find an encoding that allows her to transfer her share of the information to the receiver using as few qubits as possible. To further isolate the quantum-mechanical features of the problem, the sender is also allowed to send old-fashioned messages consisting of bits at zero cost.

The authors show (ref. 2) that the number of qubits that the sender needs to transmit is precisely S(A,B) − S(B), where A now refers to the sender's particles, B to the receiver's particles and S is the von Neumann entropy, a direct quantum-mechanical generalization of Shannon's entropy. This formula is identical in form to the solution of the non-quantum version. With quantum particles, however, it is possible for A and B to be correlated in ways that are impossible in the classical situation (ref. 4). In such cases, the systems A and B are said to be entangled (for a popular account of this phenomenon, see ref. 5). One consequence of entanglement is that the conditional uncertainty, S(A,B) − S(B), can sometimes be less than zero. In other words, the receiver can be more than certain!
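The sign flip is easy to check numerically for the simplest entangled state. For a maximally entangled Bell pair, the joint state is pure, so S(A,B) = 0, while the receiver's share on its own is maximally mixed, so S(B) = 1 and the conditional entropy S(A,B) − S(B) equals −1. A minimal NumPy check (an illustration, not code from the paper):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues, in qubits."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]                      # 0 * log 0 = 0 by convention
    return float(-np.sum(w * np.log2(w)))

# Maximally entangled Bell state |Phi+> = (|00> + |11>)/sqrt(2),
# shared between the sender (A) and the receiver (B).
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_AB = np.outer(phi, phi)               # pure joint state of A and B

# Partial trace over A: reshape to indices (a, b, a', b'), trace a with a'.
rho_B = rho_AB.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

S_AB = von_neumann_entropy(rho_AB)        # 0: the joint state is pure
S_B = von_neumann_entropy(rho_B)          # 1: B alone is maximally mixed
print(round(S_AB - S_B, 6))               # conditional entropy is -1
```

The receiver, despite holding half of a perfectly known joint state, is maximally uncertain about his own particle — exactly the situation in which the classical formula breaks down.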

In practice, if the receiver is more than certain, the sender doesn't need to transmit any qubits at all for the receiver to be able to decipher the message. So the receiver can put some certainty in the bank for a rainy day, in the form of extra entanglement with the sender that could be used to reduce the receiver's uncertainty about future messages. Entanglement is such a strong form of correlation that it can actually be used to send qubits from the sender to the receiver using a procedure known as quantum teleportation (ref. 6). On the accounting ledger, therefore, having stored entanglement is almost as good as being able to communicate.
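Teleportation itself can be sketched in a few lines. Using one shared Bell pair and a two-bit classical message, the receiver recovers the sender's qubit exactly, whichever of the four Bell-measurement outcomes occurs. The simulation below is a standard textbook rendering of the protocol of ref. 6 (not code from either paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# A random qubit state |psi> for the sender to teleport.
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)

# Shared Bell pair (|00> + |11>)/sqrt(2): one qubit held by the sender,
# the other by the receiver.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
state = np.kron(psi, bell).reshape(4, 2)  # rows: qubits 0,1; column: qubit 2

# Bell measurement basis on qubits 0 and 1 (Phi+, Phi-, Psi+, Psi-) ...
B = np.array([[1, 0, 0, 1],
              [1, 0, 0, -1],
              [0, 1, 1, 0],
              [0, 1, -1, 0]]) / np.sqrt(2)

# ... and the Pauli correction the receiver applies for each outcome,
# chosen according to the sender's two classical bits.
I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])
corrections = [I2, Z, X, Z @ X]

fidelities = []
for k in range(4):
    leftover = B[k].conj() @ state        # receiver's qubit, unnormalized
    out = corrections[k] @ leftover
    out /= np.linalg.norm(out)
    fidelities.append(abs(np.vdot(psi, out)))  # 1 means perfect recovery

print([round(f, 6) for f in fidelities])  # 1.0 for all four outcomes
```

The two classical bits cost nothing in this accounting, so a banked Bell pair really is worth one qubit of future communication.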

This neat and satisfyingly bizarre resolution disposes of a long-standing puzzle in quantum information theory: put simply, how to quantify who knows what. In more technical language, the puzzle was how to quantify conditional uncertainty. The formula S(A,B) − S(B) had been proposed (ref. 7), but was widely rejected because of its pathological tendency to become negative. Until now, no one had succeeded in finding a setting in which the formula's full range of positive and negative values would have a meaningful interpretation. (It was a quantum-information theorist's version of the famous conundrum from The Hitchhiker's Guide to the Galaxy: if 42 is the answer to Life, the Universe and Everything, what is the question?)

In addition to finally placing the quantification of uncertainty in quantum mechanics on a solid footing, the new result (ref. 2) opens the door to solving many previously intractable problems in quantum information theory. The authors provide a sampling of these applications in their paper, including an easy solution to a quantum version of the problem of many cellphones trying to communicate simultaneously to a single base station (ref. 8). This astonishing solution shows that one sender's quantum information can, despite its fragility, be used to help decode the other senders' transmissions at higher rates than would otherwise be possible. Once again, quantum information has proved to be more versatile and more surprising than anyone expected.


References

1. Shannon, C. E. Bell Syst. Tech. J. 27, 379–423, 623–656 (1948).
2. Horodecki, M., Oppenheim, J. & Winter, A. Nature 436, 673–676 (2005).
3. Slepian, D. & Wolf, J. K. IEEE Trans. Inform. Theory 19, 461–480 (1971).
4. Bell, J. S. Physics 1, 195–200 (1964); reprinted in Bell, J. S. Speakable and Unspeakable in Quantum Mechanics (Cambridge Univ. Press, 1987).
5. Aczel, A. D. Entanglement: The Greatest Mystery in Physics (Wiley, London, 2002).
6. Bennett, C. H. et al. Phys. Rev. Lett. 70, 1895–1899 (1993).
7. Cerf, N. J. & Adami, C. Phys. Rev. Lett. 79, 5194–5197 (1997).
8. Yard, J., Devetak, I. & Hayden, P. preprint at http://arxiv.org/abs/quant-ph/0501045 (2005).

Hayden, P. Putting certainty in the bank. Nature 436, 633–634 (2005). https://doi.org/10.1038/436633a