Quantum systems are uncertain by nature. By 'squeezing' this uncertainty, physicists can make better measurements of quantities such as distance. But overdoing it makes things burst out all over the place.
At the leading edge of experimental science, the latest measurement techniques are promising to provide breakthroughs in our understanding of the Universe. The ever-improving ability to sense small displacements, for example, is at the heart of projects such as the Laser Interferometer Gravitational Wave Observatory (LIGO)1, which seeks to observe the faint space-time ripples of distant supernovae. When technical noise is strongly suppressed, the ultimate limit to the precision of any measurement is set by the quantum uncertainty in the measuring system. But even this quantum uncertainty can be reduced — a technique known as 'squeezing'. On page 67 of this issue, Shalm et al.2 show that squeezing down this quantum uncertainty is not as simple as might be expected — too much squeezing actually worsens measurement precision. Fortunately, they also show that it is possible to recover the best precision allowed by the laws of physics by looking at the 'over-squeezed' system in a different way.
That a fundamental limit to measurement precision exists at all is a purely quantum phenomenon. Consider light, the basis of a suite of sensitive interferometric measurement techniques. In the classical picture, light is a wave whose amplitude and phase — where the wave's peaks and troughs lie — can be specified with infinitesimal precision. But in reality, light has much more character. It is made up of indivisible photons that exhibit probabilistic behaviour when forced to decide which quantum state, out of a range of options presented, to be in.
Shine a hypothetical classical beam on a beam splitter that reflects 50% and transmits 50% of the light, and you know that exactly half goes each way. Put a grainy beam — composed of N photons, say — on that same beam splitter and it is exactly like tossing a coin N times. You know that on average each outcome should occur N/2 times, but there is some uncertainty because, in a given trial, you might get more heads than tails, for instance. That spread grows as the square root of N, and it is this graininess that ultimately limits the precision of an intensity measurement.
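The coin-toss analogy is easy to check numerically. The sketch below is a toy model of an ideal 50/50 beam splitter (the photon number and trial count are arbitrary choices, not taken from any experiment): each photon independently reflects or transmits, and the spread in the reflected count behaves like coin-toss statistics.

```python
import random

random.seed(1)
N = 100          # photons in the beam
trials = 5000    # repeated experiments

# Each photon independently reflects or transmits with probability 1/2,
# exactly like tossing a fair coin N times.
reflected = [sum(random.random() < 0.5 for _ in range(N)) for _ in range(trials)]

mean = sum(reflected) / trials
var = sum((r - mean) ** 2 for r in reflected) / trials
print(mean)         # close to N/2 = 50
print(var ** 0.5)   # close to sqrt(N)/2 = 5: the photon 'shot noise'
```

The residual spread of roughly √N/2 photons is the shot noise that sets the quantum limit on such an intensity measurement.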
A picture may help to understand squeezing. Imagine a stick lying in a plane (Fig. 1a) such that its length represents the amplitude of the light and its angle represents the phase. The probabilistic nature of photons smears out the end of the stick, leading to uncertainty in the measurement of phase and amplitude. However, quantum theory tells us we can squeeze this uncertainty 'blob' in one direction and it will expand in another direction, just like a balloon filled with water. So, squeezing the uncertainty in amplitude leads to an increased phase uncertainty (Fig. 1b) — but that's all right if it is just the amplitude that we are trying to measure precisely.
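The water-balloon trade-off can also be sketched numerically. In this toy model (the stick length, blob size and squeeze factor are arbitrary choices, and the blob is idealized as simple Gaussian noise), shrinking the blob along the stick by a factor of two stretches it across the stick by the same factor:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

A = 10.0        # length of the 'stick' (mean amplitude)
sigma = 0.1     # radius of the round, unsqueezed uncertainty blob

# noise along the stick (amplitude direction) and across it (phase direction)
along = rng.standard_normal(n) * sigma
across = rng.standard_normal(n) * sigma

s = 0.5                                    # squeeze one axis of the blob...
field = A + s * along + 1j * (across / s)  # ...and the other grows by 1/s

print(np.std(np.abs(field)))    # ~0.05: amplitude spread halved (was 0.1)
print(np.std(np.angle(field)))  # ~0.02: phase spread doubled (was 0.01)
```

The product of the two spreads is unchanged: the blob keeps its area, just as the balloon keeps its volume.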
Squeezing itself3 has been demonstrated many times in a regime in which squeezing harder just makes things better and better. But it has been predicted that, in systems with limited dimensions, squeezing harder can make things worse4. Shalm et al.2 have been able to investigate this fascinating situation experimentally by confining themselves to a quantum system composed of just three photons, which they mash together into the same region of time, space and frequency spectrum.
The quantity that Shalm and colleagues measure is not amplitude or phase, but rather polarization — a property that describes whether the electric-field vector of the light oscillates vertically, horizontally, in a circle, or any combination of the above. These three-photon polarization states are not represented by a stick in a plane, but rather by a stick going from the centre of a sphere to the surface, and the quantum uncertainty is represented by a blob on that surface (Fig. 2a, overleaf). It is now apparent what happens when squeezing occurs: although the uncertainty is reduced in one direction, the blob starts to wrap around the sphere in the other (Fig. 2b). Eventually it goes all the way around and meets up — although the three-fold symmetry leads to a rather more interesting pattern than one might initially expect (Fig. 2c). This over-squeezing is problematic, however. Rather than reducing the uncertainty in the specified direction, the over-squeezing has spread the initial uncertainty into three blobs equally spaced around a ring, meaning that it is now even more uncertain where the end of the stick is — that is to say, what the light's polarization is.
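A minimal numerical sketch of this behaviour can be built from a standard model of spin squeezing known as one-axis twisting (this is a generic squeezing model, not necessarily the exact operation used by Shalm and colleagues). Three photons behave like a single spin of size 3/2; twisting first narrows the uncertainty blob, but twisting harder wraps it around the sphere, shrinking the mean spin and ruining the naive phase estimate:

```python
import numpy as np

# Three indistinguishable photons act like a single spin of size jtot = 3/2.
jtot = 1.5
m = np.arange(jtot, -jtot - 1, -1)        # m = 3/2, 1/2, -1/2, -3/2
Jz = np.diag(m)
Jp = np.diag(np.sqrt(jtot * (jtot + 1) - m[1:] * (m[1:] + 1)), 1)  # raising op
Jx = (Jp + Jp.conj().T) / 2
Jy = (Jp - Jp.conj().T) / (2 * 1j)

# the 'stick' pointing along x: coherent spin state (top eigenvector of Jx)
psi0 = np.linalg.eigh(Jx)[1][:, -1]

def phase_uncertainty(mu):
    """Phase error after squeezing by exp(-i*mu*Jz^2) (one-axis twisting):
    the smallest spin spread perpendicular to x, divided by the mean spin
    length, which sets how sharply a rotation can be resolved."""
    psi = np.exp(-1j * mu * m ** 2) * psi0     # the twisting is diagonal in Jz
    ev = lambda op: np.real(psi.conj() @ op @ psi)
    a = ev(Jy @ Jy) - ev(Jy) ** 2              # Var(Jy)
    b = ev(Jz @ Jz) - ev(Jz) ** 2              # Var(Jz)
    c = ev(Jy @ Jz + Jz @ Jy) / 2 - ev(Jy) * ev(Jz)
    var_min = (a + b) / 2 - np.hypot((a - b) / 2, c)
    return np.sqrt(var_min) / ev(Jx)

mus = np.linspace(0.0, 1.2, 400)
dphi = np.array([phase_uncertainty(mu) for mu in mus])
print(phase_uncertainty(0.0))   # ~0.577 = 1/sqrt(3): the unsqueezed value
print(dphi.min())               # ~0.44: moderate squeezing improves precision
print(phase_uncertainty(1.2))   # ~3.9: over-squeezing is far worse
```

The phase uncertainty dips below the unsqueezed value of 1/√3 for moderate twisting, then blows up as the blob wraps around the sphere and the mean spin collapses — mirroring the over-squeezing effect described above.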
Does this mean that there is an optimal amount of squeezing — not too little and not too much? If one is limited to detecting all the photons in one bunch (an intensity measurement), the answer is 'yes'. But when the photons were counted one by one, Shalm et al. were able to reveal the correlations between them that give rise to the three fringes around the equator. As this maximally squeezed state is identical to the highly entangled N00N state proposed for quantum metrology5, there exist known adaptive-measurement algorithms to extract the complete information in an efficient way6.
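A short calculation shows where the three fringes come from, idealizing the maximally squeezed state as the three-photon N00N state mentioned above: rotating the state about the z axis by an angle phi and projecting back onto it gives a signal that oscillates three times per full turn, three times faster than a single photon would allow.

```python
import numpy as np

# N00N-like state of three photons: an equal superposition of 'all up' and
# 'all down' along z, i.e. (|3/2> + |-3/2>)/sqrt(2) in the spin picture
noon = np.array([1, 0, 0, 1]) / np.sqrt(2)
m = np.array([1.5, 0.5, -0.5, -1.5])       # Jz eigenvalues

def fringe(phi):
    """Probability of projecting back onto the N00N state after a phase
    rotation exp(-i*phi*Jz); it oscillates as cos^2(3*phi/2)."""
    rotated = np.exp(-1j * phi * m) * noon
    return abs(np.vdot(noon, rotated)) ** 2

print(fringe(0.0))            # 1.0: a bright fringe
print(fringe(np.pi / 3))      # 0.0: already dark after only 60 degrees
print(fringe(2 * np.pi / 3))  # 1.0 again: three full fringes per turn
```

It is these fast, fine fringes that carry the extra phase information — but only a measurement that resolves individual photons, rather than the total intensity, can read them out.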
So where does all of this leave us? Shalm and colleagues have elegantly demonstrated the connection between highly entangled states and the squeezing of quantum states by showing the continuum of quantum states with reduced measurement uncertainty. And this idea applies not only to photons: similar effects were recently observed in an atomic spin system7. So, are physicists now the masters of quantum uncertainty? Well, not quite. It remains a difficult proposition to highly squeeze large numbers of quantum systems, and a few photons is a long way from the large entangled states required for practical application of quantum-enhanced precision measurement. But our control of the quantum world is always improving, and we may one day see optimum-precision measurements with large ensembles. In the meantime, we can look for applications of those squeezed and entangled states that can be made8,9,10. And we can admire the peculiar beauty and symmetries of the quantum world.