While classical computing is well established and powerful in its own right, it cannot address certain problems beyond a given level of complexity. Quantum computing, on the other hand, can in principle go beyond these limitations: it relies on exotic physical phenomena, such as the superposition and entanglement of quantum bits, to perform complex computational protocols and to tackle significantly more demanding problems faster and more reliably. Several industries are starting to take notice too: banks, for example, are looking into using quantum algorithms to solve computationally heavy financial problems, such as portfolio optimization.

Photograph of the interferometric part of the Jiuzhang experiment. Credit: Chaoyang Lu.

There is now a well-established ecosystem of academics, start-ups and companies (such as Google, IBM and Microsoft) working on both software and hardware, supported by several national and international funding initiatives. At the moment several platforms, such as superconducting qubits, trapped ions and photons, are being developed for quantum computation and information protocols, some at more advanced stages than others.

A major result from Google's Quantum Artificial Intelligence Lab was published in 2019 [1]. The Google quantum computer, called Sycamore, is based on 53 superconducting quantum bits (qubits) kept at 30 mK. Sycamore managed to obtain a million samples of the output of a pseudo-random quantum circuit in 200 s, whereas a supercomputer running classical algorithms would need ~10,000 years, according to the team's estimates. This was the first time a quantum machine was shown to outperform conventional computers on at least one specific task, and thus the first ever demonstration of quantum advantage. The result was, however, challenged shortly after by the IBM team [2], who estimated that improved classical algorithms should be able to complete the same task within just a few days instead of thousands of years.
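To make the task concrete, here is a minimal sketch (my own illustration in Python/NumPy, not Google's actual gate set or protocol) of random circuit sampling on a handful of qubits: build a pseudo-random circuit, apply it to a statevector, and sample bitstrings from the resulting output distribution. Sycamore performs this kind of sampling on 53 qubits, where the statevector can no longer be handled exactly on an ordinary computer.

```python
import numpy as np

# Toy random-circuit sampling on n qubits via direct statevector simulation.
# The gate set and circuit layout are illustrative only, not the Sycamore ones.

rng = np.random.default_rng(0)
n = 5                       # number of qubits (Sycamore uses 53)
dim = 2 ** n

def random_u2():
    """Draw a random 2x2 unitary (single-qubit gate) via QR decomposition."""
    z = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

def apply_1q(state, gate, q):
    """Apply a single-qubit gate to qubit q of the statevector."""
    psi = state.reshape([2] * n)
    psi = np.tensordot(gate, psi, axes=([1], [q]))
    return np.moveaxis(psi, 0, q).reshape(dim)

def apply_cz(state, q1, q2):
    """Apply a controlled-Z gate between qubits q1 and q2."""
    psi = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[q1], idx[q2] = 1, 1
    psi[tuple(idx)] *= -1
    return psi.reshape(dim)

# Pseudo-random circuit: alternating layers of random single-qubit gates
# and CZ entangling gates between neighbouring qubits.
state = np.zeros(dim, dtype=complex)
state[0] = 1.0
for layer in range(8):
    for q in range(n):
        state = apply_1q(state, random_u2(), q)
    for q in range(layer % 2, n - 1, 2):
        state = apply_cz(state, q, q + 1)

# "Sampling" = measuring all qubits repeatedly in the computational basis.
probs = np.abs(state) ** 2
probs /= probs.sum()
samples = rng.choice(dim, size=10, p=probs)
print([format(int(s), f"0{n}b") for s in samples])
```

For 53 qubits the statevector would hold 2⁵³ ≈ 9 × 10¹⁵ complex amplitudes, which is why the classical cost estimates are so sensitive to the simulation strategy used.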

A more powerful second demonstration of quantum advantage was reported a few months ago [3], this time based on photons at room temperature. The team of Jian-Wei Pan and Chao-Yang Lu, from the University of Science and Technology of China (USTC), focused on solving a different task, called boson sampling. It involves sending several indistinguishable bosons (particles that follow Bose–Einstein statistics and have integer spin) through the ports of a linear interferometer consisting of many mirrors and beamsplitters. If two identical bosons hit the same beamsplitter at the same moment, they exit through the same output port (an effect known as photon bunching). As the photons (which are bosons) go through the system, their positions and paths get randomized. Aaronson and Arkhipov argued about a decade ago that a classical computer would be unable to calculate the output of such a system, and thus that boson sampling can be used to demonstrate quantum advantage [4]. The USTC team had already reported boson sampling with 14 detected photons in 2019 [5], but in 2020 they managed to scale their setup up to 40–70 photons.
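To give a flavour of why this task is believed to be classically hard, here is an illustrative sketch of my own in Python/NumPy (not the USTC analysis): for n single photons entering an m-mode interferometer described by a unitary U, the probability of a given collision-free output pattern is the squared modulus of the permanent of an n × n submatrix of U, and evaluating permanents takes time exponential in the photon number. Note that Jiuzhang actually implements a Gaussian variant of boson sampling with squeezed light, whose output probabilities involve a related matrix function (the hafnian) rather than the permanent.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)

def haar_unitary(m):
    """Draw an m x m Haar-random unitary (the linear interferometer)."""
    z = rng.normal(size=(m, m)) + 1j * rng.normal(size=(m, m))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

def permanent(a):
    """Permanent of a square matrix via Ryser's formula (exponential in n)."""
    n = a.shape[0]
    total = 0.0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            total += (-1) ** k * np.prod(a[:, cols].sum(axis=1))
    return (-1) ** n * total

m, n = 8, 3                 # 8 modes, 3 single photons (Jiuzhang uses 100 modes)
U = haar_unitary(m)
inputs = [0, 1, 2]          # photons injected into the first three ports
outputs = [4, 5, 6]         # one particular collision-free detection pattern

sub = U[np.ix_(outputs, inputs)]
p_quantum = abs(permanent(sub)) ** 2            # indistinguishable bosons
p_classical = permanent(np.abs(sub) ** 2).real  # distinguishable particles
print(f"indistinguishable: {p_quantum:.3e}, distinguishable: {p_classical:.3e}")
```

The exponential cost of evaluating these matrix functions as the photon number grows is the intuition behind the hardness argument; with tens of detected photons, as in Jiuzhang, direct classical evaluation becomes enormously expensive.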

The details of their setup alone are impressive; the very dense and well-aligned optical table is pictured above. The USTC team built an interferometer with 100 inputs and outputs, containing 300 beamsplitters and 75 mirrors. They fed it with 50 indistinguishable squeezed photons originating from parametric down-conversion and used 100 highly efficient single-photon detectors to measure the output. Their 76-photon quantum computer, called Jiuzhang after an ancient Chinese mathematical handbook, was able to provide an answer in 200 s, making it 10¹⁴ times faster than classical supercomputers (the Sunway TaihuLight supercomputer would need 2.5 billion years for the same calculation).
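As a rough sanity check on that figure (my own back-of-the-envelope arithmetic, using only the numbers quoted above), dividing the estimated classical runtime by Jiuzhang's 200 s indeed gives a ratio of order 10¹⁴:

```python
# Back-of-the-envelope check of the quoted speedup, using the figures above.
seconds_per_year = 365.25 * 24 * 3600
classical_time_s = 2.5e9 * seconds_per_year   # Sunway TaihuLight estimate
quantum_time_s = 200.0                        # Jiuzhang sampling run
print(f"speedup ~ {classical_time_s / quantum_time_s:.1e}")   # ~3.9e+14
```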

Much like the Google claim [6], the photonic quantum advantage claim was quickly scrutinized by the community. Some researchers [7] pointed out that the USTC authors may also have underestimated the capabilities of classical computers on this particular problem. The main drawback of their implementation is optical loss: a large fraction of the input photons (up to 70%) is lost as they pass through the beamsplitters, which means that only the remaining 30% can be detected. The high loss rate and the resulting noise could mean that classical sampling is enough to simulate the quantum system [8,9].
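To see why loss is so damaging, here is a toy calculation of my own, using the 70% figure quoted above and assuming, purely for illustration, 50 single photons that are each lost independently (this is deliberately simplified and not Jiuzhang's actual photon statistics, since squeezed-light sources emit variable photon numbers): the number of surviving photons is binomially distributed, so high-photon-number detection events become exceedingly rare, and the output is easier to mimic with noisy classical sampling.

```python
from math import comb

# Toy loss model: each of 50 input photons survives independently with
# probability t = 0.3 (~70% loss), so the detected photon number is binomial.
n, t = 50, 0.3
pmf = [comb(n, k) * t**k * (1 - t)**(n - k) for k in range(n + 1)]
print(f"expected detected photons: {n * t:.0f}")             # 15
print(f"P(detect 40 or more photons): {sum(pmf[40:]):.1e}")  # vanishingly small
```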

Chao-Yang Lu remarks that quantum computational advantage should not be considered a static, single-shot achievement but rather a long-term process, as both classical simulation strategies and quantum devices continue to improve. He hopes that Jiuzhang's result will stimulate more powerful classical simulations. Meanwhile, the USTC team is upgrading their system and collecting data with Jiuzhang 2.0, which features lower photon loss and output photon numbers beyond 100.

Jiuzhang has the disadvantage of not being an easily scalable architecture, and, unlike Google's Sycamore, it is only partially programmable: for the moment only the phase and amplitude of the input squeezed states can be reconfigured, and the team will look into changing the beam-splitting ratios and phases in the near future. Based on this 'partial programmability', they aim to demonstrate a quantum machine-learning application later this year.

These results confirm that yet another quantum computing platform is able to go beyond the limitations of classical computers.

These remarkable achievements underline that quantum computing is a very active field of research (and, of late, business). We should all be ready for even more exciting results to come.