A recent phase of excitement in quantum computing and quantum machine learning has attracted substantial funding to develop the technology, with big tech companies such as NVIDIA, Amazon, Microsoft, Google and IBM, alongside start-ups and academic labs, conducting fundamental and applied research in this emerging field. The excitement is undoubtedly fuelled, at least in part, by several experimental demonstrations in recent years that have provided evidence of a quantum computational advantage in specific tasks: examples in which classical computers could complete the same task only in a substantially, indeed impractically, longer time1,2.

Could a revolution in quantum computing indeed be underway as advances in quantum algorithms and device technology continue apace? Realizing suitable hardware remains a hurdle. There is no shortage of approaches to building qubits (the building blocks of a quantum computer), with some of the most common based on trapped ions, superconducting circuits and quantum optical systems, although silicon-based options3 and phonon-based options4 are also currently under investigation. However, a general challenge is that qubit quantum states are easily disturbed by interactions with other qubits and with the environment. Errors in qubit operation can be corrected with well-developed protocols, but these require large numbers of qubits5. The reality is that the fabrication of quantum chips with sufficient numbers of qubits and sufficiently low error rates for full-scale fault-tolerant quantum computing is currently out of reach.
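
A rough sense of why error correction is so qubit-hungry comes from the simplest textbook example, the three-qubit bit-flip repetition code, in which three physical bits protect one logical bit against a single kind of error. The sketch below simulates it classically in Python; the error rate, decoder and pure-Python simulation are illustrative assumptions, and real quantum codes such as the surface code need far more physical qubits per logical qubit because they must also handle phase errors.

```python
import random

def encode(bit):
    """Encode one logical bit into three physical bits (bit-flip repetition code)."""
    return [bit, bit, bit]

def apply_noise(bits, p):
    """Flip each physical bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    """Majority vote: correct as long as at most one of the three bits flipped."""
    return int(sum(bits) >= 2)

p = 0.05  # assumed physical error rate, chosen only for illustration
trials = 100_000
failures = sum(decode(apply_noise(encode(0), p)) != 0 for _ in range(trials))
print(f"physical error rate: {p}")
print(f"logical error rate : {failures / trials:.4f}")  # about 3*p**2, roughly 0.007
```

The logical error rate falls from p to roughly 3p2, but only at the cost of tripling the qubit count, and the overheads of genuine quantum codes are far larger still.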

In recognition of this state of affairs, the present phase of quantum computing is known as the ‘noisy intermediate-scale quantum’ (NISQ) era, a term that refers to devices operating with noisy qubits that are not error corrected and that therefore have limited, imperfect computational capabilities. Currently, the most powerful quantum processor, made by IBM, has 433 qubits, and the company says it will release a quantum processor with more than 1,000 qubits later this year6.

NISQ devices usually consist of hybrid architectures, wherein parts of the computation are carried out by quantum systems while other tasks are performed by classical processors. Some of these devices can be accessed and manipulated via internet-based cloud services and are often used in proof-of-principle experiments. NISQ devices hold limited promise for the sort of full-scale quantum computing applications that were originally envisioned (for example, factoring large numbers). However, recent research shows that many specialized but still useful tasks are realistically feasible with current NISQ devices.
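
To make the hybrid picture concrete, the following is a minimal sketch, not any vendor's API, of the variational loop that many NISQ algorithms share: a small parameterized circuit (here a single classically simulated qubit) supplies a cost value, and a classical optimizer updates the circuit parameters. The toy cost function, learning rate and step count are assumptions made purely for illustration.

```python
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation; stands in for the 'quantum' part, simulated classically."""
    return np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                     [np.sin(theta / 2),  np.cos(theta / 2)]])

def cost(theta):
    """Expectation value <Z> of RY(theta)|0>; the quantity the loop minimizes."""
    state = ry(theta) @ np.array([1.0, 0.0])
    return state[0] ** 2 - state[1] ** 2  # equals cos(theta)

# Classical outer loop: parameter-shift gradients plus plain gradient descent.
theta, lr = 0.1, 0.4
for _ in range(50):
    grad = 0.5 * (cost(theta + np.pi / 2) - cost(theta - np.pi / 2))  # parameter-shift rule
    theta -= lr * grad

print(f"theta = {theta:.3f} (expect ~pi), cost = {cost(theta):.4f} (expect -1)")
```

On a real device, each evaluation of `cost` would run on quantum hardware accessed through a cloud service, while the gradient step stays on a classical processor.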

An area of active research is speeding up machine learning with NISQ devices7. One of the first experimental implementations of quantum supervised machine learning used a five-qubit chip based on superconducting circuits and employed two types of algorithm: quantum kernel estimation and a quantum variational classifier8. Another application is quantum reinforcement learning, for which a quantum speed-up has been experimentally demonstrated on a nanophotonic processor by means of a quantum communication channel between the learning agent and the environment9. Adversarial quantum machine learning is also attracting increasing interest, as discussed by West et al. in a recent Perspective article10. Taking a different angle on the near-term use of NISQ devices, an active research direction is to harness machine learning for the control of quantum states in simulations of complex quantum systems, as shown by Metz and Bukov in this issue of Nature Machine Intelligence.
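
For readers unfamiliar with the first of these algorithms, the idea behind quantum kernel estimation can be sketched in a few lines: classical data are mapped to quantum states by a feature map, and each kernel entry is the squared overlap between two such states, which a classical support vector machine then consumes. The single-qubit angle encoding and the toy data below are assumptions made for illustration; the experiment of ref. 8 used entangling multi-qubit feature maps evaluated on hardware.

```python
import numpy as np

def feature_state(x):
    """Angle-encoding feature map: one data feature becomes one simulated qubit state."""
    return np.array([np.cos(x / 2), np.sin(x / 2)])

def quantum_kernel(x1, x2):
    """Kernel entry: squared overlap |<psi(x1)|psi(x2)>|^2 of the encoded states."""
    return float(np.abs(feature_state(x1) @ feature_state(x2)) ** 2)

# Gram matrix over a toy one-dimensional dataset; a classical SVM would consume it.
X = [0.1, 0.5, 2.0, 2.4]
K = np.array([[quantum_kernel(a, b) for b in X] for a in X])
print(np.round(K, 3))
```

The hoped-for advantage arises when the feature map produces states whose overlaps are hard to compute classically, which this deliberately simple single-qubit version is not.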

Although the field of quantum machine learning is clearly progressing, researchers are venturing into largely uncharted territory, with many roadblocks to overcome before the technology can become practically useful. One technical bottleneck is the representation of classical data as quantum states: loading a large classical dataset into a quantum processor can be costly enough to cancel any downstream speed-up. Quantum algorithms therefore seem especially likely to offer a quantum computational advantage in applications in which the data handled and processed are natively quantum, as is the case for simulations of quantum chemistry and quantum solid-state systems in regimes that are difficult to represent on classical computers.
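
One common encoding strategy, shown schematically below, is amplitude encoding, which packs a classical vector of length 2^n into the amplitudes of an n-qubit state. This sketch performs only the classical bookkeeping; preparing such a state on actual hardware generally requires circuits whose depth grows with the data size, which is precisely the bottleneck in question. The example data are arbitrary.

```python
import numpy as np

def amplitude_encode(x):
    """Pack a length-2**n classical vector into the amplitudes of an n-qubit state."""
    x = np.asarray(x, dtype=float)
    n = int(np.log2(x.size))
    assert 2 ** n == x.size, "vector length must be a power of two"
    return x / np.linalg.norm(x)  # only the data's direction survives; overall scale is lost

data = [0.5, 1.5, 2.0, 1.0]    # four classical numbers ...
state = amplitude_encode(data)  # ... become the amplitudes of a two-qubit state
print(state, "-> squared amplitudes sum to", np.sum(state ** 2))
```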

It is key to be honest about what quantum computers can do, now and in the near future, and whether they present an advantage over classical computing approaches — the latter are also advancing, constantly raising the bar. Claims about current and imminent advances in quantum devices need to be made with a clear understanding of the inherent limitations of the current quantum hardware. At the same time, the field should be encouraged to broaden the range of problems for which quantum computing and quantum machine learning can be used.