
Volume 6 Issue 5, May 2024

Generating quantum circuits

Quantum computing promises to be a transformative technology, but there are several challenges in realizing quantum computing hardware. One is the generation of quantum circuits that perform desired operations. Denoising diffusion models excel at this task, providing a powerful and flexible method to create circuits in a variety of scenarios. Given a text prompt that describes a quantum operation, they rely on iteratively denoising an initially noisy canvas until the desired quantum circuit is reached.
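The iterative denoising described above can be caricatured in a few lines. This is an illustrative toy sketch only: `toy_denoiser` is a hypothetical stand-in for the trained, prompt-conditioned network of Fürrutter et al., and the small `target` array is a placeholder for an encoded quantum circuit, not a real circuit representation.

```python
import numpy as np

def toy_denoiser(canvas, t, target):
    """Hypothetical denoiser: nudges the noisy canvas toward a target
    pattern (a placeholder for the desired quantum circuit). A real
    diffusion model would instead apply a learned, prompt-conditioned
    denoising network at step t."""
    return canvas + 0.5 * (target - canvas)

def generate(target, steps=50, seed=0):
    """Reverse-diffusion loop: start from pure noise and denoise iteratively."""
    rng = np.random.default_rng(seed)
    canvas = rng.normal(size=target.shape)  # initially noisy canvas
    for t in reversed(range(steps)):
        canvas = toy_denoiser(canvas, t, target)  # one denoising step
    return canvas

target = np.array([[1.0, 0.0], [0.0, 1.0]])  # placeholder "circuit" encoding
result = generate(target)
print(np.allclose(result, target, atol=1e-3))  # prints True
```

Each step halves the distance to the target, so the noise is driven out geometrically; the trained models replace this hand-coded nudge with a network that has learned which denoising direction yields a valid circuit for the given prompt.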

See Fürrutter et al.

Image: Harald Ritsch. Cover design: Amie Fernandez


  • Personalized LLMs built with the capacity for emulating empathy are right around the corner. The effects on individual users need careful consideration.






Comment & Opinion

  • Most research efforts in machine learning focus on performance and are detached from an explanation of the model's behaviour. We call for a return to the basics of machine learning methods, with more focus on developing a fundamental understanding grounded in statistical theory.

    • Diego Marcondes
    • Adilson Simonis
    • Junior Barrera

Books & Arts



  • The central assumption in machine learning that data are independent and identically distributed does not hold in many reinforcement learning settings, as the experiences of reinforcement learning agents are sequential and intrinsically correlated in time. Berrueta and colleagues use the mathematical theory of ergodic processes to develop a reinforcement learning framework that decorrelates agent experiences and is capable of learning in single-shot deployments.

    • Thomas A. Berrueta
    • Allison Pinosky
    • Todd D. Murphey
  • Achieving the promised advantages of quantum computing relies on translating quantum operations into physical realizations. Fürrutter and colleagues use diffusion models to create quantum circuits that are based on user specifications and tailored to experimental constraints.

    • Florian Fürrutter
    • Gorka Muñoz-Gil
    • Hans J. Briegel
  • Large language models can be queried to perform chain-of-thought reasoning on text descriptions of data or computational tools, which can enable flexible and autonomous workflows. Bran et al. developed ChemCrow, a GPT-4-based agent that has access to computational chemistry tools and a robotic chemistry platform, which can autonomously solve tasks for designing or synthesizing chemicals such as drugs or materials.

    • Andres M. Bran
    • Sam Cox
    • Philippe Schwaller
    Article Open Access
  • Deep learning has led to great advances in predicting protein structure from sequences. Ren and colleagues present here a method for the inverse problem of finding a sequence that results in a desired protein structure, which is inspired by various components of AlphaFold combined with Markov random fields to decode sequences more efficiently.

    • Milong Ren
    • Chungong Yu
    • Haicang Zhang
  • Despite the existence of various pretrained language models for nucleotide sequence analysis, achieving good performance on a broad range of downstream tasks using a single model is challenging. Wang and colleagues develop a pretrained language model specifically optimized for RNA sequence analysis and show that it can outperform state-of-the-art methods in a diverse set of downstream tasks.

    • Ning Wang
    • Jiang Bian
    • Haoyi Xiong
    Article Open Access
  • Methods for molecular structure prediction have so far focused on only the most probable conformation, but molecular structures are dynamic and can change, for example, when performing their biological functions. Zheng et al. use a graph transformer approach to learn the equilibrium distribution of molecular systems and show that this can be helpful for a number of downstream tasks, including protein structure prediction, ligand docking and molecular design.

    • Shuxin Zheng
    • Jiyan He
    • Tie-Yan Liu
    Article Open Access
  • Machine learning-based surrogate models are important to model complex systems at a reduced computational cost; however, they must often be re-evaluated and adapted for validity on future data. Diaw and colleagues propose an online training method leveraging optimizer-directed sampling to produce surrogate models that can be applied to any future data and demonstrate the approach on a dense nuclear-matter equation of state containing a phase transition.

    • A. Diaw
    • M. McKerns
    • M. S. Murillo

