November 2023 Issue

Lagemann, K., Lagemann, C., Taschler, B. et al. Deep learning of causal structures in high dimensions under data limitations.


  • Data-driven surrogate models are used in computational physics and engineering to greatly speed up the evaluation of systems governed by partial differential equations, but training them carries a heavy computational cost. Pestourie et al. combine a low-fidelity physics model with a generative deep neural network and demonstrate improved accuracy–cost trade-offs compared with standard deep neural networks and high-fidelity numerical solvers.

    • Raphaël Pestourie
    • Youssef Mroueh
    • Steven G. Johnson
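To make the idea of a physics-enhanced surrogate concrete, here is a toy sketch (not the authors' method, which uses a generative deep network): a cheap, biased low-fidelity model is used as an extra input feature, and a simple least-squares correction is fitted against a handful of expensive high-fidelity evaluations. All functions and constants below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def high_fidelity(x):
    # Stand-in for an expensive PDE solve (ground truth)
    return np.sin(3 * x) + 0.3 * x

def low_fidelity(x):
    # Cheap, biased physics approximation of the same quantity
    return 0.8 * np.sin(3 * x)

# Fit a linear correction on features [1, x, low_fidelity(x)] via least squares,
# using only 40 "expensive" high-fidelity samples
x_train = rng.uniform(-1, 1, 40)
A = np.column_stack([np.ones_like(x_train), x_train, low_fidelity(x_train)])
coef, *_ = np.linalg.lstsq(A, high_fidelity(x_train), rcond=None)

# Evaluate: the corrected surrogate vs. the raw low-fidelity model
x_test = np.linspace(-1, 1, 100)
pred = np.column_stack([np.ones_like(x_test), x_test, low_fidelity(x_test)]) @ coef
err_surrogate = np.abs(pred - high_fidelity(x_test)).mean()
err_low = np.abs(low_fidelity(x_test) - high_fidelity(x_test)).mean()
```

The point of the sketch is the trade-off named in the summary: a few high-fidelity evaluations plus a cheap physics model beat the cheap model alone at negligible extra cost.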
  • Single-cell transcriptomics has provided a powerful approach to investigate cellular properties at unprecedented resolution. Sha et al. have developed an optimal transport-based algorithm called TIGON that can connect transcriptomic snapshots from different time points to obtain collective dynamical information, including cell population growth and the underlying gene regulatory network.

    • Yutong Sha
    • Yuchi Qiu
    • Qing Nie
    Article | Open Access
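The core step of connecting two snapshots is an optimal transport coupling between cell populations. Below is a generic entropic (Sinkhorn) optimal transport sketch on toy data, not TIGON's dynamical, growth-aware formulation; all sizes and parameters are illustrative.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iter=200):
    """Entropic OT: coupling P with marginals a, b that trades off transport
    cost <P, C> against entropic regularization (Sinkhorn iterations)."""
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)   # match column marginals
        u = a / (K @ v)     # match row marginals
    return u[:, None] * K * v[None, :]

# Toy "snapshots": cells in a 2-D expression space at two time points
rng = np.random.default_rng(0)
x0 = rng.normal(0.0, 1.0, size=(5, 2))   # cells at t0
x1 = rng.normal(0.5, 1.0, size=(6, 2))   # cells at t1

# Cost = squared Euclidean distance between cells, rescaled to [0, 1]
C = ((x0[:, None, :] - x1[None, :, :]) ** 2).sum(-1)
C = C / C.max()

a = np.full(5, 1 / 5)   # uniform mass over t0 cells
b = np.full(6, 1 / 6)   # uniform mass over t1 cells
P = sinkhorn(a, b, C)   # P[i, j]: how much of cell i's mass flows to cell j
```

The coupling `P` is what lets one read off putative ancestor–descendant relationships between snapshots; TIGON additionally infers growth and regulatory dynamics, which this balanced sketch omits.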
  • A fundamental question in neuroscience is which constraints shape the structural and functional organization of the brain. By bringing biological cost constraints into the optimization process of artificial neural networks, Achterberg, Akarca and colleagues uncover the joint principle underlying a large set of neuroscientific findings.

    • Jascha Achterberg
    • Danyal Akarca
    • Duncan E. Astle
    Article | Open Access
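One common way to express such a biological cost constraint, sketched generically here rather than as the paper's exact objective: embed units at spatial positions and add a wiring-cost penalty in which long, strong connections are expensive, on top of the usual task loss. All names and scales are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy spatially embedded layer: each unit sits at a 2-D position
n = 8
pos = rng.uniform(0, 1, size=(n, 2))     # unit coordinates (assumed)
W = rng.normal(0, 0.5, size=(n, n))      # recurrent weights (assumed)

# Euclidean distance between every pair of units
D = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)

def wiring_cost(W, D, lam=0.1):
    """L1 weight penalty scaled by connection length:
    long-range, strong links cost more than short, weak ones."""
    return lam * np.sum(np.abs(W) * D)

task_loss = 1.23                          # stand-in for e.g. cross-entropy
total_loss = task_loss + wiring_cost(W, D)
```

Minimizing `total_loss` by gradient descent pressures the network toward short, sparse connectivity, which is the kind of joint constraint the summary refers to.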
  • Deep learning is a powerful method for processing large datasets and has proved useful in many scientific fields, but its models are highly parameterized, which often makes interpretation and generalization challenging. David Gleich and colleagues develop a method rooted in computational topology, starting from a graph-based topological representation of the data, to help assess and diagnose predictions from deep learning and other complex prediction methods.

    • Meng Liu
    • Tamal K. Dey
    • David F. Gleich
    Article | Open Access
  • Continual learning, an innate ability of biological intelligence to accommodate real-world changes, remains challenging for artificial intelligence. Wang, Zhang and colleagues model key mechanisms of a biological learning system, in particular active forgetting and parallel modularity, incorporating neuro-inspired adaptability that improves continual learning in artificial intelligence systems.

    • Liyuan Wang
    • Xingxing Zhang
    • Yi Zhong
  • Further progress in AI may require learning algorithms to generate their own data rather than assimilate static datasets. A Perspective in this issue proposes that they could do so by interacting with other learning agents in a socially structured way.

  • The rise of artificial intelligence (AI) has been accompanied by rapidly growing energy demand, which threatens to outweigh its promised benefits. To steer AI onto a more sustainable path, quantifying and comparing its energy consumption is key.

    • Charlotte Debus
    • Marie Piraud
    • Markus Götz
  • AI-generated media are on the rise and are here to stay. Regulation is urgently needed, but in the meantime creators, users and content distributors should adopt practices and tools for the responsible generation, sharing and detection of AI-generated content.

  • Advances in DNA nanoengineering promise the development of new computing devices within biological systems, with applications in nanoscale sensing, diagnostics and therapeutics.