Every year, as October approaches, excitement and speculation about the announcement of the newest Nobel laureates ripple through the scientific community. By spotlighting work that has conferred the “greatest benefit to humankind”, the Nobel Prize never fails to inspire scientists in all fields and to reignite a passion for research and scientific progress. For computational scientists, this inspiration can come in many forms, as the realm of possibilities for applying computational tools continues to grow.


Last year, we looked back at the history of the Nobel Prize and asked how computational science has influenced, directly and indirectly, previously awarded topics, as well as how it has been explicitly acknowledged with awards. In this Focus issue, we extend those discussions in conversation with various experts — including Nobel laureates, researchers who worked alongside Nobel laureates, and a Nobel Prize Committee member — not only to celebrate the diversity of computational science contributions to the fields of chemistry and physics, but also to look ahead at the challenges that lie before us. While many influential models have been recognized with a Nobel Prize, here we focus on contributions that have made existing theory practically computable, sometimes with only limited computational power available at the time.

One of the awards that we highlight is the 1998 Nobel Prize in Chemistry, which recognized two pioneers in the field of quantum chemistry: Walter Kohn and John Pople. Kohn was recognized for the development of density functional theory (DFT), while Pople was recognized for the development of computational methods in quantum chemistry. Practically speaking, Kohn’s work made quantum chemical calculations computationally feasible. Pople’s quantum chemistry software — Gaussian — transformed the field of computational chemistry, as it enabled researchers to study molecules, their properties, and their interactions theoretically, effectively bringing Kohn’s DFT to life. We had the opportunity to speak with two researchers who worked with these Nobel laureates and who, in their own right, have greatly influenced the field as well: Lu Sham and Martin Head-Gordon.

Lu Sham worked closely with Kohn on what are now known as the Kohn–Sham equations, which introduced a simplified approach to approximating the kinetic energy and made DFT a more practical tool for answering questions in materials science and chemistry. This work contributed substantially to Kohn’s Nobel Prize, yet Sham did not expect it to become so influential when they were originally developing the theory: “In the beginning, I did not think it would blossom as much as it did,” noted Sham during our conversation with him. Their work was a seminal contribution to the scientific community, and many challenges remain to be addressed. One such challenge is the use of DFT methods for quantum materials and strongly correlated systems; a Comment by Alex Zunger discusses this challenge in more detail, as well as the potential and opportunities to bridge the gap between DFT and quantum materials.
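For readers less familiar with the formalism, the Kohn–Sham construction can be written schematically (in atomic units, and not in the notation of the original paper) as a set of single-particle equations for non-interacting electrons moving in an effective potential, from which the electron density is rebuilt:

$$\Big[-\tfrac{1}{2}\nabla^{2} + v_{\mathrm{eff}}(\mathbf{r})\Big]\,\varphi_{i}(\mathbf{r}) = \varepsilon_{i}\,\varphi_{i}(\mathbf{r}), \qquad n(\mathbf{r}) = \sum_{i=1}^{N}\lvert\varphi_{i}(\mathbf{r})\rvert^{2},$$

$$v_{\mathrm{eff}}(\mathbf{r}) = v_{\mathrm{ext}}(\mathbf{r}) + \int\frac{n(\mathbf{r}')}{\lvert\mathbf{r}-\mathbf{r}'\rvert}\,\mathrm{d}\mathbf{r}' + v_{\mathrm{xc}}(\mathbf{r}).$$

The orbitals give the kinetic energy of the fictitious non-interacting system essentially exactly, while everything that remains unknown is gathered into the exchange–correlation potential $v_{\mathrm{xc}}$; approximating that term well, particularly for strongly correlated systems, is precisely the kind of open challenge that Zunger’s Comment addresses.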

Martin Head-Gordon — who was mentored by Pople during his PhD — is known in the field of chemistry for his work on DFT and density functionals, as well as for his contributions to developing Gaussian and, later, Q-Chem. Head-Gordon’s career has been profoundly influenced by his time working with Pople: “He has been the biggest single scientific influence on me,” stated Head-Gordon. Today, Gaussian and Q-Chem remain two of the most widely used commercially available quantum chemistry software packages. For Q-Chem, there are still many horizons to be explored, as discussed by Head-Gordon: “We are moving towards more complex systems in a variety of ways, while we seek to improve core algorithms as our main mission.”

We also highlight, in this Focus issue, the 2013 Nobel Prize in Chemistry, which recognized Arieh Warshel, Michael Levitt, and Martin Karplus for their development of multiscale models for complex chemical systems. The establishment of quantum mechanics/molecular mechanics (QM/MM) methods has allowed scientists in the field to accurately model large systems in a computationally tractable way. We spoke with Warshel and discussed his current research, as well as some of the challenges that he has faced in his research career. For example, he noted that it has been difficult to “convince people that computers are the only way to definitely understand how enzymes work.” Nevertheless, the work of Warshel, Levitt, and Karplus became widely accepted by the research community: “What captivated the public relatively fast was the simple idea of separating QM and MM, rather than a way to make it more accurate,” said Warshel. Indeed, this simplicity has helped QM/MM methods remain indispensable for computational scientists across various disciplines.
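The simplicity that Warshel mentions can be sketched, very roughly, as the additive form of the QM/MM energy, in which a small chemically active region (for example, an enzyme’s active site) is treated quantum mechanically and the rest of the system classically:

$$E_{\mathrm{total}} = E_{\mathrm{QM}}(\text{active region}) + E_{\mathrm{MM}}(\text{environment}) + E_{\mathrm{QM/MM}}(\text{coupling}),$$

where the coupling term typically collects the electrostatic and van der Waals interactions between the two regions. This division of labor keeps the expensive quantum treatment confined to the few atoms where bonds are made and broken, which is what makes large biomolecular systems computationally tractable.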

Important contributions of the computational science community to the physics domain are also discussed in this issue. We had the opportunity to speak with Saul Perlmutter, a 2011 Nobel Laureate in Physics. Perlmutter received the Nobel Prize for the discovery of the accelerating expansion of the Universe through observations of distant supernovae. Part of the work entailed identifying tens of thousands of galaxies in wide-field images and then detecting the appearance of supernovae in those galaxies. As Perlmutter discussed with Nature Computational Science, “This was a perfect job for a computer to do.” The work was a feat of computational prowess, and for Perlmutter, the timing could not have been better: “It was just the right time, technologically speaking, to do this,” asserted Perlmutter. “Computational technology was such a key part of this work.”
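The principle behind that computational search can be illustrated with a minimal difference-imaging sketch in Python. This is not Perlmutter’s actual pipeline (the function, array names, and threshold below are purely illustrative); it only shows the general idea of subtracting a reference exposure from a new one and flagging bright residuals as candidate transients.

```python
import numpy as np

def find_transient_candidates(reference, new_image, n_sigma=5.0):
    """Flag pixels that brightened significantly between two aligned exposures.

    A toy illustration of difference imaging: subtract a reference image of the
    same patch of sky from a new exposure and keep residuals well above the noise.
    Both inputs are assumed to be aligned, background-subtracted 2D arrays.
    """
    diff = new_image - reference                      # residual ("difference") image
    noise = np.std(diff)                              # crude global noise estimate
    candidates = np.argwhere(diff > n_sigma * noise)  # (row, col) of bright residuals
    return diff, candidates

# Toy usage: a noisy patch of sky in which one new point source appears.
rng = np.random.default_rng(0)
reference = rng.normal(0.0, 1.0, size=(64, 64))
new_image = reference + rng.normal(0.0, 0.2, size=(64, 64))
new_image[40, 21] += 25.0                             # a "supernova" switches on
_, candidates = find_transient_candidates(reference, new_image)
print(candidates)                                     # should include [40, 21]
```

A production survey pipeline would, of course, add image registration, point-spread-function matching, and per-pixel noise models, but the core task of sifting candidate transients out of wide-field images is exactly the kind of repetitive, large-scale comparison that computers handle far better than humans.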

More recently, in 2021, the Nobel Prize in Physics recognized Giorgio Parisi, Syukuro Manabe, and Klaus Hasselmann for their work on understanding complex physical systems, such as the Earth’s climate, and making them practically computable. We had a chance to speak with a member of the Nobel Committee for Physics, John Wettlaufer, who provided a peek behind the curtain of the committee’s decision by discussing how these contributions stood out. To further advance the field and address the challenges related to climate change, Wettlaufer pointed out that the computational emphasis should be placed on data and data-driven approaches, an effort that requires multidisciplinary collaboration: “This sounds like a cliché, but it really doesn’t work if people don’t speak each other’s languages,” noted Wettlaufer. A Comment by Mojib Latif also discusses the ongoing challenges and how advanced Earth system and global climate models can help answer pressing questions about mitigating the anthropogenic drivers of climate change and global warming.

Other Nobel Prizes may not have directly recognized the contributions of the computational science community, but the corresponding research has been greatly enriched by computation. One example is the 2020 Nobel Prize in Chemistry, which honored Emmanuelle Charpentier and Jennifer Doudna for their development of a method for genome editing using the CRISPR–Cas9 genetic scissors. A Comment by Lei Stanley Qi — who had Doudna as one of his academic advisors during his PhD — discusses how computational analysis has aided the discovery of CRISPR systems through the understanding of CRISPR’s natural function as an adaptive immune system against viral infection, and how computational science empowers the further development of CRISPR as a genome editing tool.

Interestingly, some common themes emerged in these conversations and pieces. One recurring message was the importance of a collaborative relationship between experimentalists and theorists. For instance, Sham stated that DFT can be used as a first run to further guide experimental work, and Perlmutter noted that experiments and observations have helped to solidify computational predictions in his field. But, as Head-Gordon noted, while a feedback cycle between theory and experimentation is important, it does not come without challenges, such as ensuring, as much as possible, that what is being modeled is also what is observed experimentally. Another recurring message was the importance of multidisciplinary research: for instance, Wettlaufer pointed out that multidisciplinary collaborations are required to make the most out of data-driven climate research, while Qi noted that computational tools developed in other areas, such as protein structure prediction algorithms, could substantially increase the potential of CRISPR technology. These commonalities suggest that, even though the contributions highlighted here come from different fields and vary in nature (from modeling and theory to software development), they share related features and face similar challenges, reflecting the nature of computational science research.

In anticipation of the 2022 Nobel Prize announcements, which will take place from 3 to 10 October, we invite you to explore our Focus issue and its many conversations and commentaries on how computational science contributions have shaped science and paved the way for future developments.