This might not come as a shock to the computational science community, but computational and mathematical modeling have been extremely powerful tools in science. Take the ongoing pandemic as a timely example. Epidemiologists and disease modeling experts have been working together to build mathematical models and run simulations to better understand how SARS-CoV-2 impacts populations. This, in turn, has helped governments devise policies and non-pharmaceutical interventions to slow the spread of the virus [1]. Physics- and machine learning-based models have also been used by the research community to study the virus and to find potential drug-related solutions to the disease [2], as described in a Perspective in this issue. Needless to say, COVID-19 is just one example (albeit an important one): other areas of science, ranging from the physical sciences to the life sciences, have vastly benefited from in silico experiments that address various scientific problems. The increasing availability of high-performance computing resources, coupled with the development of new, efficient hardware architectures, has made it possible to perform more computationally expensive, detailed simulations in unprecedented ways, deepening our understanding of the world around us.

As these modeling capabilities become more comprehensive and quantitative, comparing model predictions with experimental data (whenever possible) is a must, not only to validate the models, but also to make their predictions more reliable and thereby motivate further developments in science. In other words, collaborations between theorists and experimentalists are crucial for accelerating research.

Experiments can help theorists (that is, those who work with theoretical, mathematical and/or computational modeling) by providing a ‘reality check’ for their models. Designing detailed, realistic models requires experimental data that capture precise information about a system’s behavior. Advances in experimental instrumentation across fields have made it possible to collect an impressive amount of high-fidelity data that can be used to develop powerful models. Developmental biology, for instance, has been able to implement incredibly detailed and insightful models thanks to advances in molecular biology and genome sequencing [3]. In materials science, multiscale modeling has flourished with the development of microscopy techniques that allow the crystallographic structure of materials to be imaged at the atomic scale [4].

But theorists are not the only ones who can benefit from this synergy. Experimentalists can take advantage of the simplicity, efficiency and insights of models to perform what-if analyses and narrow down the design of new studies. Computational screenings using physics-based simulations and machine learning can be used, for example, to guide the design of sustainable energy technologies [5] and to examine drug repurposing candidates that inform clinical trials in response to new diseases [2]. Models can also be used when experiments are too dangerous [6], or too expensive and time-consuming, to be conducted — the protein folding problem being a case in point.

Of course, this synergy comes with challenges, since each side has its own technical limitations. Theoretical predictions and experimental observations do not always match, and reconciling them requires careful evaluation from both sides through collaboration, not the assigning of blame. In addition, computational and mathematical modeling can advance substantially faster than experimentation, since experimentalists might need instruments that are not yet available to make their observations. At the same time, modeling can also take time to develop if more computational power is needed to run comprehensive simulations. These challenges, however, can become opportunities. For example, theorists can build models for systems where experimental data are not available or are too noisy, helping experimentalists gain insights into future developments.

While these collaborations might not be common, be it for lack of communication or lack of funding, the computational science community should champion them and strive to establish them more widely. We believe that this synergy has the power to advance research in a more streamlined fashion and, consequently, at a faster pace. The research community, and the world as a whole, can certainly benefit from this.