Materials modelling: The frontiers and the challenges

Nature Materials 15, 381–382
doi:10.1038/nmat4613

Materials simulations have become a dominant force in the world of science and technology. The intellectual challenges lying ahead to sustain such a paradigm shift are discussed.

Computer simulations are ubiquitous in contemporary research in materials science. The reasons are manifold and encompass the intellectual excitement of deriving macroscopic performance from microscopic foundations, the capability to test and refute different hypotheses for an observed property, the predictive power that quantum-mechanical simulations have brought to the field, and the design strategies that high-throughput computing can uncover. The Thomas Young Centre, the London Centre for the Theory and Simulation of Materials, was created ten years ago to act as a focal point and a think tank for the current discourse in the field, and has performed admirably in this role. It was thus fitting to celebrate this anniversary with a roadmap symposium on the Frontiers of Materials Modelling, to brainstorm on current successes, failures, insights and needs.

Each of the ten talks in the symposium illuminated different corners of the materials world: atomistic simulations, multiscale modelling and self-assembly; applications in energy harvesting, metal plasticity and protein biochemistry; methods that deal with the complexity of quantum mechanics and methods that try to machine learn that complexity. Fundamental science was not spared, and it was shown how to rule out multi-Higgs theories by looking very carefully at multiferroics. Two cross-cutting themes recurred: the first one underlined how joint experimental and theoretical work is becoming not only prevalent, but often a necessity because simulations have reached that long-promised stage where they can deliver a much-needed edge in insight and understanding. The second one highlighted materials design, with more and more efforts going into engineering interactions or systematically exploring all variations of a material in the search for a desired structure, property or functionality.

Dominant throughout the symposium was the realization of how pervasive quantum-mechanical techniques have become and how far they have advanced, on their own or as part of multiscale frameworks. In fact, as recently remarked in Nature (ref. 1), when one looks at the most cited papers of all time across all fields, 12 of the top 100 deal with the foundations, the refinements and the algorithms of density functional theory. This surprising reformulation of quantum mechanics was invented by Walter Kohn and collaborators more than 50 years ago and is now the engine under the hood of many or most materials simulations; it is quite remarkable that it has grown into one of the most widespread endeavours in all of science and technology. A glimpse of the shape of things to come was laid out long ago with the 1982 prediction of the structural properties and phase stability of silicon and germanium, made in Berkeley by Marvin Cohen and collaborators (ref. 2). That work was enlightening for what followed — since then, the development of robust, reliable codes and the availability of inexpensive computing power have led to the systematic application of those pioneering techniques to explore a landscape that first revealed itself back then.

So, what is in store for the future of materials simulations? I find it useful to focus on the three core challenges of predictive accuracy, realistic complexity and materials informatics.

Predictive accuracy remains, first and foremost, the hardest to master. Quantum chemistry has largely based itself on wavefunction theories — these can often be systematically improved, but with exploding costs. Materials simulations have greatly benefited from the opposite view of relying on approximate energy functionals that are computationally inexpensive, but become saddled with qualitative failures that are most glaring in some of the most useful functional materials, such as mixed-valence transition metal oxides. The future will bring us much-needed reference data on difficult but small systems that can act as hardcore benchmarks, more refined functional theories that can deal with some of the fundamental qualitative failures of current approximations, and a stronger interplay between spectral theories that focus on the correct description of excitations and ground-state formulations.
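A standard way to see where the approximation enters is the Kohn-Sham decomposition of the total energy, written below in Hartree atomic units: every term is known exactly except the exchange-correlation functional, and it is the approximate forms adopted for this single term that carry the qualitative failures mentioned above.

```latex
% Kohn–Sham decomposition of the total energy (Hartree atomic units).
% T_s[n] is the kinetic energy of the auxiliary non-interacting electrons;
% every term is exact except E_xc[n], which must be approximated in practice.
E[n] \;=\; T_s[n]
      \;+\; \int v_{\mathrm{ext}}(\mathbf{r})\, n(\mathbf{r})\, \mathrm{d}\mathbf{r}
      \;+\; \tfrac{1}{2} \iint \frac{n(\mathbf{r})\, n(\mathbf{r}')}{\lvert \mathbf{r}-\mathbf{r}' \rvert}\, \mathrm{d}\mathbf{r}\, \mathrm{d}\mathbf{r}'
      \;+\; E_{\mathrm{xc}}[n]
```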

Realistic complexity is where most of the progress in materials simulations has taken place in the last 30 years. Rather than limiting ourselves to static calculations, we can now add the environmental effects of temperature, pressure, pH, and chemical and electrochemical potentials, to name just a few. We can predict spectroscopic and microscopic data — from infrared and Raman spectra to nuclear magnetic resonance, electron paramagnetic resonance or angle-resolved photoemission spectroscopy — and contrast these predictions against experimental observations. First-principles calculations can bridge length scales by providing input for mesoscopic or macroscopic theories (see Fig. 1 for an example), by being directly coupled to coarser-grained formulations, or by delivering extensive training datasets of microscopic information from which atomic interactions can be machine learned. Powerful techniques able to accelerate configurational sampling — from metadynamics to replica exchange — can overcome the timescale bottlenecks of simulating rare events.
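As a minimal illustration of the accelerated-sampling techniques just mentioned, the sketch below implements the swap-acceptance step at the heart of replica exchange (parallel tempering); the function name, units and numbers are illustrative placeholders rather than the interface of any particular simulation package.

```python
import math
import random

# Minimal sketch of a replica-exchange (parallel tempering) swap step.
# Two replicas of the same system run at different temperatures; swapping
# their configurations with the Metropolis-like criterion below preserves
# detailed balance while letting high-temperature replicas carry the
# system over barriers that would trap a single low-temperature run.

K_B = 8.617333262e-5  # Boltzmann constant in eV/K

def accept_swap(energy_i, temp_i, energy_j, temp_j):
    """Return True if replicas i and j (energies in eV, temperatures in K)
    should exchange configurations."""
    beta_i = 1.0 / (K_B * temp_i)
    beta_j = 1.0 / (K_B * temp_j)
    delta = (beta_i - beta_j) * (energy_i - energy_j)
    # Always accept energetically favourable swaps; otherwise accept
    # with probability exp(delta) < 1.
    return delta >= 0.0 or random.random() < math.exp(delta)

# Illustrative use: a cold replica stuck in a local minimum and a hot one
# that has just found a lower-energy configuration are likely to swap.
print(accept_swap(energy_i=-3.20, temp_i=300.0, energy_j=-3.35, temp_j=600.0))
```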

Figure 1: Electrical resistivity of graphene as a function of temperature and doping (ρ, electrical resistivity; T, temperature; n, carrier density).

Left panel: first-principles results obtained using a combination of density-functional perturbation theory, many-body perturbation theory and Wannier interpolations to solve the Boltzmann transport equation. Right panel: experimental data. Adapted from ref. 4, American Chemical Society.

Lastly, the capability of performing thousands to hundreds of thousands of calculations automatically provides us with an extensive vista of the materials landscape: systematic database-driven, database-filling protocols can be unleashed to explore the empty quarters of materials space, data analytics can be applied to discover new insights or correlations, and community efforts at curating and verifying materials properties can be formulated.
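To give a flavour of what such a database-driven, database-filling protocol might look like in code, here is a deliberately simplified Python sketch; generate_candidates, compute_property and the screening threshold are hypothetical placeholders standing in for a structure enumerator, a first-principles calculation and a design target, not the API of any existing high-throughput framework.

```python
# Schematic high-throughput screening loop: enumerate candidate
# structures, compute a target property for each, and store the results
# so that later data analytics can search for trends or outliers.
# All helper names below are hypothetical placeholders.

def generate_candidates():
    """Yield candidate compositions, e.g. by element substitution into
    known prototype structures (placeholder formulas here)."""
    return ["Si", "LiFePO4", "BaTiO3", "CsPbI3"]

def compute_property(formula):
    """Stand-in for a first-principles calculation; here it just returns
    a dummy number so that the loop is runnable."""
    return float(len(formula))

def screen(threshold=5.0):
    database = {}  # in real workflows: a persistent, curated repository
    for formula in generate_candidates():
        database[formula] = compute_property(formula)
    # Data-analytics step: keep only candidates above the target value.
    return {f: v for f, v in database.items() if v > threshold}

if __name__ == "__main__":
    print(screen())
```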

In all of this, perhaps the greatest looming challenge is an educational one. Simulation science thrives on development and innovation, rather than execution, and requires a steady mastery of the fundamentals of condensed-matter physics, statistical mechanics, physical chemistry and materials science, supported by knowledge of applied-mathematics numerical techniques, advanced programming models and ever-evolving computer architectures. Traditional departments are far from offering this focus and this mix, and so there is both the opportunity and the urgent need for education models that cultivate twenty-first-century polymaths. Thomas Young, whose milestone contributions more than 200 years ago spanned many different fields in science, medicine and even linguistics (ref. 3), would certainly have been pleased with such a thought.

References

  1. Van Noorden, R., Maher, B. & Nuzzo, R. Nature 514, 550–553 (2014).
  2. Yin, M. T. & Cohen, M. L. Phys. Rev. B 26, 5668 (1982).
  3. Robinson, A. The Last Man Who Knew Everything (Penguin, 2007).
  4. Park, C.-H. et al. Nano Lett. 14, 1113–1119 (2014).


Author information

Nicola Marzari is at the École Polytechnique Fédérale de Lausanne (EPFL), CH-1015 Lausanne, Switzerland.