Machine learning-based neural network potentials often cannot describe long-range interactions. Here the authors present an approach for building neural network potentials that can describe the electronic and nuclear response of molecular systems to long-range electrostatics.
Reaction route planning remains a major challenge in organic synthesis. The authors present a retrosynthetic prediction model using the fragment-based representation of molecules and the Transformer architecture in neural machine translation.
Reinforcement learning algorithms are emerging as powerful machine learning approaches. This paper introduces a machine-learning approach for learning in a continuous action space and applies this strategy to the generation of high-dimensional potential models for a wide variety of materials.
Artificial intelligence is combined with quantum mechanics to break the limitations of traditional methods and create a new general-purpose method for computational chemistry simulations with high accuracy, speed and transferability.
Machine learning faces challenges in catalyst design due to its black-box nature. Here, the authors develop a theory-infused neural network approach that integrates deep learning algorithms with the well-established d-band theory of chemisorption for reactivity prediction of transition-metal surfaces.
Neural networks are known to perform poorly outside of their training domain. Here the authors propose an inverse sampling strategy to train neural network potentials that drives atomistic systems towards high-likelihood and high-uncertainty configurations without the need for molecular dynamics simulations.
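One common way to make the high-uncertainty idea concrete (a generic illustration, not necessarily the authors' exact scheme) is to measure the disagreement of an ensemble of independently trained models and steer sampling towards configurations where that disagreement peaks. A minimal sketch, with hand-picked toy functions standing in for trained potentials:

```python
import statistics

# Toy "ensemble": each member is a different 1-D energy model f(x).
# These quadratic forms are illustrative stand-ins for independently
# trained neural network potentials.
ensemble = [
    lambda x: 0.5 * x * x,
    lambda x: 0.5 * x * x + 0.1 * x,
    lambda x: 0.48 * x * x - 0.05 * x,
]

def uncertainty(x):
    """Ensemble disagreement (population std. dev. of member predictions) at x."""
    return statistics.pstdev(m(x) for m in ensemble)

# Pick the candidate configuration with the highest predicted uncertainty;
# such points are the most informative ones to add to the training set.
candidates = [0.0, 1.0, 2.0, 5.0]
most_uncertain = max(candidates, key=uncertainty)
```

In practice the candidates would be atomistic configurations and the ensemble members neural network potentials; the selection rule stays the same.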
Quantum mechanical calculations of molecular ionized states are computationally expensive. This work extends a previous deep-neural-network approach to transferable neural-network models that predict multiple properties of open-shell anions and cations.
In organic chemistry, synthetic routes for new molecules are often specified in terms of reacting molecules only. The current work reports an artificial intelligence model to predict the full sequence of experimental operations for an arbitrary chemical equation.
Machine learning algorithms offer new possibilities for automating reaction procedures. The present paper investigates automated reaction prediction with Molecular Transformer, the state-of-the-art model for reaction prediction, and proposes a new debiased dataset for a realistic assessment of the model’s performance.
Machine learning potentials do not account for long-range charge transfer. Here the authors introduce a fourth-generation high-dimensional neural network potential including non-local information of charge populations that is able to provide forces, charges and energies in excellent agreement with DFT data.
Semilocal density functionals, while computationally efficient, do not account for non-local correlation. Here, the authors propose a machine-learning approach to DFT that leads to non-local and transferable functionals applicable to non-covalent, ionic and covalent interactions across systems of different sizes.
Development of algorithms to predict reactants and reagents given a target molecule is key to accelerating retrosynthesis approaches. Here the authors demonstrate that applying augmentation techniques to the SMILES representation of target data significantly improves the quality of the reaction predictions.
Atomistic simulations of phosphorus represent a challenge due to the element’s highly diverse allotropic structures. Here the authors propose a general-purpose machine-learning force field for elemental phosphorus, which can describe a broad range of relevant bulk and nanostructured allotropes.
Machine learning models that fall short for certain screening tasks can still provide valuable predictions in specific sub-domains of the considered materials. Here, the authors introduce a diagnostic tool to detect regions of low expected model error, as demonstrated for the case of transparent conducting oxides.
At present there are databases with over 500,000 predicted or synthesized MOF structures, yet a method to establish whether a new material adds new information does not exist. Here the authors propose a machine-learning based approach to quantify the structural and chemical diversity in common MOF databases.
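As an illustration of the kind of metric involved (a generic diversity measure, not the specific one developed in the paper), the structural or chemical diversity of a database can be summarized as the mean pairwise distance between descriptor vectors of its entries:

```python
import itertools
import math

def diversity(descriptors):
    """Mean pairwise Euclidean distance between structure descriptors.

    A larger value indicates a more diverse set; identical entries give 0.
    This is a generic diversity measure for illustration only.
    """
    pairs = list(itertools.combinations(descriptors, 2))
    if not pairs:
        return 0.0
    return sum(math.dist(a, b) for a, b in pairs) / len(pairs)

# Toy descriptor vectors (e.g. pore size, surface area, metal fraction);
# the values are purely illustrative, not real MOF data.
mofs = [(1.0, 0.2, 0.5), (1.1, 0.25, 0.5), (3.0, 0.9, 0.1)]
score = diversity(mofs)
```

Under such a metric, a candidate structure "adds new information" to a database when including it increases the diversity score more than a typical existing entry would.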
Extracting experimental operations for chemical synthesis from procedures reported in prose is a tedious task. Here the authors develop a deep-learning model based on the transformer architecture to translate experimental procedures from the field of organic chemistry into synthesis actions.
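The target representation can be pictured as structured action tuples extracted from prose. A toy rule-based extractor makes the task concrete (the rules, action names and fields here are illustrative assumptions; the paper's transformer model learns this mapping from data rather than using hand-written rules):

```python
import re

def extract_action(sentence):
    """Map one procedure sentence to an illustrative synthesis-action tuple."""
    m = re.match(r"Add (\d+(?:\.\d+)?\s*\w+) of (.+?)\.?$", sentence)
    if m:
        return ("ADD", m.group(2), m.group(1))
    m = re.match(r"Stir for (\d+\s*\w+)\.?$", sentence)
    if m:
        return ("STIR", m.group(1))
    return ("OTHER", sentence)

print(extract_action("Add 5 mL of water."))  # ('ADD', 'water', '5 mL')
print(extract_action("Stir for 30 min."))    # ('STIR', '30 min')
```

Hand-written rules like these break down quickly on the variability of real procedure text, which is what motivates a learned sequence-to-sequence model.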
Increasing the non-locality of the exchange-correlation functional in DFT comes at a steep increase in computational cost. Here, the authors develop NeuralXC, a supervised machine learning approach that generates density functionals close to coupled-cluster accuracy while remaining computationally efficient.
The choice of molecular representation can severely impact the performance of machine-learning methods. Here the authors demonstrate a persistent-homology-based molecular representation through an active-learning approach for predicting CO2/N2 interaction energies at the density functional theory (DFT) level.
Exploring nucleation processes of gallium by molecular simulation is extremely challenging due to its structural complexity. Here the authors demonstrate a neural network potential trained on multithermal–multibaric DFT data for the study of the phase diagram of gallium over a wide temperature and pressure range.
Despite the importance of neural-network quantum states, representing fermionic matter is yet to be fully achieved. Here the authors map fermionic degrees of freedom to spin ones and use neural networks to perform electronic structure calculations on model diatomic molecules, achieving chemical accuracy.
Bond dissociation enthalpies are key quantities in determining chemical reactivity, and their computation with quantum mechanical methods is highly demanding. Here the authors develop a machine learning approach to calculate accurate dissociation enthalpies for organic molecules at sub-second computational cost.
Machine learning models can accurately predict atomistic chemical properties but do not provide access to the molecular electronic structure. Here the authors use a deep learning approach to predict the quantum mechanical wavefunction at high efficiency from which other ground-state properties can be derived.
Computational modelling of chemical systems requires a balance between accuracy and computational cost. Here the authors use transfer learning to develop a general purpose neural network potential that approaches quantum-chemical accuracy for reaction thermochemistry, isomerization, and drug-like molecular torsions.
Understanding local dynamical processes in materials is challenging due to the complexity of the local atomic environments. Here the authors propose a graph dynamical networks approach that is shown to learn the atomic scale dynamics in arbitrary phases and environments from molecular dynamics simulations.
Traditional machine learning potentials suffer from poor transferability to unknown structures. Here the authors present an approach to improve the transferability of machine-learning potentials by including information on the physical nature of interatomic bonding.
A computationally efficient description of ice-water systems at the mesoscopic scale is challenging due to system size and timescale limitations. Here the authors develop a machine-learned coarse-grained water model to elucidate the ice nucleation process much more efficiently than previous models.
Simultaneously accurate and efficient prediction of molecular properties relies on combined quantum mechanics and machine learning approaches. Here the authors develop a flexible machine-learning force-field with high-level accuracy for molecular dynamics simulations.
Machine learning allows electronic structure calculations to access larger system sizes and, in dynamical simulations, longer time scales. Here, the authors perform such a simulation using a machine-learned density functional that avoids direct solution of the Kohn-Sham equations.