As deep neural networks are pushed towards larger and more complex architectures, they require significant computational resources and are challenging to deploy in real-time applications. To reduce complexity, neural networks can be compressed without a substantial decrease in model accuracy by methods such as pruning or quantization; the latter uses fewer bits to represent weights and biases. In a paper in this issue, Coelho Jr. et al. develop a method for producing quantized versions of deep neural network models for fully automated deployment on an FPGA chip, with high accuracy, low energy consumption and nanosecond-scale inference, as required for the real-time event-selection procedure in proton–proton collisions at the CERN Large Hadron Collider.
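The core idea of quantization, representing weights with fewer bits, can be sketched with a simple uniform scheme. This is a generic illustration (the function name and parameters are hypothetical), not the method of Coelho Jr. et al.:

```python
import numpy as np

def quantize_uniform(weights, n_bits):
    """Uniformly quantize an array of weights to n_bits.

    A minimal sketch of the idea described above: each weight is
    snapped to the nearest of 2**n_bits evenly spaced levels
    spanning the range of the array.
    """
    levels = 2 ** n_bits
    w_min, w_max = weights.min(), weights.max()
    scale = (w_max - w_min) / (levels - 1)
    # Map each weight to the nearest level index, then back to a value.
    q = np.round((weights - w_min) / scale)
    return q * scale + w_min

w = np.random.randn(4, 4).astype(np.float32)
w4 = quantize_uniform(w, 4)  # at most 16 distinct values remain
```

The rounding error of each weight is bounded by half the step size, which is the trade-off that quantization-aware methods tune per layer.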
Health disparities need to be addressed so that the benefits of medical progress are not limited to selected groups. Big data and machine learning approaches are transformative tools for public and population health, but their use must be guided by ongoing insights from algorithmic fairness research.
Selecting interesting proton–proton collisions from the millions taking place each second in the Large Hadron Collider is a challenging task. A neural network optimized for field-programmable gate array hardware enables 60 ns inference and reduces power consumption by a factor of 50.
Finding the optimum design of a complex auction is a challenging and important economic problem. Multi-agent deep learning can help find equilibria by making use of inherent symmetries in bidding strategies.
Algorithmic solutions to improve treatment are starting to transform health care. Mhasawade and colleagues discuss in this Perspective how machine learning applications in population and public health can extend beyond clinical practice. While working with general health data comes with its own challenges, most notably ensuring algorithmic fairness in the face of existing health disparities, the area provides new kinds of data and questions for the machine learning community.
In the past few years, AI approaches have been used to enhance Earth and climate modelling. This Perspective examines the opportunity to go further, and build from scratch hybrid systems that integrate AI tools and models based on physical process knowledge to make more efficient use of daily observational data streams.
With edge computing on custom hardware, real-time inference with deep neural networks can reach the nanosecond timescale. An important application in this regime is event processing at particle collision detectors like those at the Large Hadron Collider (LHC). To ensure high performance as well as reduced resource consumption, a method is developed, and made available as an extension of the Keras library, to automatically design optimal quantization of the different layers in a deep neural network.
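The automatic per-layer design mentioned above can be illustrated by a greedy mixed-precision search: for each layer, pick the smallest bit width that keeps a quality metric above a target. This is a hypothetical sketch (the function, the toy evaluation and all names are invented for illustration), not the actual algorithm in the Keras extension:

```python
def per_layer_bit_search(layers, evaluate, target_acc, bit_options=(8, 6, 4, 2)):
    """Greedy per-layer bit-width search (illustrative sketch).

    `layers` is a list of layer names; `evaluate` scores a candidate
    assignment of bit widths (a dict layer -> bits). For each layer,
    the smallest bit width that keeps the score above target_acc is kept.
    """
    config = {name: max(bit_options) for name in layers}
    for name in layers:
        for bits in sorted(bit_options):  # try the smallest width first
            trial = dict(config, **{name: bits})
            if evaluate(trial) >= target_acc:
                config[name] = bits
                break
    return config

# Toy accuracy proxy: score degrades linearly as bits shrink below 8.
layers = ["conv1", "dense1", "dense2"]
def toy_eval(cfg):
    return 0.9 - sum(0.02 * (8 - b) for b in cfg.values()) / len(cfg)

cfg = per_layer_bit_search(layers, toy_eval, target_acc=0.85)
```

In a real workflow the evaluation would retrain or fine-tune the quantized model, which is what makes automating the search valuable.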
Auction games present an interesting challenge for multi-agent learning. Finding the Bayes–Nash equilibria that characterize optimal bidding strategies is intractable for standard numerical approaches. In a new deep learning approach, strategies are represented as neural networks, and policy iteration based on gradient dynamics in self-play enables learning of local equilibria.
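The gradient-dynamics idea can be seen in a toy setting with a known answer: a two-bidder first-price auction with values uniform on [0, 1], where the symmetric Bayes–Nash equilibrium bid is half the value. Here strategies are restricted to a single slope parameter rather than a neural network, and each bidder ascends a finite-difference gradient of its own expected utility against the current opponent; this is an illustration of the dynamics, not the paper's method:

```python
def expected_utility(a, opp):
    """Expected utility of bidding b(v) = a*v against an opponent bidding
    b(v) = opp*v, two bidders, values uniform on [0, 1] (closed form
    for this toy setting, derived by integrating over both values)."""
    if a <= opp:
        return a * (1 - a) / (3 * opp)
    return (1 - a) * (0.5 - opp**2 / (6 * a**2))

# Self-play gradient dynamics: both bidders share the slope `alpha`,
# and it is updated along each bidder's own utility gradient.
alpha, lr, eps = 0.9, 0.5, 1e-4
for _ in range(200):
    grad = (expected_utility(alpha + eps, alpha)
            - expected_utility(alpha - eps, alpha)) / (2 * eps)
    alpha += lr * grad
# alpha approaches the known equilibrium slope of 1/2
```

In the paper the same self-play principle operates on neural-network strategies, where no closed-form utility or known equilibrium is available.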
The response of the body to drugs follows complex dynamical processes that can be difficult to predict. Lu and colleagues combine a neural network approach with pharmacokinetic/pharmacodynamic modelling to learn these complex dynamics.
Single-cell RNA sequencing efforts have made large amounts of data available for transcriptomics research. Simon and colleagues develop a neural network embedding approach that avoids batch effects, such that it can rapidly and efficiently integrate large datasets from different studies.
Methods are available to support clinical decisions regarding adjuvant therapies in breast cancer, but they have limitations in accuracy, generalizability and interpretability. Alaa et al. present an automated machine learning model of breast cancer that predicts patient survival and adjuvant treatment benefit to guide personalized therapeutic decisions.
Molecular simulations informed by experimental data can provide detailed knowledge of complex biomolecular structure. However, it is a challenging task to weight experimental information with respect to the underlying model. A self-adapting type of dynamic particle swarm optimization can tackle the parameter selection problem, which is demonstrated on small-angle X-ray scattering-guided protein simulations.
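Particle swarm optimization itself is a simple population-based search; the study uses a self-adapting dynamic variant, but the plain global-best form below conveys the mechanism (all names and coefficient values here are generic textbook choices, not the paper's):

```python
import random

def pso(f, dim=2, n_particles=20, iters=200, seed=0):
    """Minimal global-best particle swarm optimization sketch.

    Each particle tracks its personal best position; all particles are
    also attracted towards the best position found by the swarm.
    """
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, val = pso(lambda x: sum(xi * xi for xi in x))  # minimize the sphere function
```

In the parameter-selection problem of the paper, the objective would instead score how well a weighting of the experimental data fits the simulation, and the swarm dynamics adapt over time.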
A new international competition aims to speed up the development of AI models that can assist radiologists in detecting suspicious lesions from hundreds of millions of pixels in 3D mammograms. The top three winning teams compare notes.