The ethical use of publicly available datasets with human data for which consent has not been explicitly given needs urgent attention from researchers, funders, research institutes and publishers. A specific challenging case is research involving hacked data and this Perspective discusses whether and under what conditions it is morally and ethically justified to conduct such research.
Incorporating prior knowledge in deep learning models can overcome the difficulties of supervised learning, including the need for large amounts of annotated data. An approach in this area, called deep reasoning networks, is applied to the complex task of mapping crystal structures from X-ray diffraction data for multi-element oxides, identifying 13 phases from 307 X-ray diffraction patterns in the previously unsolved Bi-Cu-V oxide system.
The regulatory landscape for artificial intelligence (AI) is shaping up on both sides of the Atlantic, urgently awaited by the scientific and industrial community. Commonalities and differences start to crystallize in the approaches to AI in medicine.
Selecting interesting proton–proton collisions from the millions taking place each second in the Large Hadron Collider is a challenging task. A neural network optimized for a field-programmable gate array hardware enables 60 ns inference and reduces power consumption by a factor of 50.
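The article's hardware and latency figures come from a network compiled for an FPGA; as a hedged, much-simplified illustration of why such deployments are feasible, the sketch below quantizes a toy dense layer to 8-bit integers so that inference reduces to integer multiply-accumulate, the operation FPGAs pipeline cheaply. The layer weights and inputs are invented for illustration and have no connection to the trigger system described in the article.

```python
def quantize(weights, bits=8):
    # Symmetric linear quantization: map each float to a signed integer
    # so that multiply-accumulate can run entirely in fixed-point logic.
    qmax = 2 ** (bits - 1) - 1
    scale = max(abs(w) for row in weights for w in row) / qmax
    return [[round(w / scale) for w in row] for row in weights], scale

def int_matvec(q_weights, q_input):
    # Pure integer multiply-accumulate, the kernel an FPGA pipelines.
    return [sum(w * x for w, x in zip(row, q_input)) for row in q_weights]

# Hypothetical toy layer: 3 outputs from 4 inputs (values are made up).
weights = [[0.8, -0.3, 0.1, 0.5],
           [-0.2, 0.9, -0.6, 0.4],
           [0.3, 0.2, 0.4, -0.5]]
inputs = [0.6, -0.1, 0.9, 0.2]

qw, w_scale = quantize(weights)
qx, x_scale = quantize([inputs])     # reuse the helper for the input row
acc = int_matvec(qw, qx[0])          # integer-only arithmetic
approx = [a * w_scale * x_scale for a in acc]   # rescale back to floats
exact = [sum(w * x for w, x in zip(row, inputs)) for row in weights]
```

The rescaled integer result closely tracks the float computation, and in particular preserves which output scores highest, which is all a trigger decision needs.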
Health disparities need to be addressed so that the benefits of medical progress are not limited to selected groups. Big data and machine learning approaches are transformative tools for public and population health, but need ongoing support from insights in algorithmic fairness research.
In the past few years, AI approaches have been used to enhance Earth and climate modelling. This Perspective examines the opportunity to go further, and build from scratch hybrid systems that integrate AI tools and models based on physical process knowledge to make more efficient use of daily observational data streams.
Finding the optimum design of a complex auction is a challenging and important economic problem. Multi-agent deep learning can help find equilibria by making use of inherent symmetries in bidding strategies.
The relationship between brain organization, connectivity and computation is not well understood. The authors construct neuromorphic artificial neural networks endowed with biological connection patterns derived from diffusion-weighted imaging. The neuromorphic networks are trained to perform a memory task, revealing an interaction between network structure and dynamics.
Deep learning has revolutionized image analysis, but applying it to voluminous and multimodal images can be challenging. The authors propose a neural network approach with a modality sampling strategy and an attention module for segmentation in fluorescence microscopy images.
Radiomics has been used to discover imaging signatures that predict therapy response and outcomes, but clinical translation has been slow. Using machine learning methods, the authors report tumour subtypes that are applicable across major imaging modalities and three cancer types. The tumour subtypes have distinct radiological and molecular features, as well as survival outcomes after conventional therapies.
Auction games present an interesting challenge for multi-agent learning. Finding the Bayes–Nash equilibria that describe optimal bidding strategies is intractable for standard numerical approaches. In a new deep learning approach, strategies are represented as neural networks, and policy iteration based on gradient dynamics in self-play enables the learning of local equilibria.
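As a hedged, single-parameter caricature of this idea (the paper uses full neural network strategies), the sketch below runs gradient-based self-play in a symmetric two-bidder first-price auction with uniform values, where the analytic equilibrium is known to be b(v) = v/2. Each iteration estimates the gradient of a bidder's expected utility by finite differences against a frozen copy of its own strategy, mirroring the self-play dynamic; all parameter values here are illustrative choices, not the paper's.

```python
import random

def expected_utility(a_self, a_opp, rng, samples=10000):
    # Monte Carlo estimate of one bidder's expected utility in a
    # two-player first-price auction with independent uniform [0, 1]
    # values, when bids follow linear strategies b(v) = a * v.
    total = 0.0
    for _ in range(samples):
        v, w = rng.random(), rng.random()
        if a_self * v > a_opp * w:        # highest bid wins the item
            total += v - a_self * v       # payoff = value - own bid
    return total / samples

def self_play(iterations=120, lr=0.3, eps=0.02, seed=1):
    rng = random.Random(seed)
    a = 0.9   # start far from the analytic equilibrium b(v) = v / 2
    for _ in range(iterations):
        # Finite-difference gradient of own utility, with the opponent
        # held fixed at the current shared (symmetric) strategy.
        up = expected_utility(a + eps, a, rng)
        down = expected_utility(a - eps, a, rng)
        a += lr * (up - down) / (2 * eps)
        a = min(max(a, 0.05), 1.0)
    return a

a = self_play()   # converges near the analytic equilibrium a = 0.5
```

The same gradient-ascent-in-self-play structure carries over when the scalar a is replaced by neural network parameters, which is where the intractability of the exact Bayes–Nash computation is sidestepped.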
Algorithmic solutions to improve treatment are starting to transform health care. Mhasawade and colleagues discuss in this Perspective how machine learning applications in population and public health can extend beyond clinical practice. While working with general health data comes with its own challenges, most notably ensuring algorithmic fairness in the face of existing health disparities, the area provides new kinds of data and questions for the machine learning community.
Molecular simulations informed by experimental data can provide detailed knowledge of complex biomolecular structure. However, it is a challenging task to weight experimental information with respect to the underlying model. A self-adapting type of dynamic particle swarm optimization can tackle the parameter selection problem, which is demonstrated on small-angle X-ray scattering-guided protein simulations.
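The paper's self-adapting dynamic variant tunes its own hyperparameters as it runs; as a hedged baseline illustration, the sketch below implements plain particle swarm optimization and applies it to a hypothetical stand-in objective, a two-parameter discrepancy between "simulated" and "experimental" observables. The objective, bounds, and swarm settings are invented for illustration.

```python
import random

def pso(objective, dim, bounds, n_particles=30, iters=200, seed=0):
    # Minimal particle swarm optimization: each particle carries a
    # position, a velocity and its personal best; the swarm shares a
    # global best that pulls everyone toward promising regions.
    rng = random.Random(seed)
    lo, hi = bounds
    w, c1, c2 = 0.7, 1.5, 1.5   # inertia, cognitive and social weights
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

def objective(theta):
    # Hypothetical stand-in for the mismatch between simulated and
    # experimental observables; the true optimum sits at (0.3, 0.7).
    return (theta[0] - 0.3) ** 2 + (theta[1] - 0.7) ** 2

best, best_val = pso(objective, dim=2, bounds=(0.0, 1.0))
```

In the paper's setting, the particle positions would encode the weights given to the experimental restraints, and the objective would score agreement with the small-angle X-ray scattering data; the self-adapting variant additionally adjusts w, c1 and c2 during the run.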
The COVID-19 pandemic is not over and the future is uncertain, but there has lately been a semblance of what life was like before. As thoughts turn to the possibility of a summer holiday, we offer suggestions for books and podcasts on AI to refresh the mind.
As highly automated systems become pervasive in society, enforceable governance principles are needed to ensure safe deployment. This Perspective proposes a pragmatic approach where independent audit of AI systems is central. The framework would embody three AAA governance principles: prospective risk Assessments, operation Audit trails and system Adherence to jurisdictional requirements.
Particle image velocimetry is an imaging technique for determining the velocity components of flow fields, with applications in complex engineering problems across environmental, aerospace and biomedical engineering. A recurrent neural network-based approach that learns displacement fields end to end is applied to this technique; it achieves state-of-the-art accuracy, generalizes to new data and eliminates the need for traditional handcrafted models.
A new international competition aims to speed up the development of AI models that can assist radiologists in detecting suspicious lesions from hundreds of millions of pixels in 3D mammograms. The top three winning teams compare notes.
Deep learning-based methods to generate new molecules can require huge amounts of data to train. Skinnider et al. show that models developed for natural language processing work well for generating molecules from small amounts of training data, and identify robust metrics to evaluate the quality of generated molecules.