Neural architecture search for computational genomics
Applying deep learning models requires the tuning of network architectures for optimum performance, which can require substantial machine learning expertise. In this issue, Zijun Zhang et al. present a fully automated framework, AMBER, to design and apply convolutional neural networks for genomic sequences using neural architecture search. In an accompanying News & Views, Yi Zhang, Yang Liu and X. Shirley Liu discuss the AMBER technique and its potential to improve deep learning models in genomics.
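AMBER itself trains a reinforcement-learning controller to propose architectures; as a minimal illustration of the search loop such a framework automates, here is a random-search sketch over a toy space of 1D-CNN hyperparameters for one-hot genomic sequences. The search space, candidate budget and scoring stub are all invented for illustration and are not the published method.

```python
import random

random.seed(0)

# Hypothetical search space for a 1D-CNN over one-hot DNA sequences.
SEARCH_SPACE = {
    "n_filters": [16, 32, 64],
    "kernel_size": [5, 9, 15],
    "pooling": ["max", "avg"],
    "n_layers": [1, 2, 3],
}

def sample_architecture():
    """Draw one candidate architecture uniformly from the space."""
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def evaluate(arch):
    # Placeholder for training the candidate CNN and returning its
    # validation score; deterministic per architecture so the loop is
    # reproducible. A real system learns which candidates to propose
    # rather than scoring at random.
    rng = random.Random(str(sorted(arch.items())))
    return rng.random()

best_arch, best_score = None, -1.0
for _ in range(20):
    arch = sample_architecture()
    score = evaluate(arch)
    if score > best_score:
        best_arch, best_score = arch, score
```

Swapping `evaluate` for real model training (and random sampling for a learned controller) is what separates this baseline from a full neural architecture search system.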
A white paper from the Partnership on AI provides timely advice on the urgent challenge of navigating the risks of AI research and ensuring responsible publication.
We spoke with Mariarosaria Taddeo, an associate professor and senior research fellow at the Oxford Internet Institute and Dstl Ethics Fellow at the Alan Turing Institute, who works on digital and AI ethics, about two recent reports from the UK and the US on the use of AI in national defence and security.
Deep learning applied to genomics can learn patterns in biological sequences, but designing such models requires expertise and effort. Recent work demonstrates the efficiency of a neural network architecture search algorithm in optimizing genomic models.
A challenge for multiscale simulations is how to link the macroscopic and microscopic length scales effectively. A new machine-learning-based sampling approach enables full exploration of macro configurations while retaining the precision of a microscale model.
State-of-the-art neural network approaches enable massive multilingual translation. How close are we to universal translation between any spoken, written or signed language?
Modern machine learning approaches, such as deep neural networks, generalize well despite interpolating noisy data, in contrast with textbook wisdom. Mitra describes the phenomenon of statistically consistent interpolation (SCI) to clarify why data interpolation succeeds, and discusses how SCI elucidates the differing approaches to modelling natural phenomena represented in modern machine learning, traditional physical theory and biological brains.
At present, deep learning models in genomics are manually tuned through trial and error, which is time consuming and imposes a barrier for biomedical researchers not trained in machine learning. The authors develop an automated framework to design and apply convolutional neural networks for genomic sequences.
Tackling scientific problems often requires computational models that bridge several spatial and temporal scales. A new simulation framework employing machine learning, which is scalable and can be used on standard laptops as well as supercomputers, promises exhaustive multiscale explorations.
The morphology of a robot determines how efficiently it can traverse different terrain. Nygaard and colleagues present a robot that adapts its morphology when it detects different terrain and learns which configuration is most effective.
Increased resolution in mass spectrometry can provide better data for sequencing, but also increases the computational complexity of analysing the data. Qiao and colleagues present a neural-network-based method that processes sequencing data of any resolution while improving the accuracy of the predicted sequences.
Identifying salient input features can be a challenge in neural networks. The authors developed a variable selection procedure with false discovery rate control that works on classification or regression problems, one or multiple output neurons, and deep or shallow neural networks.
Predicting what comes next in a previously unseen time series of input data is a challenging task for machine learning. A novel unsupervised learning scheme termed predictive principal component analysis can extract the most informative components for predicting future inputs with low computational cost.
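One way to make the idea of extracting predictive components concrete: find directions in a window of past inputs that covary most strongly with the following window. The numpy sketch below does this via the singular vectors of the past–future cross-covariance matrix; the synthetic data, window length and number of components are invented for illustration, and the published scheme may differ in detail.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic time series: a slow latent random walk plus observation noise.
T = 2000
latent = np.cumsum(rng.normal(size=T)) * 0.1
x = latent + rng.normal(scale=0.5, size=T)

# Build matching past and future windows of length w.
w = 10
past = np.stack([x[t - w:t] for t in range(w, T - w)])
future = np.stack([x[t:t + w] for t in range(w, T - w)])

# Cross-covariance between centred past and future windows.
past_c = past - past.mean(axis=0)
future_c = future - future.mean(axis=0)
C = past_c.T @ future_c / len(past_c)

# Top left singular vectors of C are the directions in past space that
# carry the most (linear) information about the future.
U, s, Vt = np.linalg.svd(C)
k = 2
components = U[:, :k]      # predictive components of the past window
z = past_c @ components    # low-dimensional predictive features
```

The cost is one SVD of a small w-by-w matrix, which is what makes this kind of extraction cheap compared with training a full predictive model.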
Rechargeable lithium-ion batteries play a crucial role in many modern-day applications, including portable electronics and electric vehicles, but they degrade over time. To ensure safe operation, a battery’s ‘state of health’ should be monitored in real time, and this machine learning pipeline, tested on a variety of charging conditions, can provide such an online estimation of battery state of health.
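The shape of such a pipeline can be sketched in a few lines: extract a scalar feature from each charging curve, fit a regression from feature to state of health (SoH), then apply it online to new cycles. The feature ("time spent in a fixed voltage window"), the synthetic data and the linear model below are all illustrative assumptions, not the authors' pipeline, which handles a variety of charging conditions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical training data: one charging-curve feature per cycle
# (e.g. time spent in a fixed voltage window, in hours) and the measured
# capacity ratio used as the SoH label. All values are synthetic.
n = 200
charge_time = np.linspace(1.0, 1.6, n) + rng.normal(scale=0.02, size=n)
soh = 1.0 - 0.5 * (charge_time - 1.0) + rng.normal(scale=0.01, size=n)

# Fit SoH ~ a * charge_time + b by ordinary least squares.
A = np.column_stack([charge_time, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, soh, rcond=None)

def estimate_soh(t):
    """Online SoH estimate from a newly observed charging-time feature."""
    return coef[0] * t + coef[1]
```

A fresh cell (feature near 1.0 h) should score an SoH near 1.0, with the estimate falling as charging slows; a production estimator would add more features and a nonlinear model, but the online-inference step stays this cheap.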