Inspired by the human brain, neuromorphic computing technologies have made important breakthroughs in recent years as alternatives that overcome the power and latency shortfalls of traditional digital computing. An interdisciplinary approach is being taken to create more efficient and intelligent computing systems that can perform diverse tasks, to design hardware of increasing complexity from the single-device to the system-architecture level, and to develop new theories and brain-inspired algorithms for future computing.
Edge and High-Performance Computing, Bio-Signal Processing and Brain-Computer Interface
We welcome submissions of primary research that fall into any of the above-mentioned categories. All submissions will be subject to the same peer-review process and editorial standards as regular Nature Communications articles.
Combinatorial optimization problems can be solved on parallel hardware known as Ising machines, and most studies have focused on second-order implementations. The authors show that higher-order Ising machines realized with coupled-oscillator networks can be more resource-efficient than second-order machines and provide superior solutions for constraint satisfaction problems.
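To make the second-order versus higher-order distinction concrete, the toy sketch below (not the authors' oscillator hardware) writes down both energy forms and minimizes a small instance with naive greedy spin flips; all problem sizes and couplings are illustrative.

```python
import itertools

def second_order_energy(spins, J, h):
    """Standard Ising energy: E = -sum_{i<j} J[i][j] s_i s_j - sum_i h[i] s_i."""
    n = len(spins)
    E = -sum(h[i] * spins[i] for i in range(n))
    for i, j in itertools.combinations(range(n), 2):
        E -= J[i][j] * spins[i] * spins[j]
    return E

def higher_order_energy(spins, terms):
    """terms: list of (coefficient, index_tuple); index tuples may have any
    length, so three-spin and higher interactions are expressed directly
    instead of being reduced to pairwise couplings with auxiliary spins."""
    E = 0.0
    for coeff, idx in terms:
        prod = 1
        for i in idx:
            prod *= spins[i]
        E -= coeff * prod
    return E

def greedy_descent(spins, energy_fn, sweeps=50):
    """Flip spins one at a time, keeping only flips that lower the energy.
    A stand-in for the hardware's analog dynamics, for illustration only."""
    spins = list(spins)
    for _ in range(sweeps):
        for i in range(len(spins)):
            e0 = energy_fn(spins)
            spins[i] *= -1
            if energy_fn(spins) > e0:
                spins[i] *= -1  # revert uphill flip
    return spins
```

A two-spin ferromagnetic pair, for example, is driven by `greedy_descent` to an aligned state with energy −1.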
Visual oddity tasks probe the visual analytic intelligence of humans and remain challenging for artificial neural networks. The authors propose here a model with biologically inspired neural dynamics and synthetic saccadic eye movements that improves efficiency and accuracy in solving visual oddity tasks.
Inspired by human analogical reasoning in cognitive science, the authors propose an approach combining deep learning systems with an analogical reasoning mechanism, to detect abstract similarity in real-world images without intensive training in reasoning tasks.
The biological plausibility of backpropagation and its relationship with synaptic plasticity remain open questions. The authors propose a meta-learning approach to discover interpretable plasticity rules to train neural networks under biological constraints. The meta-learned rules boost the learning efficiency via bio-inspired synaptic plasticity.
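As a rough illustration of the kind of local rule such a meta-learning search ranges over, the sketch below parameterizes a generic Hebbian-style update; the coefficients stand in for hypothetical meta-learned parameters and are not the rules discovered in the paper.

```python
import numpy as np

def plasticity_update(w, pre, post, eta=0.1, a=1.0, b=0.0, c=0.0, d=0.0):
    """One local weight update from pre- and postsynaptic activity:
    dw = eta * (a * post x pre + b * pre + c * post + d).
    w has shape (n_post, n_pre); (eta, a, b, c, d) are the hypothetical
    meta-learned coefficients an outer optimization loop would tune."""
    return w + eta * (a * np.outer(post, pre) + b * pre + c * post[:, None] + d)
```

With the default coefficients this reduces to plain Hebbian learning: only the synapse whose pre- and postsynaptic neurons are both active is strengthened.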
Biologically inspired spiking neural networks are highly promising but remain simplified, omitting relevant biological details. The authors introduce here theoretical and numerical frameworks for incorporating dendritic features into spiking neural networks to improve their flexibility and performance.
Muscle electrophysiology is a promising tool for human-machine interfaces in medicine and beyond clinical applications. The authors propose here a model that simulates the electrical signals produced during human movements and use these data to train deep learning algorithms.
Hybrid neural networks combine the computational advantages and biological motivation of spiking and artificial neural networks. The authors propose a design framework with hybrid units that improves the flexibility and efficiency of hybrid neural networks and modulates hybrid information flows.
Reservoir computing has demonstrated high-level performance; however, efficient hardware implementations demand an architecture with minimal system complexity. The authors propose a rotating-neuron architecture for physically implementing an all-analog, resource-efficient reservoir computing system.
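For readers unfamiliar with the paradigm, the sketch below implements a conventional software echo-state network (not the rotating-neuron hardware itself): a fixed random reservoir whose state evolves nonlinearly, with only a linear readout trained, here on a toy next-step sine-prediction task. All sizes and scalings are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random input and recurrent weights; neither is ever trained.
n_in, n_res = 1, 100
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with input sequence u; collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x.copy())
    return np.array(states)

# Toy task: predict the next sample of a sine wave.
t = np.linspace(0, 8 * np.pi, 500)
u, y = np.sin(t[:-1]), np.sin(t[1:])
X = run_reservoir(u)
# Ridge-regression readout: the only trained component.
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
pred = X @ W_out
```

Because training touches only the readout, the expensive recurrent part can be delegated to physical dynamics, which is why minimizing the complexity of that analog substrate matters for hardware implementations.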
Artificial neural networks are known to perform well on recently learned tasks while forgetting previously learned ones. The authors propose an unsupervised sleep-replay algorithm to recover the synaptic connectivity of old tasks that may have been damaged during training on new tasks.
Tasks involving continual learning and adaptation to real-time scenarios remain challenging for artificial neural networks, in contrast to the biological brain. The authors propose here a brain-inspired optimizer based on mechanisms of synaptic integration and strength regulation that improves the performance of both artificial and spiking neural networks.
Based on fundamental thermodynamics, traditional electronic computers, which operate serially, require more energy per computation the faster they operate. Here, the authors show that the energy cost per operation of a parallel computer can be kept very small.
Brain-inspired neural generative models can be designed to learn complex probability distributions from data. Here the authors propose a neural generative computational framework, inspired by the theory of predictive processing in the brain, that facilitates parallel computing for complex tasks.
Self-organizing maps are unsupervised learning tools for data mining in big-data problems. The authors experimentally demonstrate a memristor-based self-organizing map that is more efficient in computing speed and energy consumption for data clustering, image processing, and solving optimization problems.
Deep learning techniques usually require a large quantity of training data and can be challenging to apply when data are scarce. The authors propose a framework that combines contrastive and transfer learning and reduces the data required for training while maintaining prediction accuracy.
The dynamics of neural circuits underlying brain functions such as sensory processing and decision making can be characterized by probabilistic representations and inference. The authors elaborate on the role of spatiotemporal neural dynamics in performing probabilistic computations more efficiently.
A better understanding of the trade-off between the speed and accuracy of decision-making is relevant for mapping biological intelligence onto machines. The authors introduce a brain-inspired learning algorithm that uncovers dependencies between individual fMRI networks and features of neural activity and predicts inter-individual differences in decision-making.
Modelling human-like behaviours is one of the challenges in the field of artificial intelligence. Inspired by experimental studies of cultural evolution, the authors propose a reinforcement learning approach to generate agents capable of real-time third-person imitation.
Brain-inspired spiking neural networks have shown their capability for effective learning; however, current models may not capture the realistic heterogeneity present in the brain. The authors propose a neuron model with temporal dendritic heterogeneity for improved neuromorphic computing applications.
Brain connectivity patterns shape the computational capacity of biological neural networks; however, mapping empirically measured connectivity onto artificial networks remains challenging. The authors present a toolbox for implementing biological neural networks as artificial reservoir networks. The toolbox supports a variety of empirically measured connectomes and is equipped with various dynamical systems and cognitive tasks.
The task of planning a sequence of actions, and dynamically adjusting the plan in response to unforeseen circumstances, remains challenging for artificial intelligence frameworks. The authors introduce a learning approach inspired by cognitive functions that demonstrates high flexibility and generalization capability in planning tasks and is suitable for on-chip learning.