We present a special issue focusing on recent advances in computation- and theory-driven approaches to neuroscience that inform a host of biophysical and mechanistic models.
Focus on neural computation and theory
Deep understanding in neuroscience comes from the complementary embrace of theory and experiment. This has long been the case and will only become more so. The ability to record from increasingly large neural populations in parallel has been one of the most exciting developments in experimental neuroscience. However, these growing datasets demand increasingly sophisticated computational techniques and theoretical frameworks to inform our understanding. Nature Neuroscience is pleased to present a special issue focused on recent advances in computational and theoretical neuroscience that highlight the current thinking and unanswered questions on topics that include neural networks and coding, memory formation, sensory perception and decision-making, and psychiatric illness.
Theoretical approaches have long shaped neuroscience, but the need for theory is now greater than ever, and the prospects for advancement are bright. Advances in measuring and manipulating neurons demand new models and analyses to guide interpretation. Advances in theoretical neuroscience offer new insights into how signals evolve across brain areas and new approaches for connecting population activity with behavior. Together, these advances point toward a global understanding of brain function built from a hybrid of diverse approaches.
The networks used by computer scientists and by modelers in neuroscience typically treat unit activities as continuous variables. Neurons, however, communicate primarily through discrete spikes. This Perspective offers a unifying view of current methods for extending the construction of functional networks from continuous-variable models to more realistic spiking network models.
Recent computational neuroscience developments have used deep neural networks to model neural responses in higher visual areas. This Perspective describes key algorithmic underpinnings in computer vision and artificial intelligence that have contributed to this progress and outlines how deep networks could drive future improvements in understanding sensory cortical processing.
The authors use recent probabilistic theories of neural computation to argue that confidence and certainty are not identical concepts. They propose precise mathematical definitions for both of these concepts and discuss putative neural representations.
Despite representing a minority of cortical cells, inhibitory neurons profoundly shape cortical responses. Inhibitory currents closely track excitatory currents, opening only brief windows of opportunity for a neuron to fire. This balance explains the variability of cortical spike trains but may also, paradoxically, render a spiking network maximally efficient and precise.
The state of the nervous system shifts constantly. Most studies focus on how state determines the average neural response, paying little attention to trial-to-trial fluctuations of brain activity. We review recent theoretical advances in modeling the physiological mechanisms responsible for state-dependent modulation of correlated fluctuations in neuronal populations.
What are the challenges associated with storing information over time in the brain? Here the authors explore the computational principles by which biological memory might be built. They develop a high-level view of shared problems and themes in short- and long-term memory and highlight questions for future research.
The complexity of problems and data in psychiatry requires powerful computational approaches. Computational psychiatry is an emerging field encompassing mechanistic theory-driven models and theoretically agnostic data-driven analyses that use machine-learning techniques. Clinical applications will benefit from relating theoretically meaningful process variables to complex psychiatric outcomes through data-driven techniques.