Abstract
Here we consider the possibility that a fundamental function of sensory cortex is the generation of an internal simulation of the sensory environment in real time. A logical elaboration of this idea leads to a dynamical neural architecture that oscillates between two fundamental network states, one driven by external input and the other by recurrent synaptic drive in the absence of sensory input. Synaptic strength is modified by a proposed synaptic state matching (SSM) process that ensures equivalence of spike statistics between the two network states. Remarkably, SSM, operating locally at individual synapses, generates accurate and stable network-level predictive internal representations, enabling pattern completion and unsupervised feature detection from noisy sensory input. SSM is a biologically plausible substrate for learning and memory because it brings together sequence learning, feature detection, synaptic homeostasis, and network oscillations under a single parsimonious computational framework. Beyond its utility as a potential model of cortical computation, artificial networks based on this principle have a striking capacity for internalizing dynamical systems, making them useful in a variety of application domains including time-series prediction and machine intelligence.
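The abstract describes the SSM loop only at a conceptual level, so the following is a minimal, hedged sketch of one possible instantiation rather than the author's implementation: a rate-based recurrent network alternates between an externally driven phase and a free-running phase, and each synapse is nudged until the co-activity statistics it sees in the two phases agree (an update reminiscent of contrastive Hebbian learning). The network size, the sigmoidal rate units, the use of time-averaged co-activity as the matched statistic, and the learning rate are all illustrative assumptions.

```python
# Schematic sketch of synaptic state matching (SSM) as described in the abstract.
# All modeling choices below (rate units, co-activity as the matched statistic,
# learning rate) are assumptions made for illustration only.
import numpy as np

rng = np.random.default_rng(0)
N, T, eta = 50, 200, 0.01          # units, steps per phase, learning rate
W = rng.normal(0, 0.1, (N, N))     # recurrent weights
np.fill_diagonal(W, 0.0)

def run_phase(W, external=None, steps=T):
    """Run the network for one phase and return time-averaged co-activity per synapse."""
    r = rng.random(N)                          # initial rates in [0, 1]
    coact = np.zeros_like(W)
    for t in range(steps):
        drive = W @ r
        if external is not None:
            drive = drive + external[t]        # sensory input in the driven phase
        r = 1.0 / (1.0 + np.exp(-drive))       # saturating rate nonlinearity
        coact += np.outer(r, r)
    return coact / steps

for epoch in range(100):
    # A noisy, temporally structured stimulus (assumed form).
    stimulus = np.sin(np.linspace(0, 8 * np.pi, T))[:, None] * rng.normal(0, 1, N)
    driven = run_phase(W, external=stimulus)   # externally driven state
    free = run_phase(W, external=None)         # recurrently driven state
    # Synaptic state matching step (assumed form): move each weight so that
    # free-running co-activity statistics approach the driven-phase statistics.
    W += eta * (driven - free)
    np.fill_diagonal(W, 0.0)
```

After many such alternations, the recurrent drive alone tends to reproduce the statistics imposed by the input, which is the sense in which the network forms a predictive internal representation in this toy setting.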
About this article
Cite this article
Tavazoie, S. Synaptic state matching: a dynamical architecture for predictive internal representation and feature perception. Nat Prec (2011). https://doi.org/10.1038/npre.2011.6282.1
Keywords
- learning and memory
- sequence learning
- neural networks
- cortex
- synaptic plasticity
- synaptic homeostasis
- feature detection
- pattern completion