Abstract
We describe a new learning procedure, back-propagation, for networks of neurone-like units. The procedure repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector. As a result of the weight adjustments, internal ‘hidden’ units which are not part of the input or output come to represent important features of the task domain, and the regularities in the task are captured by the interactions of these units. The ability to create useful new features distinguishes back-propagation from earlier, simpler methods such as the perceptron-convergence procedure¹.
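The sketch below is a minimal illustration of the gradient-descent procedure the abstract describes, assuming a single hidden layer, sigmoid units, and a squared-error measure; the function and variable names are our own and are not taken from the paper.

```python
# Minimal sketch of back-propagation for a one-hidden-layer network,
# assuming sigmoid units and a squared-error measure (illustrative only).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_step(x, target, W1, W2, lr=0.1):
    # Forward pass: input -> hidden -> output.
    h = sigmoid(W1 @ x)                # hidden unit activations
    y = sigmoid(W2 @ h)                # actual output vector

    # Difference between actual and desired output vectors.
    error = y - target

    # Backward pass: propagate error derivatives through the network.
    delta_out = error * y * (1.0 - y)                # at output units
    delta_hid = (W2.T @ delta_out) * h * (1.0 - h)   # at hidden units

    # Adjust weights in proportion to the negative error gradient.
    W2 -= lr * np.outer(delta_out, h)
    W1 -= lr * np.outer(delta_hid, x)
    return 0.5 * np.sum(error ** 2)

# Example usage with arbitrary dimensions.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((3, 2)) * 0.5
W2 = rng.standard_normal((1, 3)) * 0.5
loss = train_step(np.array([0.0, 1.0]), np.array([1.0]), W1, W2)
```

Repeating such updates over a training set drives the error measure down, and the hidden-unit weights come to encode task features, as the abstract notes.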
References
1. Rosenblatt, F. Principles of Neurodynamics (Spartan, Washington, DC, 1961).
2. Minsky, M. L. & Papert, S. Perceptrons (MIT, Cambridge, 1969).
3. Le Cun, Y. Proc. Cognitiva 85, 599–604 (1985).
4. Rumelhart, D. E., Hinton, G. E. & Williams, R. J. in Parallel Distributed Processing: Explorations in the Microstructure of Cognition. Vol. 1: Foundations (eds Rumelhart, D. E. & McClelland, J. L.) 318–362 (MIT, Cambridge, 1986).
Cite this article
Rumelhart, D., Hinton, G. & Williams, R. Learning representations by back-propagating errors. Nature 323, 533–536 (1986). https://doi.org/10.1038/323533a0