

Fast and energy-efficient neuromorphic deep learning with first-spike times


For a biological agent operating under environmental pressure, energy consumption and reaction times are of critical importance. Similarly, engineered systems are optimized for short time-to-solution and low energy-to-solution characteristics. At the level of neuronal implementation, this implies achieving the desired results with as few and as early spikes as possible. With time-to-first-spike coding, both of these goals are inherently emerging features of learning. Here, we describe a rigorous derivation of a learning rule for such first-spike times in networks of leaky integrate-and-fire neurons, relying solely on input and output spike times, and show how this mechanism can implement error backpropagation in hierarchical spiking networks. Furthermore, we emulate our framework on the BrainScaleS-2 neuromorphic system and demonstrate its capability of harnessing the system’s speed and energy characteristics. Finally, we examine how our approach generalizes to other neuromorphic platforms by studying how its performance is affected by typical distortive effects induced by neuromorphic substrates.
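To make the coding scheme concrete, the following minimal Python sketch (our own illustration, not the paper's analytical derivation; all neuron parameters and weights are arbitrary assumptions) simulates a current-based leaky integrate-and-fire neuron and reads out the time of its first threshold crossing. Stronger input drives the membrane to threshold earlier, so information is carried by how early the first spike occurs, which is the quantity a first-spike-time learning rule operates on.

```python
def first_spike_time(weights, input_times, tau_m=10.0, tau_s=5.0,
                     v_th=1.0, t_max=50.0, dt=0.01):
    """Simulate a current-based LIF neuron driven by input spikes and
    return the time of its first output spike (None if it stays silent).

    Euler integration of:
        tau_s * dI/dt = -I          (+ w at each input spike)
        tau_m * dV/dt = -V + I
    """
    v, i_syn, t = 0.0, 0.0, 0.0
    spikes = sorted(zip(input_times, weights))
    idx = 0
    while t < t_max:
        # deliver any input spikes that have occurred by this time step
        while idx < len(spikes) and spikes[idx][0] <= t:
            i_syn += spikes[idx][1]
            idx += 1
        v += dt * (-v + i_syn) / tau_m
        i_syn += dt * (-i_syn) / tau_s
        if v >= v_th:  # first threshold crossing -> first output spike
            return t
        t += dt
    return None

# Stronger input reaches threshold sooner: the first-spike time itself
# encodes the input, with "earlier" meaning "stronger evidence".
t_strong = first_spike_time(weights=[3.0, 3.0], input_times=[0.0, 1.0])
t_weak = first_spike_time(weights=[2.2, 2.2], input_times=[0.0, 1.0])
```

With these assumed parameters, both stimuli elicit a spike, but the strongly driven neuron fires first; a weaker stimulus (for example, weights of 1.0) would leave the neuron below threshold entirely.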


Fig. 1: Time-to-first-spike coding and learning.
Fig. 2: Classification of the Yin-Yang dataset.
Fig. 3: Classification of the MNIST dataset.
Fig. 4: Classification on the BrainScaleS-2 neuromorphic platform.
Fig. 5: Effects of substrate imperfections.

Data availability

We used the MNIST dataset66 and the Yin-Yang dataset65. For the latter, see

Code availability

Code for the simulations81 is available at


References

  1. Krizhevsky, A., Sutskever, I. & Hinton, G. E. ImageNet classification with deep convolutional neural networks. In Advances in Neural Information Processing Systems 1097–1105 (NIPS, 2012).

  2. Silver, D. et al. Mastering the game of Go without human knowledge. Nature 550, 354–359 (2017).

  3. Brown, T. B. et al. Language models are few-shot learners. Preprint at (2020).

  4. Brooks, R., Hassabis, D., Bray, D. & Shashua, A. Is the brain a good model for machine intelligence? Nature 482, 462–463 (2012).

  5. Ng, A. What artificial intelligence can and can’t do right now. Harvard Business Review (9 November 2016).

  6. Hassabis, D., Kumaran, D., Summerfield, C. & Botvinick, M. Neuroscience-inspired artificial intelligence. Neuron 95, 245–258 (2017).

  7. Sejnowski, T. J. The Deep Learning Revolution (MIT Press, 2018).

  8. Richards, B. A. et al. A deep learning framework for neuroscience. Nat. Neurosci. 22, 1761–1770 (2019).

  9. Pfeiffer, M. & Pfeil, T. Deep learning with spiking neurons: opportunities and challenges. Front. Neurosci. 12, 774 (2018).

  10. Gerstner, W. What is different with spiking neurons? In Plausible Neural Networks for Biological Modelling. Mathematical Modelling: Theory and Applications Vol 13. (eds Mastebroek, H. A. K. & Vos, J. E.) 23–48 (Springer, 2001).

  11. Izhikevich, E. M. Which model to use for cortical spiking neurons? IEEE Trans. Neural Netw. 15, 1063–1070 (2004).

  12. Gerstner, W. Spiking Neurons (MIT Press, 1998).

  13. Maass, W. Searching for principles of brain computation. Curr. Opin. Behav. Sci. 11, 81–92 (2016).

  14. Davies, M. Benchmarks for progress in neuromorphic computing. Nat. Mach. Intell. 1, 386–388 (2019).

  15. Linnainmaa, S. The Representation of the Cumulative Rounding Error of an Algorithm as a Taylor Expansion of the Local Rounding Errors. Master’s thesis (in Finnish), Univ. Helsinki 6–7 (1970).

  16. Werbos, P. J. Applications of advances in nonlinear sensitivity analysis. In System Modeling and Optimization. Lecture Notes in Control and Information Sciences Vol. 38 (eds Drenick, R. F. & Kozin, F.) 762–770 (Springer, 1982).

  17. Rumelhart, D. E., Hinton, G. E. & Williams, R. J. Learning representations by back-propagating errors. Nature 323, 533–536 (1986).

  18. Tavanaei, A., Ghodrati, M., Kheradpisheh, S. R., Masquelier, T. & Maida, A. Deep learning in spiking neural networks. Neural Netw. 111, 47–63 (2018).

  19. Neftci, E. O., Mostafa, H. & Zenke, F. Surrogate gradient learning in spiking neural networks: bringing the power of gradient-based optimization to spiking neural networks. IEEE Signal Process. Mag. 36, 51–63 (2019).

  20. Gütig, R. & Sompolinsky, H. The tempotron: a neuron that learns spike timing-based decisions. Nat. Neurosci. 9, 420–428 (2006).

  21. Cao, Y., Chen, Y. & Khosla, D. Spiking deep convolutional neural networks for energy-efficient object recognition. Int. J. Comput. Vis. 113, 54–66 (2015).

  22. Diehl, P. U., Zarrella, G., Cassidy, A., Pedroni, B. U. & Neftci, E. Conversion of artificial recurrent neural networks to spiking neural networks for low-power neuromorphic hardware. In Proc. 2016 IEEE International Conference on Rebooting Computing (ICRC) 1–8 (IEEE, 2016).

  23. Schmitt, S. et al. Neuromorphic hardware in the loop: training a deep spiking network on the BrainScaleS wafer-scale system. In Proc. 2017 International Joint Conference on Neural Networks (IJCNN) 2227–2234 (2017).

  24. Wu, J., Chua, Y., Zhang, M., Yang, Q., Li, G., & Li, H. Deep spiking neural network with spike count based learning rule. In International Joint Conference on Neural Networks 1–6 (IEEE, 2019).

  25. Thakur, C. S. T. et al. Large-scale neuromorphic spiking array processors: a quest to mimic the brain. Front. Neurosci. 12, 891 (2018).

  26. Mead, C. Neuromorphic electronic systems. Proc. IEEE 78, 1629–1636 (1990).

  27. Roy, K., Jaiswal, A. & Panda, P. Towards spike-based machine intelligence with neuromorphic computing. Nature 575, 607–617 (2019).

  28. Petrovici, M. A., Bill, J., Bytschok, I., Schemmel, J. & Meier, K. Stochastic inference with deterministic spiking neurons. Preprint at (2013).

  29. Neftci, E., Das, S., Pedroni, B., Kreutz-Delgado, K. & Cauwenberghs, G. Event-driven contrastive divergence for spiking neuromorphic systems. Front. Neurosci. 7, 272 (2014).

  30. Petrovici, M. A., Bill, J., Bytschok, I., Schemmel, J. & Meier, K. Stochastic inference with spiking neurons in the high-conductance state. Phys. Rev. E 94, 042312 (2016).

  31. Neftci, E. O., Pedroni, B. U., Joshi, S., Al-Shedivat, M. & Cauwenberghs, G. Stochastic synapses enable efficient brain-inspired learning machines. Front. Neurosci. 10, 241 (2016).

  32. Leng, L. et al. Spiking neurons with short-term synaptic plasticity form superior generative networks. Sci. Rep. 8, 10651 (2018).

  33. Kungl, A. F. et al. Accelerated physical emulation of Bayesian inference in spiking neural networks. Front. Neurosci. 13, 1201 (2019).

  34. Dold, D. et al. Stochasticity from function-why the Bayesian brain may need no noise. Neural Netw. 119, 200–213 (2019).

  35. Jordan, J. et al. Deterministic networks for probabilistic computing. Sci. Rep. 9, 18303 (2019).

  36. Hunsberger, E. & Eliasmith, C. Training spiking deep networks for neuromorphic hardware. Preprint at (2016).

  37. Kheradpisheh, S. R., Ganjtabesh, M., Thorpe, S. J. & Masquelier, T. STDP-based spiking deep convolutional neural networks for object recognition. Neural Netw. 99, 56–67 (2018).

  38. Illing, B., Gerstner, W. & Brea, J. Biologically plausible deep learning-but how far can we go with shallow networks? Neural Netw. 118, 90–101 (2019).

  39. Bohte, S. M., Kok, J. N. & La Poutré, J. A. SpikeProp: backpropagation for networks of spiking neurons. In 8th European Symposium on Artificial Neural Networks 419–424 (2000).

  40. Zenke, F. & Ganguli, S. SuperSpike: supervised learning in multilayer spiking neural networks. Neural Comput. 30, 1514–1541 (2018).

  41. Huh, D. & Sejnowski, T. J. Gradient descent for spiking neural networks. In Advances in Neural Information Processing Systems Vol. 31, 1433–1443 (NIPS, 2018).

  42. Thorpe, S., Delorme, A. & Van Rullen, R. Spike-based strategies for rapid processing. Neural Netw. 14, 715–725 (2001).

  43. Thorpe, S., Fize, D. & Marlot, C. Speed of processing in the human visual system. Nature 381, 520–522 (1996).

  44. Johansson, R. S. & Birznieks, I. First spikes in ensembles of human tactile afferents code complex spatial fingertip events. Nat. Neurosci. 7, 170–177 (2004).

  45. Gollisch, T. & Meister, M. Rapid neural coding in the retina with relative spike latencies. Science 319, 1108–1111 (2008).

  46. Schemmel, J. et al. A wafer-scale neuromorphic hardware system for large-scale neural modeling. In Proc. 2010 IEEE International Symposium on Circuits and Systems 1947–1950 (IEEE, 2010).

  47. Akopyan, F. et al. TrueNorth: design and tool flow of a 65 mW 1 million neuron programmable neurosynaptic chip. IEEE Trans. Comput. Aided Design Integrated Circuits Syst. 34, 1537–1557 (2015).

  48. Billaudelle, S. et al. Versatile emulation of spiking neural networks on an accelerated neuromorphic substrate. In IEEE International Symposium on Circuits and Systems 1–5 (IEEE, 2020).

  49. Davies, M. et al. Loihi: a neuromorphic manycore processor with on-chip learning. IEEE Micro 38, 82–99 (2018).

  50. Mayr, C., Höppner, S., & Furber, S. SpiNNaker 2: a 10 million core processor system for brain simulation and machine learning-keynote presentation. In Communicating Process Architectures 2017 & 2018 277–280 (IOS Press, 2019).

  51. Pei, J. et al. Towards artificial general intelligence with hybrid Tianjic chip architecture. Nature 572, 106–111 (2019).

  52. Moradi, S., Qiao, N., Stefanini, F. & Indiveri, G. A scalable multicore architecture with heterogeneous memory structures for dynamic neuromorphic asynchronous processors (DYNAPs). IEEE Trans. Biomed. Circuits Syst. 12, 106–122 (2017).

  53. Mostafa, H. Supervised learning based on temporal coding in spiking neural networks. IEEE Trans. Neural Netw. Learn. Syst. 29, 3227–3235 (2017).

  54. Kheradpisheh, S. R. & Masquelier, T. S4NN: temporal backpropagation for spiking neural networks with one spike per neuron. Int. J. Neural Syst. 30, 2050027 (2020).

  55. Rauch, A., La Camera, G., Luscher, H.-R., Senn, W. & Fusi, S. Neocortical pyramidal cells respond as integrate-and-fire neurons to in vivo-like input currents. J. Neurophysiol. 90, 1598–1612 (2003).

  56. Gerstner, W. & Naud, R. How good are neuron models? Science 326, 379–380 (2009).

  57. Teeter, C. et al. Generalized leaky integrate-and-fire models classify multiple neuron types. Nat. Commun. 9, 709 (2018).

  58. Göltz, J. Training Deep Networks with Time-to-First-Spike Coding on the BrainScaleS Wafer-Scale System. Master’s thesis, Universität Heidelberg (2019);

  59. Friedmann, S. et al. Demonstrating hybrid learning in a flexible neuromorphic hardware system. IEEE Trans. Biomed. Circuits Syst. 11, 128–142 (2017).

  60. Prodromakis, T. & Toumazou, C. A review on memristive devices and applications. In Proc. 2010 17th IEEE International Conference on Electronics, Circuits and Systems 934–937 (IEEE, 2010).

  61. Esser, S. K., Appuswamy, R., Merolla, P., Arthur, J. V. & Modha, D. S. Backpropagation for energy-efficient neuromorphic computing. In Advances in Neural Information Processing Systems 1117–1125 (NIPS, 2015).

  62. van de Burgt, Y., Melianas, A., Keene, S. T., Malliaras, G. & Salleo, A. Organic electronics for neuromorphic computing. Nat. Electron. 1, 386–397 (2018).

  63. Wunderlich, T. et al. Demonstrating advantages of neuromorphic computation: a pilot study. Front. Neurosci. 13, 260 (2019).

  64. Feldmann, J., Youngblood, N., Wright, C., Bhaskaran, H. & Pernice, W. All-optical spiking neurosynaptic networks with self-learning capabilities. Nature 569, 208–214 (2019).

  65. Kriener, L., Göltz, J. & Petrovici, M. A. The yin-yang dataset. Preprint at (2021).

  66. LeCun, Y., Bottou, L., Bengio, Y. & Haffner, P. Gradient-based learning applied to document recognition. Proc. IEEE 86, 2278–2324 (1998).

  67. Schemmel, J., Billaudelle, S., Dauer, P. & Weis, J. Accelerated analog neuromorphic computing. Preprint at (2020).

  68. Comsa, I. M. et al. Temporal coding in spiking neural networks with alpha synaptic function. In Proc. 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) 8529–8533 (IEEE, 2020).

  69. Tavanaei, A., Kirby, Z. & Maida, A. S. Training spiking ConvNets by STDP and gradient descent. In Proc. 2018 International Joint Conference on Neural Networks (IJCNN) 1–8 (IEEE, 2018).

  70. Aamir, S. A. et al. An accelerated LIF neuronal network array for a large-scale mixed-signal neuromorphic architecture. IEEE Trans. Circuits Syst. I Regular Papers 65, 4299–4312 (2018).

  71. Petrovici, M. A. et al. Characterization and compensation of network-level anomalies in mixed-signal neuromorphic modeling platforms. PLoS ONE 9, e108590 (2014).

  72. Cramer, B. et al. Training spiking multi-layer networks with surrogate gradients on an analog neuromorphic substrate. Preprint at (2020).

  73. Petrovici, M. A. Form Versus Function: Theory and Models for Neuronal Substrates (Springer, 2016).

  74. Hubara, I., Courbariaux, M., Soudry, D., El-Yaniv, R. & Bengio, Y. Quantized neural networks: training neural networks with low precision weights and activations. J. Mach. Learn. Res. 18, 6869–6898 (2017).

  75. Payeur, A., Guerguiev, J., Zenke, F., Richards, B. A. & Naud, R. Burst-dependent synaptic plasticity can coordinate learning in hierarchical circuits. Preprint at bioRxiv (2020).

  76. Sacramento, J., Ponte Costa, R., Bengio, Y. & Senn, W. Dendritic cortical microcircuits approximate the backpropagation algorithm. In Advances in Neural Information Processing Systems Vol. 31, 8721–8732 (NIPS, 2018).

  77. Aamir, S. A. et al. A mixed-signal structured AdEx neuron for accelerated neuromorphic cores. IEEE Trans. Biomed. Circuits Syst. 12, 1027–1037 (2018).

  78. Müller, E. et al. Extending BrainScaleS OS for BrainScaleS-2. Preprint at (2020).

  79. Paszke, A. et al. PyTorch: an imperative style, high-performance deep learning library. In Advances in Neural Information Processing Systems Vol. 32, 8024–8035 (NIPS, 2019).

  80. Kingma, D. P. & Ba, J. Adam: a method for stochastic optimization. Preprint at (2014).

  81. Göltz, J. et al. Fast and energy-efficient neuromorphic deep learning with first-spike times (Zenodo, 2021);

  82. Stromatias, E. et al. Scalable energy-efficient, low-latency implementations of trained spiking deep belief networks on SpiNNaker. In Proc. 2015 International Joint Conference on Neural Networks (IJCNN) 1–8 (2015).

  83. Renner, A., Sheldon, F., Zlotnik, A., Tao, L. & Sornborger, A. The backpropagation algorithm implemented on spiking neuromorphic hardware. Preprint at (2021).

  84. Chen, G. K., Kumar, R., Sumbul, H. E., Knag, P. C. & Krishnamurthy, R. K. A 4096-neuron 1M-synapse 3.8-pJ/SOP spiking neural network with on-chip STDP learning and sparse weights in 10-nm FinFET CMOS. IEEE J. Solid State Circuits 54, 992–1002 (2018).


Acknowledgements

We thank J. Jordan and N. Gürtler for valuable discussions, S. Schmitt for assistance with BrainScaleS-1, V. Karasenko, P. Spilger and Y. Stradmann for taming physics, as well as M. Davies and Intel for their ongoing support (L.K., W.S., M.A.P.). Some calculations were performed on UBELIX, the HPC cluster at the University of Bern. Our work has greatly benefitted from access to the Fenix Infrastructure resources, which are partially funded from the European Union’s Horizon 2020 research and innovation programme through the ICEI project under grant agreement no. 800858. Some simulations were performed on the bwForCluster NEMO, supported by the state of Baden–Württemberg through bwHPC and the German Research Foundation (DFG) through grant no. INST 39/963-1 FUGG. We gratefully acknowledge funding from the European Union for the Human Brain Project under grant agreements 604102 (J.S., K.M., M.A.P.), 720270 (S.B., O.B., B.C., J.S., K.M., M.A.P.), 785907 (S.B., O.B., B.C., W.S., J.S., K.M., M.A.P.), 945539 (L.K., A.B., S.B., O.B., B.C., W.S., J.S., M.A.P.) and the Manfred Stärk Foundation (J.G., A.B., D.D., A.F.K., K.M., M.A.P.).

Author information


Contributions

J.G., A.B. and M.A.P. designed the conceptual and experimental approach. J.G. derived the theory, implemented the algorithm and performed the hardware experiments. L.K. embedded the algorithm into a comprehensive training framework and performed the simulation experiments. A.B. and O.B. offered substantial software support. S.B., B.C., J.G. and A.F.K. provided low-level software for interfacing with the hardware. J.G., L.K., D.D., S.B. and M.A.P. wrote the manuscript.

Corresponding authors

Correspondence to J. Göltz, L. Kriener or M. A. Petrovici.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Peer review information: Nature Machine Intelligence thanks the anonymous reviewers for their contribution to the peer review of this work.

Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary Information

Supplementary text with sections SI.A to SI.F, six figures (SI.A1, SI.C1, SI.D1, SI.E1, SI.E2, SI.F1) and four tables (SI.B1, SI.F1-3).

About this article

Cite this article

Göltz, J., Kriener, L., Baumbach, A. et al. Fast and energy-efficient neuromorphic deep learning with first-spike times. Nat Mach Intell 3, 823–835 (2021).
