
Accurate online training of dynamical spiking neural networks through Forward Propagation Through Time

A preprint version of the article is available at arXiv.


With recent advances in learning algorithms, recurrent networks of spiking neurons are achieving performance that is competitive with vanilla recurrent neural networks. However, these algorithms are limited to small networks of simple spiking neurons and modest-length temporal sequences, as they impose high memory requirements, have difficulty training complex neuron models and are incompatible with online learning. Here, we show how the recently developed Forward Propagation Through Time (FPTT) learning algorithm, combined with novel liquid time-constant spiking neurons, resolves these limitations. Applying FPTT to networks of such complex spiking neurons, we demonstrate online learning of exceedingly long sequences while outperforming current online methods and approaching or outperforming offline methods on temporal classification tasks. The efficiency and robustness of FPTT enable us to directly train a deep and performant spiking neural network for joint object localization and recognition, demonstrating the ability to train large-scale dynamic and complex spiking neural network architectures.
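The FPTT algorithm referenced in the abstract (Kag & Saligrama, ref. 8) replaces unrolled backpropagation through time with per-step updates against the instantaneous loss plus a proximal penalty toward running-average parameters. The following is a minimal sketch of that update rule on a toy noise-free regression task; the task, variable names and hyperparameters are illustrative assumptions, not the paper's code.

```python
import numpy as np

# Sketch of the FPTT update: at each step t, take a gradient step on the
# instantaneous loss plus a proximal penalty (alpha/2)*||w - w_bar||^2,
# then update the running-average "anchor" parameters w_bar.
rng = np.random.default_rng(0)
alpha, lr = 1.0, 0.05                 # proximal strength, learning rate (assumed)
w = np.zeros(3)                       # parameters trained online, step by step
w_bar = np.zeros(3)                   # running-average anchor parameters
w_true = np.array([1.0, -2.0, 0.5])   # fixed teacher to recover

for t in range(3000):
    x = rng.normal(size=3)            # one input arrives per time step
    y = w_true @ x                    # noise-free target from the teacher
    # gradient of 0.5*(w@x - y)^2 + (alpha/2)*||w - w_bar||^2
    grad = (w @ x - y) * x + alpha * (w - w_bar)
    w = w - lr * grad                 # online step: no unrolling through time
    # anchor update uses the plain-loss gradient at the *new* w
    grad_new = (w @ x - y) * x
    w_bar = 0.5 * (w_bar + w) - (1.0 / (2 * alpha)) * grad_new

print(np.round(w, 2))                 # converges toward the teacher weights
```

Because every update depends only on the current step, memory cost is constant in sequence length, which is what enables the online training of long sequences described above.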


Fig. 1: FPTT and LTC spiking neurons.
Fig. 2: Performance evaluation of SNNs.
Fig. 3: Spiking-YOLO-v4 (SPYv4).

Data availability

This paper used publicly available datasets: MNIST, Fashion-MNIST, DVS-GESTURE, DVS-CIFAR10, and PASCAL VOC 2007 and VOC 2012. No software was used in the collection of data. Source data are provided with this paper.

Code availability

The code used in this study is publicly available from the GitHub repository and from Zenodo (ref. 45).


  1. Yin, B., Corradi, F. & Bohte, S. M. Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks. Nat. Mach. Intell. 3, 905–913 (2021).


  2. Stuijt, J., Sifalakis, M., Yousefzadeh, A. & Corradi, F. μBrain: an event-driven and fully synthesizable architecture for spiking neural networks. Front. Neurosci. 15, 538 (2021).


  3. Perez-Nieves, N., Leung, V. C. H., Dragotti, P. L. & Goodman, D. F. M. Neural heterogeneity promotes robust learning. Nat. Commun. 12, 5791 (2021).


  4. Keijser, J. & Sprekeler, H. Interneuron diversity is required for compartment-specific feedback inhibition. Preprint at bioRxiv (2020).

  5. Bohte, S. M. Error-backpropagation in networks of fractionally predictive spiking neurons. In International Conference on Artificial Neural Networks 60–68 (Springer, 2011).

  6. Neftci, E. O., Mostafa, H. & Zenke, F. Surrogate gradient learning in spiking neural networks: bringing the power of gradient-based optimization to spiking neural networks. IEEE Signal Process. Mag. 36, 51–63 (2019).


  7. Paszke, A. et al. PyTorch: an imperative style, high-performance deep learning library. Adv. Neural Inf. Process. Syst. 32, 8026–8037 (2019).


  8. Kag, A. & Saligrama, V. Training recurrent neural networks via forward propagation through time. In International Conference on Machine Learning 5189–5200 (PMLR, 2021).

  9. Mehonic, A. & Kenyon, A. J. Brain-inspired computing needs a master plan. Nature 604, 255–260 (2022).


  10. Williams, R. J. & Zipser, D. A learning algorithm for continually running fully recurrent neural networks. Neural Comput. 1, 270–280 (1989).


  11. Bellec, G. et al. A solution to the learning dilemma for recurrent networks of spiking neurons. Nat. Commun. 11, 3625 (2020).


  12. Bohnstingl, T., Woźniak, S., Pantazi, A. & Eleftheriou, E. Online spatio-temporal learning in deep neural networks. IEEE Trans. Neural Netw. Learn. Syst. (2022).

  13. He, Y. et al. A 28.2 μC neuromorphic sensing system featuring SNN-based near-sensor computation and event-driven body-channel communication for insertable cardiac monitoring. In 2021 IEEE Asian Solid-State Circuits Conference (IEEE, 2021).

  14. Hasani, R., Lechner, M., Amini, A., Rus, D. & Grosu, R. Liquid time-constant networks. In Proceedings of the AAAI Conference on Artificial Intelligence Vol. 35, 7657–7666 (AAAI, 2021).

  15. Amir, A. et al. A low power, fully event-based gesture recognition system. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition 7243–7252 (IEEE, 2017).

  16. Fang, W. et al. Incorporating learnable membrane time constant to enhance learning of spiking neural networks. In Proceedings of the IEEE/CVF International Conference on Computer Vision 2661–2671 (IEEE, 2021).

  17. Everingham, M., Van Gool, L., Williams, C. K., Winn, J. & Zisserman, A. The PASCAL visual object classes (VOC) challenge. Int. J. Comput. Vision 88, 303–338 (2010).


  18. Kim, S., Park, S., Na, B. & Yoon, S. Spiking-YOLO: spiking neural network for energy-efficient object detection. In Proceedings of the AAAI Conference on Artificial Intelligence Vol. 34, 11270–11277 (AAAI, 2020).

  19. Chakraborty, B., She, X. & Mukhopadhyay, S. A fully spiking hybrid neural network for energy-efficient object detection. IEEE Trans. Image Process. 30, 9014–9029 (2021).


  20. Royo-Miquel, J., Tolu, S., Schöller, F. E. & Galeazzi, R. RetinaNet object detector based on analog-to-spiking neural network conversion. In 8th International Conference on Soft Computing and Machine Intelligence (IEEE, 2021).

  21. Zhou, S., Chen, Y., Li, X. & Sanyal, A. Deep SCNN-based real-time object detection for self-driving vehicles using lidar temporal data. IEEE Access 8, 76903–76912 (2020).


  22. Jiang, Z., Zhao, L., Li, S. & Jia, Y. Real-time object detection method based on improved YOLOv4-tiny. Preprint at (2020).

  23. Werbos, P. J. Backpropagation through time: what it does and how to do it. Proc. IEEE 78, 1550–1560 (1990).


  24. Elman, J. L. Finding structure in time. Cognit. Sci. 14, 179–211 (1990).


  25. Mozer, M. C. Neural net architectures for temporal sequence processing. In Santa Fe Institute Studies on the Sciences of Complexity Proceedings Vol. 15, 243 (Addison-Wesley, 1993).

  26. Murray, J. M. Local online learning in recurrent networks with random feedback. eLife 8, e43299 (2019).


  27. Knight, J. C. & Nowotny, T. Efficient GPU training of LSNNs using eProp. In Neuro-Inspired Computational Elements Conference 8–10 (Association for Computing Machinery, 2022).

  28. Bohte, S. M., Kok, J. N. & La Poutre, H. Error-backpropagation in temporally encoded networks of spiking neurons. Neurocomputing 48, 17–37 (2002).


  29. Yin, B., Corradi, F. & Bohté, S. M. Effective and efficient computation with multiple-timescale spiking recurrent neural networks. In International Conference on Neuromorphic Systems (Association for Computing Machinery, 2020).

  30. Scherr, F. & Maass, W. Analysis of the computational strategy of a detailed laminar cortical microcircuit model for solving the image-change-detection task. Preprint at bioRxiv (2021).

  31. Hochreiter, S. & Schmidhuber, J. Long short-term memory. Neural Comput. 9, 1735–1780 (1997).


  32. Li, H., Liu, H., Ji, X., Li, G. & Shi, L. CIFAR10-DVS: an event-stream dataset for object classification. Front. Neurosci. 11, 309 (2017).


  33. Gerstner, W., Kreiter, A. K., Markram, H. & Herz, A. V. Neural codes: firing rates and beyond. Proc. Natl Acad. Sci. USA 94, 12740–12741 (1997).


  34. Bochkovskiy, A., Wang, C.-Y. & Liao, H.-Y. M. YOLOv4: optimal speed and accuracy of object detection. Preprint at (2020).

  35. Kalchbrenner, N. et al. Efficient neural audio synthesis. In International Conference on Machine Learning 2410–2419 (PMLR, 2018).

  36. Sacramento, J., Ponte Costa, R., Bengio, Y. & Senn, W. Dendritic cortical microcircuits approximate the backpropagation algorithm. Adv. Neural Inf. Process. Syst. 31, 8721–8732 (2018).


  37. Beniaguev, D., Segev, I. & London, M. Single cortical neurons as deep artificial neural networks. Neuron 109, 2727–2739 (2021).


  38. Larkum, M. E., Senn, W. & Lüscher, H.-R. Top-down dendritic input increases the gain of layer 5 pyramidal neurons. Cereb. Cortex 14, 1059–1070 (2004).


  39. Frey, U. & Morris, R. G. Synaptic tagging and long-term potentiation. Nature 385, 533–536 (1997).


  40. Moncada, D., Ballarini, F., Martinez, M. C., Frey, J. U. & Viola, H. Identification of transmitter systems and learning tag molecules involved in behavioral tagging during memory formation. Proc. Natl Acad. Sci. USA 108, 12931–12936 (2011).


  41. Rombouts, J. O., Bohte, S. M. & Roelfsema, P. R. How attention can create synaptic tags for the learning of working memories in sequential tasks. PLoS Comput. Biol. 11, e1004060 (2015).


  42. Pozzi, I., Bohte, S. & Roelfsema, P. Attention-gated brain propagation: how the brain can implement reward-based error backpropagation. Adv. Neural Inf. Process. Syst. 33, 2516–2526 (2020).


  43. Scellier, B. & Bengio, Y. Equilibrium propagation: bridging the gap between energy-based models and backpropagation. Front. Comput. Neurosci. 11, 24 (2017).


  44. Kingma, D. P. & Ba, J. Adam: a method for stochastic optimization. In 3rd International Conference on Learning Representations (ICLR) 1–15 (2015).

  45. Yin, B. byin-cwi/sFPTT: Training SNN via FPTT. Zenodo (2023).

  46. Woźniak, S., Pantazi, A., Bohnstingl, T. & Eleftheriou, E. Deep learning incorporating biologically inspired neural dynamics and in-memory computing. Nat. Mach. Intell. 2, 325–336 (2020).


  47. Zou, Z. et al. Memory-inspired spiking hyperdimensional network for robust online learning. Sci. Rep. 12, 7641 (2022).


  48. Shrestha, A., Fang, H., Wu, Q. & Qiu, Q. Approximating back-propagation for a biologically plausible local learning rule in spiking neural networks. In Proceedings of the International Conference on Neuromorphic Systems (Association for Computing Machinery, 2019).

  49. Kaiser, J., Mostafa, H. & Neftci, E. Synaptic plasticity dynamics for deep continuous local learning (DECOLLE). Front. Neurosci. 14, 424 (2020).




B.Y. is supported by the Nederlandse Organisatie voor Wetenschappelijk Onderzoek, Toegepaste en Technische Wetenschappen (NWO-TTW) Programme ‘Efficient Deep Learning’ (EDL) P16-25. S.M.B. is supported by the European Union (grant agreement 7202070 ‘Human Brain Project’). The authors are grateful to H. Corporaal for reading the manuscript and providing constructive remarks.

Author information

Contributions



B.Y., F.C. and S.M.B. conceived the experiments. B.Y. conducted the experiments. B.Y., F.C. and S.M.B. analysed the results. All authors reviewed the manuscript.

Corresponding author

Correspondence to Bojian Yin.

Ethics declarations

Competing interests

The authors declare no competing interests.

Peer review

Peer review information

Nature Machine Intelligence thanks Catherine Schuman and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary information

Supplementary Figs. 1–6, Tables 1–5 and Appendix A for FPTT theory.

Reporting Summary

Source data

Source Data Fig. 2

Statistical source data for Fig. 2b,c.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Yin, B., Corradi, F. & Bohté, S.M. Accurate online training of dynamical spiking neural networks through Forward Propagation Through Time. Nat Mach Intell 5, 518–527 (2023).



