
  • News & Views

SPIKING NEURAL NETWORKS

Sparsity provides a competitive advantage

Neuromorphic chips that use spikes to encode information could provide fast and energy-efficient computing for ubiquitous embedded systems. A biologically plausible spike-timing solution for training spiking neural networks that makes the most of sparsity has been implemented on the BrainScaleS-2 hardware platform.

Fig. 1: TTFS encoding and deployment in neuromorphic hardware.
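The figure refers to time-to-first-spike (TTFS) coding, in which an input is represented by how early a neuron fires rather than by how often: stronger inputs spike sooner, and each input contributes at most one spike, which is where the sparsity comes from. As a rough, illustrative sketch only (this is not the encoding used on BrainScaleS-2 or in the highlighted work; the function name ttfs_encode, the linear latency rule and the threshold theta are assumptions made here for clarity), a TTFS encoder might look like this:

```python
import numpy as np


def ttfs_encode(intensities, t_max=20.0, theta=0.1):
    """Map normalized intensities in [0, 1] to first-spike times (ms).

    Stronger inputs fire earlier; inputs at or below the threshold `theta`
    never fire (spike time = inf), so each input emits at most one spike.
    """
    x = np.asarray(intensities, dtype=float)
    times = np.full(x.shape, np.inf)          # silent by default
    active = x > theta
    # Linear latency code: x = 1 -> t = 0, x = theta -> t = t_max.
    times[active] = t_max * (1.0 - x[active]) / (1.0 - theta)
    return times


if __name__ == "__main__":
    pixels = [0.95, 0.7, 0.4, 0.05]
    print(ttfs_encode(pixels))
    # approx. [ 1.11  6.67  13.33  inf]: bright pixels spike early,
    # the sub-threshold pixel stays silent.
```

In a TTFS classifier the readout is typically just as sparse: the predicted class is simply the output neuron that fires first, so inference can stop as soon as the first output spike arrives.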


Author information


Correspondence to Charlotte Frenkel.

Ethics declarations

Competing interests

The author declares no competing interests.

About this article


Cite this article

Frenkel, C. Sparsity provides a competitive advantage. Nat Mach Intell 3, 742–743 (2021). https://doi.org/10.1038/s42256-021-00387-y

