Computational event-driven vision sensors for in-sensor spiking neural networks

Abstract

Neuromorphic event-based image sensors capture only the dynamic motion in a scene, which is then transferred to computation units for motion recognition. This approach, however, introduces latency and can be power-consuming. Here we report computational event-driven vision sensors that capture and directly convert dynamic motion into programmable, sparse and informative spiking signals. The sensors can be used to form a spiking neural network for motion recognition. Each individual vision sensor consists of two parallel photodiodes with opposite polarities and has a temporal resolution of 5 μs. In response to changes in light intensity, the sensors generate spiking signals with different amplitudes and polarities by electrically programming their individual photoresponsivity. The non-volatile and multilevel photoresponsivity of the vision sensors can emulate synaptic weights and can be used to create an in-sensor spiking neural network. Our computational event-driven vision sensor approach eliminates redundant data during the sensing process, as well as the need for data transfer between sensors and computation units.
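To make the operating principle concrete, the following Python sketch models the behaviour described in the abstract: pixels that respond only to light-intensity changes, with programmable signed responsivities acting as synaptic weights, feeding a leaky integrate-and-fire output neuron. It is a minimal conceptual sketch, not the authors' implementation; the weight range, threshold, leak factor, pixel count and stimulus are illustrative assumptions.

```python
import numpy as np

# Illustrative parameters (assumptions, not values from the paper).
N_PIXELS = 16          # pixels summed onto one output line
THRESHOLD = 1.0        # LIF neuron firing threshold (arbitrary units)
LEAK = 0.9             # membrane leak factor per time step

rng = np.random.default_rng(0)

# Each pixel holds two programmable responsivities (synaptic weights):
# one for intensity increases (ON branch) and one for decreases (OFF
# branch). Non-volatile multilevel weights are modelled as signed floats.
w_on = rng.uniform(-1, 1, N_PIXELS)
w_off = rng.uniform(-1, 1, N_PIXELS)

def pixel_currents(prev_frame, frame):
    """Event-driven front end: only intensity *changes* produce output.

    A rise in intensity drives the ON photodiode and a fall drives the
    opposite-polarity OFF photodiode; static pixels contribute nothing,
    which is what makes the output sparse.
    """
    delta = frame - prev_frame
    on = np.clip(delta, 0, None)       # positive intensity changes
    off = np.clip(-delta, 0, None)     # negative intensity changes
    return w_on * on - w_off * off     # weighted, signed spike amplitudes

def lif_neuron(current_sum, v):
    """Leaky integrate-and-fire output neuron summing the pixel currents."""
    v = LEAK * v + current_sum
    if v >= THRESHOLD:
        return 0.0, True               # reset membrane, emit spike
    return v, False

# Toy stimulus: a single bright pixel sweeping across a 1D pixel array.
frames = [np.roll(np.eye(N_PIXELS)[0], t) for t in range(8)]
v, spikes = 0.0, []
for prev, cur in zip(frames, frames[1:]):
    i_sum = pixel_currents(prev, cur).sum()   # summation of pixel currents
    v, fired = lif_neuron(i_sum, v)
    spikes.append(fired)
print(spikes)
```

Note how a static frame pair yields zero current from every pixel, so no data are generated or transferred for unchanging parts of the scene; this mirrors the abstract's claim that redundant data are eliminated during sensing itself.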

Fig. 1: Event-driven in-sensor spiking neural network.
Fig. 2: Non-volatile and programmable photoresponsivity of the WSe2 photodiode.
Fig. 3: Tunable event-driven characteristics in the unit based on the WSe2 photodiode.
Fig. 4: In-sensor SNN for motion recognition.

Data availability

The data that support the plots within this paper and other findings of this study are available from the corresponding authors upon reasonable request. Source data are provided with this paper.

Code availability

The code used for the simulations and data plotting is available from the corresponding authors upon reasonable request.


Acknowledgements

This work was supported by the Research Grant Council of Hong Kong (CRS_PolyU502/22), the Shenzhen Science and Technology Innovation Commission (SGDX2020110309540000), the Innovation Technology Fund (ITS/047/20) and The Hong Kong Polytechnic University (1-ZE1T, 9BFT and WZ4X). Y.H. acknowledges support from the National Natural Science Foundation of China (62374093, 92164204) and the National Key Research and Development Program of China (SQ2023YFB4500038).

Author information

Authors and Affiliations

Authors

Contributions

Y.C. conceived the concept. Y.H. and Y.C. supervised the project. Y.Z. designed the devices and circuits. Y.Z., Z.C. and H.Y. fabricated the devices. J.Y. performed the Raman and atomic force microscopy characterizations. Y.Z. performed the device and circuit measurements. J.F. and Y.H. designed and performed the neural network simulations. Y.Z., Y.W., F.Z., S.M., L.X. and X.M. analysed the data. Y.Z. and Y.C. wrote the paper. All the authors discussed the results and implications, and reviewed the paper.

Corresponding authors

Correspondence to Yuhui He or Yang Chai.

Ethics declarations

Competing interests

The authors declare no competing interests.

Peer review

Peer review information

Nature Electronics thanks Du Xiang and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary Information

Supplementary Notes I–XVI, Figs. 1–33, Table 1 and References.

Supplementary Video 1

A complete real-time recording of the generated spikes.

Supplementary Video 2

Details of the three types of motion.

Source data

Source Data Fig. 2

Raw data used to plot Fig. 2 in CSV format.

Source Data Fig. 3

Raw data used to plot Fig. 3 in CSV format.

Source Data Fig. 4

Raw data used to plot Fig. 4 in CSV format.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Zhou, Y., Fu, J., Chen, Z. et al. Computational event-driven vision sensors for in-sensor spiking neural networks. Nat Electron 6, 870–878 (2023). https://doi.org/10.1038/s41928-023-01055-2
