

Optoelectronic graded neurons for bioinspired in-sensor motion perception

Abstract

Motion processing is a computational challenge that demands considerable resources, yet flying insects can agilely perceive real-world motion with their tiny visual systems. Here we show that phototransistor arrays can directly perceive different types of motion at the sensory terminals, emulating the non-spiking graded neurons of insect visual systems. The charge dynamics of the shallow trapping centres in MoS₂ phototransistors mimic the characteristics of graded neurons, delivering an information transmission rate of 1,200 bit s⁻¹ and effectively encoding temporal light information. We used a 20 × 20 photosensor array to detect trajectories in the visual field, enabling efficient perception of the direction and visual saliency of moving objects and achieving 99.2% recognition accuracy with a four-layer neural network. By modulating the charge dynamics of the shallow trapping centres in MoS₂, the sensor array can recognize motion at temporal resolutions ranging from 10¹ to 10⁶ ms.
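To make the encoding principle concrete, below is a minimal sketch, not the authors' implementation, of how a first-order shallow-trap model compresses a temporal light sequence into a single graded frame that preserves a moving object's trajectory. The 20 × 20 array size follows the abstract; the trap time constant tau, the time step and the toy stimulus are illustrative assumptions.

```python
import numpy as np

def graded_response(light_seq, tau=50.0, dt=10.0):
    """light_seq: (T, 20, 20) illumination frames; returns the final
    (20, 20) analogue frame after first-order trap charge decay.
    tau and dt (in ms) are illustrative, not the paper's values."""
    decay = np.exp(-dt / tau)              # fraction of trapped charge retained per step
    state = np.zeros(light_seq.shape[1:])  # trapped-charge map of the pixel array
    for frame in light_seq:
        state = decay * state + frame      # accumulate new charge, release old charge
    return state / state.max()             # normalized graded (non-spiking) output

# Toy stimulus: a 2 x 2 "ball" sweeping left to right across the 20 x 20 array.
T = 15
seq = np.zeros((T, 20, 20))
for t in range(T):
    seq[t, 9:11, t:t + 2] = 1.0            # ball advances one pixel per time step

frame = graded_response(seq)               # one readout frame now encodes the whole trajectory
print(frame[9, :].round(2))                # intensity fades from the newest position to the oldest
```

The resulting frame, flattened to 400 values, is the kind of input a small four-layer classifier could act on, as described in the abstract; the reported 99.2% accuracy is the authors' result on their dataset, not something this toy reproduces. Varying tau emulates tuning the temporal window over which motion is integrated, in the spirit of the 10¹ to 10⁶ ms range above.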


Fig. 1: Agile motion perception of the insect visual system.
Fig. 2: Artificially graded neurons for encoding temporal vision.
Fig. 3: In-sensor motion perception with an optoelectronic graded neuron array.
Fig. 4: Action recognition based on bioinspired vision sensors and conventional image sensors.


Data availability

The datasets are available from the corresponding authors upon reasonable request. Source data are provided with this paper.

Code availability

The code used for neuromorphic computing is available from the corresponding authors upon reasonable request.


Acknowledgements

This work was supported by the Research Grants Council of Hong Kong (CRS_PolyU502/22), the Shenzhen Science and Technology Innovation Commission (SGDX2020110309540000), the Innovation and Technology Fund (ITS/047/20) and The Hong Kong Polytechnic University (1-ZE1T and CD42). J.-H.A. acknowledges support from the National Research Foundation of Korea (NRF-2015R1A3A2066337).

Author information

Authors and Affiliations

Authors

Contributions

Y.C. conceived the concept and supervised the project. J.C. designed the test protocol and performed the experiments. B.J.K. and J.-H.A. fabricated the devices. J.C., Z.W., T.W. and J.Y. analysed the experimental data. Z.Z., Z.W. and Y.Z. performed the simulations. J.C., Z.Z. and Y.C. co-wrote the paper. All the authors discussed the results and commented on the manuscript.

Corresponding authors

Correspondence to Jong-Hyun Ahn or Yang Chai.

Ethics declarations

Competing interests

The authors declare no competing interests.

Peer review

Peer review information

Nature Nanotechnology thanks Feng Miao, Guozhen Shen and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary Information

Supplementary Notes I–XI, Figs. 1–20, Table 1 and references.

Supplementary Video 1

Raw video of typical actions from the moving-ball database.

Supplementary Video 2

Time evolution of a left-moving ball and the corresponding neuromorphic recognition.

Source data

Source Data Fig. 1

Schematic figures.

Source Data Fig. 2

Statistical source data.

Source Data Fig. 3

Statistical source data.

Source Data Fig. 4

Simulation results.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Chen, J., Zhou, Z., Kim, B.J. et al. Optoelectronic graded neurons for bioinspired in-sensor motion perception. Nat. Nanotechnol. 18, 882–888 (2023). https://doi.org/10.1038/s41565-023-01379-2


