Abstract
Motion processing is computationally demanding, yet flying insects can agilely perceive real-world motion with their tiny vision systems. Here we show that phototransistor arrays can directly perceive different types of motion at the sensory terminals, emulating the non-spiking graded neurons of insect vision systems. The charge dynamics of shallow trapping centres in MoS2 phototransistors mimic the characteristics of graded neurons, supporting an information transmission rate of 1,200 bit s⁻¹ and effectively encoding temporal light information. We used a 20 × 20 photosensor array to detect trajectories in the visual field, enabling efficient perception of the direction and visual saliency of moving objects and achieving 99.2% recognition accuracy with a four-layer neural network. By modulating the charge dynamics of the shallow trapping centres in MoS2, the sensor array can recognize motion with a temporal resolution ranging from 10¹ to 10⁶ ms.
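As a minimal sketch of the recognition stage described above, the following NumPy code implements a four-layer feedforward network that maps a flattened 20 × 20 sensor frame to motion-direction classes. The hidden-layer widths, the four output classes and the random weights are illustrative assumptions for demonstration, not the authors' reported architecture or trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical four-layer network: 400 -> 128 -> 64 -> 32 -> 4
# (e.g. up / down / left / right); sizes are assumed, not reported.
sizes = [400, 128, 64, 32, 4]
weights = [rng.standard_normal((m, n)) * np.sqrt(2.0 / m)
           for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(frame):
    """Classify one 20 x 20 frame of graded photocurrent values."""
    x = frame.reshape(-1)  # flatten the 20 x 20 readout to a 400-vector
    for w, b in zip(weights[:-1], biases[:-1]):
        x = relu(x @ w + b)
    return softmax(x @ weights[-1] + biases[-1])

frame = rng.random((20, 20))  # stand-in for one sensor-array readout
probs = forward(frame)        # class probabilities, summing to 1
```

In the paper's scheme the input frame would carry the temporally encoded trajectory information from the graded-neuron-like phototransistors, so a single readout already embeds motion history rather than a static image.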
Data availability
The datasets are available from the corresponding authors upon reasonable request. Source data are provided with this paper.
Code availability
The code used for neuromorphic computing is available from the corresponding authors upon reasonable request.
Acknowledgements
This work was supported by the Research Grant Council of Hong Kong (CRS_PolyU502/22), the Shenzhen Science and Technology Innovation Commission (SGDX2020110309540000), the Innovation Technology Fund (ITS/047/20) and The Hong Kong Polytechnic University (1-ZE1T and CD42). J.-H.A. acknowledges support from the National Research Foundation of Korea (NRF-2015R1A3A2066337).
Author information
Authors and Affiliations
Contributions
Y.C. conceived the concept and supervised the project. J.C. designed the test protocol and performed the experiments. B.J.K. and J.-H.A. fabricated the devices. J.C., Z.W., T.W. and J.Y. analysed the experimental data. Z.Z., Z.W. and Y.Z. performed the simulations. J.C., Z.Z. and Y.C. co-wrote the paper. All the authors discussed the results and commented on the manuscript.
Corresponding authors
Ethics declarations
Competing interests
The authors declare no competing interests.
Peer review
Peer review information
Nature Nanotechnology thanks Feng Miao, Guozhen Shen and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary information
Supplementary Information
Supplementary Notes I–XI, Figs. 1–20, Table 1 and references.
Supplementary Video 1
Raw video of typical actions from the moving balls’ database.
Supplementary Video 2
Time evolution of a left-moving ball and the corresponding neuromorphic recognition.
Source data
Source Data Fig. 1
Schematic figures.
Source Data Fig. 2
Statistical source data.
Source Data Fig. 3
Statistical source data.
Source Data Fig. 4
Simulation results.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Chen, J., Zhou, Z., Kim, B.J. et al. Optoelectronic graded neurons for bioinspired in-sensor motion perception. Nat. Nanotechnol. (2023). https://doi.org/10.1038/s41565-023-01379-2