
Bioinspired in-sensor spectral adaptation for perceiving spectrally distinctive features

Abstract

In challenging lighting conditions, machine vision often yields low-quality results. In situations where particular spectral signatures carry critical information, adapting the spectral sensitivity of vision systems to match the predominant spectra of the surrounding environment can improve light capture and image quality. Here we report spectra-adapted vision sensors based on arrays of back-to-back photodiodes. The spectral sensitivity of these bioinspired sensors can be tuned to match either the broadband visible spectrum or a narrow band within the near-infrared spectrum by applying different bias voltages. The process of spectral adaptation takes tens of microseconds, which is comparable with the frame interval of state-of-the-art high-speed cameras (frame rates of around 100 kHz). The spectral adaptation increases the Weber contrast of the scene by over ten times, raising the recognition accuracy of features from 33% to 90% under intense visible-light glare.
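The contrast metric used above is the Weber contrast, C = (I_feature − I_background)/I_background. A minimal sketch of the effect described in the abstract, using entirely hypothetical intensity values (the real measurements are in the paper's figures), shows how suppressing a broadband glare background in a narrowband near-infrared readout can raise the Weber contrast by more than an order of magnitude:

```python
import numpy as np

def weber_contrast(feature, background):
    # Weber contrast: (I_feature - I_background) / I_background
    fb = float(np.mean(feature))
    bg = float(np.mean(background))
    return (fb - bg) / bg

# Hypothetical broadband readout: intense glare washes out the feature.
background_glare = np.full((8, 8), 200.0)  # bright glare background
feature_glare = np.full((4, 4), 210.0)     # feature barely above background

# Hypothetical narrowband NIR readout: glare is largely rejected.
background_nir = np.full((8, 8), 10.0)
feature_nir = np.full((4, 4), 15.5)

c_glare = weber_contrast(feature_glare, background_glare)  # 0.05
c_nir = weber_contrast(feature_nir, background_nir)        # 0.55
print(c_glare, c_nir, c_nir / c_glare)  # contrast improves ~11x
```

With these illustrative numbers the contrast ratio improves roughly elevenfold, consistent in spirit with the "over ten times" improvement reported in the abstract.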


Fig. 1: Spectral adaptation behaviour of Pacific salmons.
Fig. 2: Bias-switchable in-sensor spectral adaptation.
Fig. 3: Dynamic in-sensor spectral adaptation characteristics of the bias-switchable vision sensor array.
Fig. 4: Spectra-adapted vision sensor for imaging and classifying spectrally distinctive features.


Data availability

Source data are provided with this paper. Other data related to this study are available from the corresponding author upon reasonable request.

Code availability

The code used in this study is available via GitHub at https://github.com/Jialiang-AP-WANG/ADBPD_simulation/.


Acknowledgements

This work is supported by MOST National Key Technologies R&D Programme (SQ2022YFA1200118-04), Research Grant Council of Hong Kong (CRS_PolyU502/22) and The Hong Kong Polytechnic University (1-ZE1T, YXBA and WZ4X).

Author information


Contributions

Y.C. conceived the concept and supervised the project. B.O. fabricated the devices. B.O. and J.W. designed the test protocol and performed the experiments. Y.C., B.O. and J.W. analysed the experimental data. G.Z. performed the technology computer-aided design simulation. J.W. performed the simulation of the artificial neural networks. B.O., J.W. and Y.C. co-wrote the paper. All the authors discussed the results and commented on the manuscript.

Corresponding author

Correspondence to Yang Chai.

Ethics declarations

Competing interests

The authors declare no competing interests.

Peer review

Peer review information

Nature Electronics thanks Weida Hu, and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary Information

Supplementary Figs. 1–21, Notes I–V and references.

Source data

Source Data Fig. 2

Raw data used to plot the data in Fig. 2.

Source Data Fig. 3

Raw data used to plot the data in Fig. 3.

Source Data Fig. 4

Raw data used to plot the data in Fig. 4.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Ouyang, B., Wang, J., Zeng, G. et al. Bioinspired in-sensor spectral adaptation for perceiving spectrally distinctive features. Nat Electron 7, 705–713 (2024). https://doi.org/10.1038/s41928-024-01208-x
