
Near-sensor and in-sensor computing


The number of nodes typically used in sensory networks is growing rapidly, leading to large amounts of redundant data being exchanged between sensory terminals and computing units. To efficiently process such large amounts of data, and decrease power consumption, it is necessary to develop approaches to computing that operate close to or inside sensory networks, and that can reduce the redundant data movement between sensing and processing units. Here we examine the concept of near-sensor and in-sensor computing in which computation tasks are moved partly to the sensory terminals. We classify functions into low-level and high-level processing, and discuss the implementation of near-sensor and in-sensor computing for different physical sensing systems. We also analyse the existing challenges in the field and provide possible solutions for the hardware implementation of integrated sensing and processing units using advanced manufacturing technologies.
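The data-movement saving that motivates in-sensor computing can be made concrete with a small sketch. The following Python example (illustrative only, not from the paper) contrasts a conventional sensor that streams every pixel of every frame with an event-style readout that transmits only pixels whose value changes beyond a threshold, in the spirit of the temporal-contrast vision sensors discussed in this review; the frame data and threshold are hypothetical.

```python
# Illustrative sketch: why in-sensor preprocessing reduces data movement.
# A conventional sensor streams every pixel of every frame; an event-style
# sensor transmits only pixels whose value changed beyond a threshold.

def raw_stream(frames):
    """Number of values sent when every pixel of every frame is transmitted."""
    return sum(len(f) for f in frames)

def event_stream(frames, threshold=10):
    """Number of values sent when only above-threshold changes are transmitted."""
    sent = len(frames[0])            # first frame sent whole as a reference
    ref = list(frames[0])
    for f in frames[1:]:
        for i, v in enumerate(f):
            if abs(v - ref[i]) > threshold:
                sent += 1            # transmit one (index, value) event
                ref[i] = v           # update the per-pixel reference level
    return sent

# A mostly static scene: 4 pixels over 5 frames, one pixel changing.
frames = [
    [100, 100, 100, 100],
    [100, 100, 100, 140],
    [100, 100, 100, 180],
    [100, 100, 100, 180],
    [100, 100, 100, 100],
]
print(raw_stream(frames))    # 20 values streamed frame by frame
print(event_stream(frames))  # 7 values: 4 reference pixels + 3 events
```

For a largely static scene the event readout transmits a small fraction of the raw pixel stream, which is the redundancy reduction that near-sensor and in-sensor architectures exploit.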


Fig. 1: Sensory computing architectures.
Fig. 2: Illustrations of low-level sensory processing architectures and functions.
Fig. 3: Near-sensor and in-sensor high-level sensory processing.
Fig. 4: Integration technologies for near-sensor and in-sensor computing.





This work was supported by Research Grant Council of Hong Kong (15205619) and the Hong Kong Polytechnic University (1-ZVGH and ZG6C).

Author information


Y.C. conceived the project. F.Z. performed the literature research and prepared the figures. F.Z. and Y.C. carried out comparative analysis and wrote the manuscript.

Corresponding author

Correspondence to Yang Chai.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Peer review information Nature Electronics thanks Feng Miao, Chih-Cheng Hsieh and Thomas Mueller for their contribution to the peer review of this work.

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Zhou, F., Chai, Y. Near-sensor and in-sensor computing. Nat Electron 3, 664–671 (2020).


