Detection and tracking of moving objects hidden from view

Nature Photonics

The ability to detect motion and track a moving object hidden around a corner or behind a wall provides a crucial advantage when physically going around the obstacle is impossible or dangerous. Previous methods have demonstrated that it is possible to reconstruct the shape of an object hidden from view. However, these methods do not enable the tracking of movement in real time. We demonstrate a compact non-line-of-sight laser ranging technology that relies on the ability to send light around an obstacle using a scattering floor and then detect the return signal from a hidden object within only a few seconds of acquisition time. By detecting this signal with a single-photon avalanche diode (SPAD) camera, we follow the movement of an object located a metre away from the camera with centimetre precision. We discuss the possibility of applying this technology to a variety of real-life situations in the near future.

At a glance


    Figure 1: Looking around a corner.

    Our setup recreates, at a ∼5× reduced scale, a situation where a person is hidden from view by a wall or an obstacle. a, The camera is positioned on the near side of the wall, looking down at the floor: it cannot see what is behind the wall, but its field of view extends beyond the edge of the obstacle. b, A side view shows that the target is hidden behind the wall. To see the hidden target around the corner, laser pulses are sent to the floor. c, The light scatters off the floor and propagates as a spherical wave behind the obstacle, reaching the hidden object. This light is in turn scattered back into the field of view of the camera. The SPAD camera records both spatial and temporal information on the propagating spherical light wave as it passes through the field of view, creating an elliptical pattern where it intercepts the floor. An example of the spatially resolved raw data, as recorded by the camera for a fixed time frame as the ellipse passes through the field of view, is shown in the inset.
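The elliptical pattern follows directly from time-of-flight geometry: all floor points for which the path laser spot → hidden target → floor point has the same total length lie on an ellipse with foci at the laser spot and the target. A minimal sketch of this relation, using hypothetical coordinates in metres (not taken from the experiment):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def arrival_time(laser_spot, target, floor_point):
    """Total travel time for light scattered at the laser spot to reach
    the hidden target and scatter back to a point on the floor."""
    d1 = math.dist(laser_spot, target)   # laser spot -> hidden target
    d2 = math.dist(target, floor_point)  # target -> observed floor point
    return (d1 + d2) / C

# Hypothetical 2D floor coordinates, in metres
laser = (0.0, 0.0)
target = (0.5, 1.0)   # hidden behind the obstacle
p1 = (0.3, 0.2)       # two points in the camera's field of view
p2 = (-0.3, 0.2)
t1 = arrival_time(laser, target, p1)
t2 = arrival_time(laser, target, p2)
```

All floor points with equal `arrival_time` form one ellipse; the wavefront sweeping through the field of view traces out successive ellipses, which is the pattern visible in the inset.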

    Figure 2: Retrieving a hidden object's position.

    a, A histogram of photon arrival times is recorded for every pixel (here for pixel i, as indicated in c). This experimental histogram contains signals from both the target and unwanted background sources. b, Background subtraction and data processing allow us to isolate the signal from the target and fit a Gaussian to its peak, centred at a time t_i with a standard deviation σ_i. c, The arrival time t_i is used to trace an ellipse of possible target positions that would produce a signal at this time. d, Ellipses calculated from different pixels (experimental data) give slightly displaced probability distributions that intersect at a given point. The area where the ellipses overlap indicates the region of highest probability for the target location. Multiplying these probability distributions (together with the similar distributions from all 1,024 pixels of the camera) provides an estimate of the target location.
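The multiplication of per-pixel probability maps in d can be mimicked numerically: each measured arrival time defines a Gaussian likelihood over a grid of candidate target positions, and the product of the per-pixel maps peaks where the ellipses overlap. A sketch under assumed, illustrative geometry (the coordinates, three-pixel layout and ~100 ps timing spread below are hypothetical, not the paper's 1,024-pixel data):

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def arrival_time(laser, target, floor_point):
    """Noiseless arrival time for the path laser spot -> target -> floor point."""
    d1 = np.hypot(target[0] - laser[0], target[1] - laser[1])
    d2 = np.hypot(floor_point[0] - target[0], floor_point[1] - target[1])
    return (d1 + d2) / C

def pixel_likelihood(gx, gy, laser, floor_point, t_meas, sigma_t):
    """Gaussian likelihood of each candidate grid cell, given one pixel's time."""
    d1 = np.hypot(gx - laser[0], gy - laser[1])
    d2 = np.hypot(gx - floor_point[0], gy - floor_point[1])
    return np.exp(-0.5 * (((d1 + d2) / C - t_meas) / sigma_t) ** 2)

# Candidate target positions on a 1 cm grid (metres, hypothetical scene)
xs = np.linspace(-1.0, 1.0, 201)
ys = np.linspace(0.0, 2.0, 201)
gx, gy = np.meshgrid(xs, ys)

laser = (-0.1, 0.0)
true_target = (0.3, 1.2)
pixel_spots = [(-0.4, 0.1), (0.1, 0.2), (0.3, 0.05)]  # floor spots seen by 3 pixels
sigma_t = 100e-12  # assumed timing spread, ~100 ps

# Multiply the per-pixel probability maps; the ellipse overlap dominates.
joint = np.ones_like(gx)
for p in pixel_spots:
    t_meas = arrival_time(laser, true_target, p)  # synthetic measurement
    joint *= pixel_likelihood(gx, gy, laser, p, t_meas, sigma_t)

iy, ix = np.unravel_index(np.argmax(joint), joint.shape)
estimate = (xs[ix], ys[iy])  # lands near true_target
```

With only three pixels the overlap region is already narrow; the camera's full 1,024 pixels tighten it further, which is what gives the centimetre-scale localization.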

    Figure 3: Experimental results for position retrieval of the hidden object.

    Experimental layout and results showing the retrieved locations for eight distinct positions of the target, approximately one metre away from the camera (distances indicated in the figure are measured from the camera). The coloured areas in the graph indicate the joint probability distributions for the target locations; the actual positions are shown by the white rectangles. Each peak value is individually normalized to one.

    Figure 4: Non-line-of-sight tracking of a moving target.

    Distances in the graph are measured from the camera position. a, The object is moving in a straight line along the y direction, from bottom to top (as represented by the dashed rectangle and the arrow), at a speed of 2.8 cm s⁻¹. The coloured areas represent the retrieved joint probability distributions: the point of highest probability, indicating the estimated target location, is highlighted with a filled circle. The colours correspond to different acquisition ‘start’ times, as indicated in the colourbar: successive measurements are each separated by 3 s intervals, that is, the data acquisition time as explained in the text. b,c, Retrieved positions in x (b) and y (c) as a function of time. The dots in (b,c) show the points of maximum probability together with the 50% confidence bounds (red shaded area). The green area shows the actual position of the target.
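The linear track in c admits a simple consistency check: fitting a line to the retrieved y positions across the 3 s acquisition intervals recovers the along-track speed. A sketch with made-up position estimates (the numbers below imitate the reported motion with roughly centimetre scatter; they are not the measured data):

```python
import numpy as np

dt = 3.0  # seconds between successive acquisitions
# Hypothetical per-frame y estimates (metres), one per acquisition
y_est = np.array([1.00, 1.09, 1.17, 1.26, 1.33, 1.42])
t = dt * np.arange(len(y_est))

# Least-squares linear fit y = v*t + y0 gives the speed estimate
v, y0 = np.polyfit(t, y_est, 1)
print(f"estimated speed: {100 * v:.1f} cm/s")  # -> estimated speed: 2.8 cm/s
```

Averaging the slope over many frames in this way is what makes a per-frame localization error of a centimetre compatible with tracking motion of a few centimetres per second.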



Author information


  1. Institute of Photonics and Quantum Sciences, Heriot-Watt University, David Brewster Building, Edinburgh EH14 4AS, UK

    • Genevieve Gariepy,
    • Francesco Tonolini,
    • Jonathan Leach &
    • Daniele Faccio
  2. Institute for Micro and Nano Systems, University of Edinburgh, Alexander Crum Brown Road, Edinburgh EH9 3FF, UK

    • Robert Henderson


D.F. and J.L. conceived the experiment. R.K.H. designed the CMOS SPAD pixel architecture. G.G. performed the experiment. F.T. developed the tracking algorithm. G.G. and F.T. analysed the data and drafted the manuscript. All authors discussed the data and contributed to the manuscript.

Competing financial interests

The authors declare no competing financial interests.


Supplementary information

  1. Supplementary information (PDF, 880 KB)
  2. Supplementary Movie 1 (211 KB)