


Achieving nanoscale precision using neuromorphic localization microscopy

Abstract

Neuromorphic cameras are a new class of dynamic-vision-inspired sensors that encode the rate of change of intensity as events. Each pixel asynchronously records intensity changes as spikes, independently of the other pixels in the receptive field, resulting in sparse measurements. This sparse event stream makes them ideal for imaging dynamic processes, such as the stochastic emission of isolated single molecules. Here we show the application of neuromorphic detection to localize nanoscale fluorescent objects below the diffraction limit, with a precision below 20 nm. We demonstrate a combination of neuromorphic detection with segmentation and deep-learning approaches to localize and track fluorescent particles below 50 nm with millisecond temporal resolution. Furthermore, we show that combining information from events resulting from the rate of change of intensities improves the classical limit of centroid estimation of single fluorescent objects by nearly a factor of two. Additionally, we validate that using post-processed data from the neuromorphic detector at defined windows of temporal integration allows a better evaluation of the fractalized diffusion of single-particle trajectories. Our observations and analysis are useful for event sensing by nonlinear neuromorphic devices to improve real-time particle localization approaches at the nanoscale.


Fig. 1: Detection of an isolated subdiffraction-sized single fluorescent particle using neuromorphic imaging.
Fig. 2: Temporal characteristics of emission of an isolated subdiffraction-sized single fluorescent particle using neuromorphic imaging.
Fig. 3: Transformation of cumulative probability density of ON and OFF processes allows localization below the limit of classical single particle detection.
Fig. 4: Comparison of standard deviations of the raw and processed error in fitting the centroid from ON and OFF processes.
Fig. 5: Differential temporal binning of events from the neuromorphic camera allows insights into fractalized Brownian diffusion from individual trajectories.


Data availability

An example dataset of neuromorphic events, the reconstructed frames and instructions for use are available for download on Github: https://github.com/neuromorphicmicroscopy/Neuromorphic-Localisation-Microscopy. The rest of the data that support the findings of this study are available as Supplementary Information. Source data are provided with this paper.

Code availability

This work used previously available code.

The version of the code used for extraction of data from the neuromorphic camera is available from D.N. and C.S.T. on request. Code central to the conclusions and inferences made in the manuscript regarding the reconstruction of frames, the pipelines built using DeepTrack and instructions for use are available for download on GitHub: https://github.com/neuromorphicmicroscopy/Neuromorphic-Localisation-Microscopy.

PalmTracer is an all-in-one software package for the analysis of single molecule localization microscopy data that can be downloaded from https://neuro-intramuros.u-bordeaux.fr/displayresearchprojects/70/11.

The instructions and code required for synchronizing the neuromorphic camera with other devices are available online: https://inivation.gitlab.io/dv/dv-docs/docs/external-camera-sync/ and https://github.com/uzh-rpg/rpg_dvs_ros.


Acknowledgements

We thank all the members of the Nanoorg lab at the Centre for Neuroscience, Indian Institute of Science (IISc) for helpful discussions and comments. We also thank L. Annamalai, V. Ramanathan (C.S.T.’s lab) and A. Adiga (C.S.S.’s lab) for their help and suggestions during the early stages of the project. We thank J.B. Sibarita at the Interdisciplinary Institute for Neuroscience, Bordeaux for sharing the analysis paradigms for single molecule localization and tracking. We also thank B. Jayaprakash, S. P. Arun and S. Devarajan at the Centre for Neuroscience, IISc for helpful discussions. A major part of this work was generously supported by grants from the IISc under the Institute of Eminence (IISc-IOE) programme to C.S.S., C.S.T. and D.N. It was also supported by the Department of Biotechnology (Innovative Young Biotechnologist Award to D.N. and M.J.); Department of Biotechnology Genomics Engineering Taskforce to D.N.; Department of Biotechnology (DBT) Ramalingaswami Fellowship to D.N. and M.J.; DBT–IISc partnership programme to D.N.; IISC-STAR programme grant to D.N.; Science and Engineering Research Board (Early Career Research Award to M.J.); University Grants Commission, India to D.N.; and Tata Trusts, India for the programme grant (co-investigator, D.N.). N.S. thanks the CSIR Senior Research Associateship/Scientists Pool Scheme. C.S.S. and C.S.T. also thank the Pratiksha Trust for their support to procure the neuromorphic cameras. D.N. sincerely thanks colleagues at the Division of Biological Science, IISc for allowing access to older/discarded microscopes either in full or in part for research and development.

Author information

Contributions

D.N., C.S.S. and C.S.T. conceived the idea. R.M. and D.N. designed the research; R.M. and/or D.N. performed all the experiments unless otherwise indicated; R.M., N.S. and D.N. performed the analysis; N.S. and M.J. prepared the samples; and C.S.T. and C.S.S. shared neuromorphic devices and analytical tools to optimize and extract information from the neuromorphic camera. R.M., M.J. and D.N. wrote the manuscript. All the authors read, provided critical input on and approved the final version of the manuscript.

Corresponding author

Correspondence to Deepak Nair.

Ethics declarations

Competing interests

The authors declare no competing interests.

Peer review

Peer review information

Nature Nanotechnology thanks Boxin Shi, Giovanni Volpe and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Extended data

Extended Data Fig. 1 Scheme for imaging and spatiotemporal analysis of single fluorescent particles.

a) Spatial localization of single fluorescent particles imaged at 10 Hz using an EMCCD camera (EM) using either DeepTrack or a wavelet algorithm. Only frames in which the laser was at Hi intensity were chosen for particle tracking, unless otherwise stated. The frames in which the laser was at Lo intensity were discarded due to low visibility of particles. A scanning box shifts through each frame predicting the centroids of particles (marked with red circles). A 2D frequency distribution fit was used to determine the Gaussian profile of localization. b) Temporal dynamics of a single fluorescent particle. The laser was periodically pulsed between Hi and Lo periods of 100 ms each. The pseudo-coloured intensity map of a single fluorescent particle as imaged by EM shows that alternate frames have high and low intensity fluorescence, corresponding to the laser intensity. The asynchronous data from the NM camera (NM) was accumulated at frame rates of 10 Hz, 20 Hz, 40 Hz, and 100 Hz with positive polarity (ON) in green and negative polarity (OFF) in red. Asynchronous data from NM was accumulated at 200 Hz, and the ON (green) and OFF (red) channels were analyzed separately to study the temporal characteristics of fluorescence. The ON and OFF polarities were summed to study the temporal characteristics of total change in polarity. The accumulated polarity was further integrated in time, which provided the total intensity of the particle. c) Spatial localization of single fluorescent particles imaged using NM. The asynchronous data was accumulated at 10 Hz. Frames with a rising edge of laser intensity show ON events of particles (green). The frames with a falling edge show OFF events of particles (red). The frames with ON (green box) and OFF events (red box) were sorted. A scanning box shifts through each frame predicting the centroids of particles (marked with red and green circles in OFF and ON frames, respectively). 
A 2D distribution fit determined the Gaussian profile of localization for ON and OFF events separately. Figure created with BioRender.com.
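The accumulation of asynchronous events into ON and OFF frames described above can be sketched as follows. This is a minimal illustration, not the authors' pipeline; the `(x, y, t, polarity)` event layout and the function name are assumptions for the sketch:

```python
import numpy as np

def accumulate_events(events, shape, window_ms, duration_ms):
    """Accumulate asynchronous events into ON and OFF frame stacks.

    events: iterable of (x, y, t_ms, polarity), polarity +1 (ON) or -1 (OFF).
    shape: (height, width) of the sensor.
    window_ms: temporal integration window, e.g. 100 ms for 10 Hz frames.
    """
    n_frames = int(np.ceil(duration_ms / window_ms))
    on = np.zeros((n_frames, *shape), dtype=np.int32)
    off = np.zeros((n_frames, *shape), dtype=np.int32)
    for x, y, t, p in events:
        f = min(int(t // window_ms), n_frames - 1)  # frame index for this event
        if p > 0:
            on[f, int(y), int(x)] += 1
        else:
            off[f, int(y), int(x)] += 1
    return on, off

# Four synthetic events on a 4 x 4 sensor, accumulated at 100 ms over 200 ms
events = [(1, 2, 10, +1), (1, 2, 15, +1), (3, 0, 120, -1), (2, 2, 190, +1)]
on, off = accumulate_events(events, (4, 4), 100, 200)
```

Changing `window_ms` reproduces the different accumulation rates (10 Hz to 200 Hz) used in the legend; the separate `on` and `off` stacks correspond to the green and red channels.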

Extended Data Fig. 2 Comparison of the dynamic range of EM and NM.

a) A progressive increase in bit depth was observed as we increased the electron-multiplying gain of the EM, obtained from 400 localized single particles. b) The response of the NM at the same laser intensity (300 mW) that captures the maximum dynamic range for the EM (16 bit). There were no differences in bit depth when the switching time changed between 20 ms, 50 ms or 100 ms for the ON process in the NM, but the bit depth decreased with lower laser power. NM data were acquired in the same illumination conditions in which the EM recorded signals with the highest dynamic range (300 mW output laser power, 28–29 mW at the objective). While recording on the EM was performed at 10 Hz (100 ms), the NM was exposed to increased switching frequencies for the ON and OFF processes, as well as to high (similar to EM conditions) and low laser powers (150 mW output laser power, 15–16 mW at the objective). N indicates the number of localized ON events from isolated single particles. c) No difference was found in the dynamic range between the ON and OFF processes (970 ON/OFF events at 300 mW; 990 (ON) and 980 (OFF) events at 150 mW) of the NM, indicating that both processes were detected with the same detection efficiency. For all the distributions, the inner horizontal line and boundaries of the box indicate the median and interquartile range (IQR), while the diameter (separating navy blue/white circle) in the box and error bars on the box indicate the mean and 10–90% of the data. All the distributions were non-normal by the D’Agostino and Pearson omnibus normality test (for all data, p < 0.0001) and were thus analyzed for median and IQR, as presented towards the right side of their corresponding distributions.

Source data

Extended Data Fig. 3 Comparison of localization precision and signal to noise ratio between EM and NM.

To examine the dependence of centroid estimation on SNR in an EM, we calculated the localization precision and SNR at various EM gains by taking the ratio of the maximum to minimum signal detected in a 15 × 15 pixel² compartment around the pixel displaying maximum intensity. a) Localization precision in the X (red) and Y (green) directions for Hi (circle and solid line) and Lo (pentagon and dotted line) laser powers on the EM, respectively. The variance in error for calculating the centroid of the single fluorescent particle increased with a decrease in EM gain. b) Localization precision in the X (red) and Y (green) directions for ON and OFF processes of NM data at ON (pentagon and solid line) and OFF (circle and solid line) states. The variance of localizing the centroid increased with a decrease in laser power. c) Signal to noise ratio (SNR) of the EM at different EM gains for Hi (circle and solid line) and Lo (pentagon and dotted line) illumination. d) SNR of the NM at different switching powers for ON (solid line) and OFF (dotted line) processes. The recording was done with ON (pentagon) and OFF (circle) laser powers. e) A plot of SNR vs localization precision in the EM in the X (red circle) and Y (green circle) directions. Dotted red and green lines indicate the sigmoidal fit of the data. An increase in SNR decreased the variance of detecting the centroid and increased the accuracy of detection. f) A plot of SNR vs localization precision in the NM along the X axis for ON (red circle) and OFF (green circle) processes. Dotted red and green lines indicate the sigmoidal fit for the same in the EM. All error bars indicate SD. All EM analysis was performed on 4 molecules. For NM data at 300 mW, 10 and 5 single particles were analyzed for the ON process, whereas 9 and 6 particles were detected for the OFF process at 100 ms and 50 ms switching times. At 150 mW, 13, 6 and 8 single particles were analyzed for the ON process, while 13, 3 and 9 particles were detected for the OFF process at 100 ms, 50 ms and 20 ms switching times.
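The SNR definition used here, the ratio of the maximum to minimum signal in a 15 × 15 pixel window around the brightest pixel, can be sketched in a few lines. The function name and array layout are illustrative assumptions, not the authors' code:

```python
import numpy as np

def snr_max_min(frame, half=7):
    """SNR as the ratio of maximum to minimum signal within a
    (2*half + 1)^2 window (15 x 15 for half=7) centred on the brightest pixel."""
    y, x = np.unravel_index(np.argmax(frame), frame.shape)
    y0, y1 = max(0, y - half), min(frame.shape[0], y + half + 1)
    x0, x1 = max(0, x - half), min(frame.shape[1], x + half + 1)
    win = frame[y0:y1, x0:x1]
    return win.max() / max(win.min(), 1e-12)  # guard against a zero minimum

# Synthetic frame: uniform background of 10 with a peak of 100 at (20, 20)
frame = np.full((40, 40), 10.0)
frame[20, 20] = 100.0
snr = snr_max_min(frame)  # 100 / 10 = 10.0
```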

Source data

Extended Data Fig. 4 Gallery of sample images of EM (top) and NM (bottom) as generated for training of the CNN.

a) The images were generated using the algorithm provided by DeepTrack and the training parameters were optimized. The red dots indicate the ground-truth centroid coordinates for training the CNN. Each individual panel is 244 × 244 pixels. Scale bar indicates 61 pixels. b) The root mean square error in predicting the centroid of the simulated single particles was calculated with respect to the ground truth for each range of SNR; the average root mean square error (data point) and SEM (error bars) for the X (green) and Y (red) directions were plotted when sampled at the Nyquist rate. The X axis indicates the lower and upper range of simulated SNR for the simulated particles. Black and blue dotted regions indicate the experimental ranges of SNR obtained for immobilized fluorescent particles using NM and EM under similar experimental conditions. 25 images were simulated for each range of SNR; a trained CNN was used to predict the centroid of the particle and the root mean square error was calculated for each range of SNR.

Source data

Extended Data Fig. 5 Normalized localization precision of isolated and immobilized sub-diffraction-sized single fluorescent particles using neuromorphic imaging.

a) Probability density function (p.d.f.) of centroids (normalized to (0,0)) as obtained by DeepTrack for ON events of 15 fluorescent beads over 100 frames accumulated at 100 ms. A 2D Gaussian surface fit determined X2D and Y2D to be 44.73 ± 1.26 nm and 41.70 ± 1.17 nm, respectively. b) Relative frequency distribution (data points) of normalized X and Y coordinates during ON events obtained by DeepTrack for 15 fluorescent beads over 100 frames. A Gaussian amplitude fit (green dotted line) determined X1D to be 46.07 ± 4.37 nm. c) A Gaussian amplitude fit (green dotted line) determined Y1D to be 42.17 ± 1.61 nm. d) P.d.f. of centroids (normalized to (0,0)) obtained by DeepTrack for OFF events of 15 fluorescent beads over 100 frames accumulated at 100 ms. A 2D Gaussian surface fit estimated X2D and Y2D to be 38.83 ± 1.04 nm and 38.47 ± 1.03 nm, respectively. e) Relative frequency distribution (data points) of normalized X and Y coordinates during OFF events as obtained by DeepTrack for 15 fluorescent beads over 100 frames. A Gaussian amplitude fit (red dotted line) determined X1D to be 41.53 ± 1.63 nm. f) A Gaussian amplitude fit (green dotted line) determined Y1D to be 40.62 ± 1.78 nm. Centroid estimation in 1D and 2D is presented as mean ± standard deviation.
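The localization precision reported in these panels is the width of the centroid scatter of an immobilized particle. As a minimal sketch, assuming the repeated centroid estimates are available as an (N, 2) array, the per-axis Gaussian sigma can be estimated directly from the sample standard deviation; this approximates, but is simpler than, the explicit 2D Gaussian surface fit used in the figure:

```python
import numpy as np

def localization_precision(centroids):
    """Per-axis localization precision (Gaussian sigma) from repeated
    centroid estimates of an immobilized particle.

    centroids: (N, 2) array of (x, y) positions in nm; the scatter is
    normalized so its mean sits at (0, 0) before taking the sample sigma."""
    c = np.asarray(centroids, dtype=float)
    c = c - c.mean(axis=0)        # normalize centroids to (0, 0)
    return c.std(axis=0, ddof=1)  # sample sigma along X and Y

# Synthetic scatter with ~40 nm sigma, similar in scale to the ON-event panels
rng = np.random.default_rng(0)
pts = rng.normal(0.0, 40.0, size=(1000, 2))
sx, sy = localization_precision(pts)
```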

Source data

Extended Data Fig. 6 Normalized localization precision of isolated and immobilized sub-diffraction-sized single fluorescent particles using EM.

a) P.d.f. of centroids (normalized to (0,0)) obtained by DeepTrack for Hi images from EM of 15 fluorescent beads over 100 frames accumulated at 100 ms. A 2D Gaussian surface fit determined X2D and Y2D to be 15.24 ± 0.11 nm and 14.08 ± 0.15 nm, respectively. b) Relative frequency distribution (data points) of normalized x and y coordinates obtained by DeepTrack for 15 fluorescent beads over 100 frames. A Gaussian amplitude fit (black dotted line) determined X1D to be 24.59 ± 0.88 nm. c) A Gaussian amplitude fit (black dotted line) determined Y1D to be 23.77 ± 0.70 nm. Centroid estimation in 1D and 2D is presented as mean ± standard deviation.

Source data

Extended Data Fig. 7 Comparison of standard deviation for the ON and OFF processes as detected by NM.

Representative scatter plots (circles) of eight isolated single particles below the limit of diffraction distributed in the field of view. a) Distribution of the standard deviation along the X (grey) and Y (blue) directions for diffraction-limited fluorescence emission from ON processes of neuromorphic detection. b) Distribution of the standard deviation along the X (grey) and Y (blue) directions for diffraction-limited fluorescence emission from OFF processes of neuromorphic detection. For all the distributions, the diagonal and the unconnected top and bottom edges of the diamond box indicate the median and IQR, while the smaller horizontal line (navy blue) parallel to the diagonal in the box and error bars indicate the mean and 10–90% of the data. All the distributions passed the D’Agostino and Pearson omnibus normality test. All data were thus analyzed for mean ± standard deviation (SD); the median/interquartile range (IQR) is presented below the corresponding distributions with the p value of the normality test conducted.

Source data

Extended Data Fig. 8 Differential evaluation of temporal sampling of diffusion from EM.

a) Distribution of instantaneous diffusion coefficients derived from the trajectories after differential temporal sampling of data from the EM. The notch and boundaries of the box indicate the median and IQR, while the line (navy blue) in the box and error bars display the mean and 10–90% of the data. Data acquired at 10 Hz (100 ms), 20 Hz (50 ms) and 50 Hz (20 ms), colour coded in wine, grey and yellow, respectively, were compared. The distribution of data obtained at 10 Hz (100 ms) was significantly different from that at 20 Hz (50 ms) (p value = 0.0293), as assessed by one-way ANOVA followed by Fisher’s LSD test, in contrast to 50 Hz (20 ms), which remained similar. b) Cumulative frequency distribution of log(D) of the mobile trajectories for different temporal sampling using the EM. The distribution of data obtained at 10 Hz (100 ms) was shifted towards lower values when compared to 20 Hz (50 ms), while 50 Hz (20 ms) remained similar. c) A plot of mean square displacement vs time of the mobile trajectories for differential temporal sampling using the EM. The MSD is presented for the first 10 points, calculated from the mean of the MSDs of all the detected trajectories. Error bars represent the standard error of the mean (SEM). d) A zoomed version of the red inset marked in c, corresponding to sampling at very high frequencies. The numbers of trajectories observed were 39, 31 and 113 for 100 ms, 50 ms and 20 ms, respectively.
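The MSD and diffusion-coefficient analysis above can be sketched as below. This assumes a single 2D trajectory and extracts D from a linear fit of the first few MSD points (MSD = 4Dt in two dimensions); it approximates, but is not identical to, the PalmTracer analysis used in the paper, and the function names are illustrative:

```python
import numpy as np

def msd(traj, max_lag=10):
    """Mean square displacement of one 2D trajectory for lags 1..max_lag.

    traj: (N, 2) array of positions."""
    traj = np.asarray(traj, dtype=float)
    out = np.empty(max_lag)
    for lag in range(1, max_lag + 1):
        d = traj[lag:] - traj[:-lag]          # displacements at this lag
        out[lag - 1] = np.mean(np.sum(d * d, axis=1))
    return out

def diffusion_coefficient(msd_vals, dt, n_points=4):
    """D from a linear fit of the first few MSD points; MSD = 4*D*t in 2D."""
    t = dt * np.arange(1, n_points + 1)
    slope = np.polyfit(t, msd_vals[:n_points], 1)[0]
    return slope / 4.0
```

Applying `msd` with different frame intervals `dt` (100 ms, 50 ms, 20 ms) mimics the differential temporal sampling compared in this figure.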

Source data

Extended Data Fig. 9 Trajectories obtained from ON and OFF events from NM are not different.

a) Distribution of instantaneous diffusion coefficients derived from the trajectories of ON and OFF processes after differential temporal sampling at 10 Hz and 20 Hz (corresponding to 100 ms and 50 ms, respectively). The notch and boundaries of the box indicate the median and IQR, while the line (navy blue) in the box and error bars indicate the mean and 10–90% of the data. No significant differences between the ON and OFF processes at 10 Hz (dark and light magenta) and 20 Hz (dark and light cyan) were found using one-way ANOVA followed by Fisher’s LSD test. b) Cumulative frequency distribution of log(D) of the mobile trajectories of ON and OFF processes at 10 Hz (dark and light magenta) and 20 Hz (dark and light cyan). The numbers of trajectories observed were 86 (50 ms, ON), 53 (100 ms, ON), 103 (50 ms, OFF) and 37 (100 ms, OFF).

Source data

Extended Data Fig. 10 Event-based detection of single fluorescent particle trajectories using neuromorphic imaging.

a, b) Normalized trajectories of ON (green) and OFF (red) events of a confined particle (particle 1) and a diffuse particle (particle 2). A single fluorescent particle was tracked using NM and the corresponding asynchronous data generated was accumulated at 10 ms per frame for 100 frames. The frames were split into RGB channels and the DeepTrack neural network was used to track the position of the particles in the green (ON events, signifying the new position of the particle in that frame) and red (OFF events, signifying the initial position of the particle in that frame) channels separately. c, d) Vector loss between consecutive frames: vectors pointing from the centroids of ON events in one frame to the corresponding centroids of OFF events in the subsequent frame for the confined and diffuse particles. e, f) Mean square displacement (MSD) of the confined and diffuse particles. The MSD was calculated for the ON (green) and OFF (red) events independently using their respective trajectories. The MSDs of the two channels showed high correlation. g) Frequency distribution of step sizes of single fluorescent particle trajectories. The step size of an isolated particle was calculated as the distance between the centroids of OFF and ON events in one frame and those in the subsequent frame. The step sizes of both particles were distributed into bins of 100 nm. The distribution of step sizes for particle 2 (G(i)) was right shifted (showing a higher number of larger steps) compared to particle 1 (G(ii)), which was more confined. h) Cumulative frequency distribution of step sizes of particles 1 (circle) and 2 (hexagon). The diffuse particle had a larger slope than the confined one due to its larger step sizes.
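The step-size calculation described in panel g can be sketched as follows, under the assumption that per-frame ON and OFF centroids are available as arrays; the function names and the exact pairing of OFF (old position) to ON (new position) centroids are illustrative:

```python
import numpy as np

def step_sizes(on_centroids, off_centroids):
    """Step size per frame: distance from the OFF centroid (old position)
    in one frame to the ON centroid (new position) in the next frame.

    Both inputs: (N, 2) arrays of (x, y) positions in nm."""
    on = np.asarray(on_centroids, dtype=float)
    off = np.asarray(off_centroids, dtype=float)
    d = on[1:] - off[:-1]
    return np.hypot(d[:, 0], d[:, 1])

def step_histogram(steps, bin_nm=100.0):
    """Frequency distribution of step sizes in fixed-width (100 nm) bins."""
    edges = np.arange(0.0, steps.max() + bin_nm, bin_nm)
    counts, _ = np.histogram(steps, bins=edges)
    return counts, edges
```

A confined particle concentrates its counts in the first bins, while a diffuse particle shifts weight to larger bins, reproducing the right shift seen for particle 2.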

Source data

Supplementary information

Supplementary Information

Supplementary Tables 1–5 and video legends.

Reporting Summary

Supplementary Video 1

A visual demonstration of the analysis pipeline for the localization of ON and OFF events recorded from diffraction-limited objects at nanoscale precision (1–55 s); the generation of single particle trajectories at millisecond timescales (55–65 s) is shown.

Source data

Source Data Fig. 2

Source data for Fig. 2.

Source Data Fig. 4

Source data for Fig. 4.

Source Data Fig. 5

Source data for Fig. 5.

Source Data Extended Data Fig. 2

Source data for Extended Data Fig. 2.

Source Data Extended Data Fig. 3

Source data for Extended Data Fig. 3.

Source Data Extended Data Fig. 4

Source data for Extended Data Fig. 4.

Source Data Extended Data Fig. 5

Source data for Extended Data Fig. 5.

Source Data Extended Data Fig. 6

Source data for Extended Data Fig. 6.

Source Data Extended Data Fig. 7

Source data for Extended Data Fig. 7.

Source Data Extended Data Fig. 8

Source data for Extended Data Fig. 8.

Source Data Extended Data Fig. 9

Source data for Extended Data Fig. 9.

Source Data Extended Data Fig. 10

Source data for Extended Data Fig. 10.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article


Cite this article

Mangalwedhekar, R., Singh, N., Thakur, C.S. et al. Achieving nanoscale precision using neuromorphic localization microscopy. Nat. Nanotechnol. 18, 380–389 (2023). https://doi.org/10.1038/s41565-022-01291-1

