  • Technical Report

Real-time analysis of large-scale neuronal imaging enables closed-loop investigation of neural dynamics

Abstract

Large-scale imaging of neuronal activities is crucial for understanding brain functions. However, it is challenging to analyze large-scale imaging data in real time, preventing closed-loop investigation of neural circuitry. Here we develop a real-time analysis system with a field programmable gate array–graphics processing unit design for an up to 500-megabyte-per-second image stream. Adapted to whole-brain imaging of awake larval zebrafish, the system promptly extracts activity from up to 100,000 neurons and enables closed-loop perturbations of neural dynamics.

Fig. 1: Stable real-time analysis of optical imaging of large-scale neuronal activities.
Fig. 2: Real-time extraction of brain-wide neuronal activities enables ensemble activity-triggered optogenetics.
Fig. 3: Real-time analysis of brain-wide neuronal activities enables closed-loop sensory stimulation and VR experiments.

Data availability

The data supporting the findings of this study are available from Google Drive (https://drive.google.com/drive/folders/1zeSDpireSMZxfUCuUUNZBJPD0avNSYoX?usp=drive_link).

Code availability

A customized LabVIEW program (v.2018 SP1) for system control, Verilog code (developed in Vivado 16.1) for data stream acquisition, image assembly and dumping on the FPGA, C++ and CUDA code (compiled with g++ 5.4 or above, CUDA driver 440.64.00 and CUDA toolkit 10.2) for image registration on the GPU, and MATLAB code (R2016b on Ubuntu) for data analysis are all available from GitHub (https://github.com/cfshang/Real-time_Large-scale_Feedback).

Acknowledgements

This work was supported by the National Science and Technology Innovation 2030 Major Program of the Ministry of Science and Technology (grant no. 2021ZD0204500 to J.-L.D., grant no. 2021ZD0204502 to J.-L.D., grant no. 2021ZD0203704 to Y.M.), the Creative Research Groups (grant no. 32321003 to J.-L.D.) of the National Natural Science Foundation of China, the National Key Basic Research Program of China (grant no. 2021YFF0502900 to C.-F.S.), the Key Research Program of Frontier Sciences (grant no. QYZDYSSW-SMC028 to J.-L.D.) and the Strategic Priority Research Program (XDB32000000 to J.-L.D.) of the Chinese Academy of Sciences, the Shanghai Municipal Science and Technology Major Project (grant nos. 18JC1410100 and 2018SHZDZX05 to J.-L.D.), the General Program (grant no. 32070982 to C.-F.S., grant no. 32171026 to Y.M.) of the National Natural Science Foundation of China, the General Program (grant no. 20200813170229001 to C.-F.S.) of the Science, Technology and Innovation Commission of Shenzhen Municipality and the Scientific Instrument Developing Project of the Chinese Academy of Sciences (grant no. YJKYYQ20210029 to Y.M.).

Author information

Authors and Affiliations

Authors

Contributions

J.-L.D., J.H., Y.M. and C.-F.S. supervised the research. J.-L.D., Y.M., C.-F.S., Y.-F.W. and Y.Q. wrote the paper. C.-F.S., Y.-F.W. and M.-T.Z. performed all the experiments. C.-F.S., M.-T.Z., Q.-X.F. and J.H. designed the real-time analysis system, and M.-T.Z. and Q.-X.F. set it up. S.Z. and Y.Q. helped with the virtual reality experiments. S.-J.X. participated in the data analysis.

Corresponding authors

Correspondence to Yu Mu, Jie Hao or Jiu-Lin Du.

Ethics declarations

Competing interests

The authors declare no competing interests.

Peer review

Peer review information

Nature Neuroscience thanks German Sumbre and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Extended data

Extended Data Fig. 1 System layout (a) and detailed hardware setup (b) of the real-time analysis system.

The ‘System controller’ is a customized LabVIEW program running on a high-speed analog output module (SPIM1, PXI-6733, National Instruments; SPIM2, PXIe-6738, National Instruments). This program synchronizes the real-time analysis with the light-sheet microscope operation (1) and the sCMOS camera (ORCA Flash 4.0) (2). It sends the layer ID (the axial position of the current image in the scanned volume) (3) to the FPGA board. The ‘Frame grabber’, which is packaged together with the ‘FPGA board’, receives the data stream from the sCMOS (4) and transmits it to the ‘FPGA board’ (5). The data stream consists of two-line data packages, where each package contains data from simultaneously exposed lines, as each frame is exposed from the middle to the two edges. The FPGA board assembles the two-line data packages into image frames and appends the layer ID to each frame (3, 5). The assembled images are sent to the ‘Real-time workstation’ with two GPUs installed for real-time registration and signal extraction (6), and to the ‘Real-time image storage’ (7) via 10 Gigabit ports. The ‘sCMOS controller PC’ communicates with the ‘FPGA board’ and the ‘Frame grabber’ via a USB Gigabit port (8–10) to set the sCMOS camera to the ‘Synchronous readout’ mode. Separate PCs are dedicated to real-time registration, storage, sCMOS control, and frame triggering to ensure precise timing of the system components.
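Because exposure proceeds from the middle of the frame towards the two edges, each data package pairs one line from the upper half of the sensor with one line from the lower half. The following is a minimal C++ sketch of how such two-line packages could be assembled into a frame tagged with its layer ID; the frame geometry, the TwoLinePackage layout and the function names are illustrative assumptions, not the released Verilog implementation.

```cpp
// Minimal sketch of middle-out frame assembly (illustrative only; the actual
// logic runs as Verilog on the FPGA board). The frame geometry and the
// TwoLinePackage layout are assumptions for illustration.
#include <algorithm>
#include <cstdint>
#include <vector>

constexpr int kWidth  = 2048;  // assumed sCMOS frame width in pixels
constexpr int kHeight = 2048;  // assumed sCMOS frame height in pixels

// One package carries two simultaneously exposed lines: one from the upper
// half and one from the lower half of the sensor, because exposure proceeds
// from the middle of the frame towards the two edges.
struct TwoLinePackage {
    uint16_t upper[kWidth];  // row kHeight/2 - 1 - k for the k-th package
    uint16_t lower[kWidth];  // row kHeight/2 + k for the k-th package
};

struct Frame {
    uint16_t layerId = 0;  // axial position of this image in the scanned volume
    std::vector<uint16_t> pixels = std::vector<uint16_t>(kWidth * kHeight);
};

// Place the k-th two-line package (k = 0 .. kHeight/2 - 1) into the frame.
void placePackage(Frame& frame, const TwoLinePackage& pkg, int k) {
    const int upperRow = kHeight / 2 - 1 - k;
    const int lowerRow = kHeight / 2 + k;
    std::copy(pkg.upper, pkg.upper + kWidth,
              frame.pixels.begin() + upperRow * kWidth);
    std::copy(pkg.lower, pkg.lower + kWidth,
              frame.pixels.begin() + lowerRow * kWidth);
}

// Assemble a full frame from the stream of packages and tag it with the
// layer ID received from the system controller.
Frame assembleFrame(const std::vector<TwoLinePackage>& packages,
                    uint16_t layerId) {
    Frame frame;
    frame.layerId = layerId;
    for (int k = 0; k < static_cast<int>(packages.size()); ++k)
        placePackage(frame, packages[k], k);
    return frame;
}
```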

Extended Data Fig. 2 Schematic structure of the FPGA board and its program.

a, The FPGA board consists of modules for image acquisition and pre-processing. The image acquisition daughterboard is connected via the FMC HPC connector. The received image data are cached in the QDRII memory, with a current capacity of 18 MB. The system uses a ping-pong buffering mechanism in which two consecutive frames are processed at a time; the cache capacity must therefore be equal to or larger than the size of two frames, which amounts to 16 MB. The image data are then transmitted out through the two QSFP connectors via 10 Gigabit Ethernet. The GPIO (general-purpose input/output) is used for serial communication. The QSPI flash stores the FPGA configuration memory file. All the modules involved in image acquisition and pre-processing are highlighted in red. b, The FPGA board program includes dedicated modules for serial port communication and data acquisition. The serial port module receives serial data from the host PC and forwards it to the sCMOS through an FMC connector. The data acquisition module parses the image data stream received via the Camera Link protocol. The data cache module buffers the image data stream and assembles images from it. Finally, the data forwarding module splits the images into two and forwards them as separate streams.
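As a rough illustration of the ping-pong mechanism described above, the C++ sketch below alternates two frame-sized buffers so that one can be filled from the camera stream while the other is drained downstream, which is why the cache must hold at least two frames. The 8 MB frame size and the class interface are assumptions for illustration, not the actual QDRII controller logic.

```cpp
// Minimal sketch of ping-pong buffering (illustrative; the actual cache is
// QDRII memory managed by the FPGA program). Two frame-sized buffers
// alternate roles: while one is filled from the camera stream, the other is
// forwarded downstream.
#include <array>
#include <cstddef>
#include <cstdint>
#include <vector>

constexpr std::size_t kFrameBytes = 8u * 1024u * 1024u;  // assumed 8 MB/frame

class PingPongCache {
public:
    // Buffer the writer (data acquisition module) should currently fill.
    std::vector<uint8_t>& writeBuffer() { return buffers_[writeIndex_]; }

    // Buffer the reader (data forwarding module) should currently drain.
    const std::vector<uint8_t>& readBuffer() const {
        return buffers_[1 - writeIndex_];
    }

    // Called once a full frame has been written; swaps the two roles.
    void swap() { writeIndex_ = 1 - writeIndex_; }

private:
    std::array<std::vector<uint8_t>, 2> buffers_{
        std::vector<uint8_t>(kFrameBytes), std::vector<uint8_t>(kFrameBytes)};
    int writeIndex_ = 0;
};
```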

Extended Data Fig. 3 Real-time extraction of large-scale neuronal activities enables ensemble activity-triggered feedback.

a, Experimental design of ensemble activity-triggered closed-loop experiments. Brain-wide neurons are segmented from the reference stack. K-means clustering of their spontaneous activities generates cluster IDs and weights, which are assigned to the neurons. The ensemble activity is calculated as the weighted average of the real-time (RT) extracted neuronal activities (red dashed box and black trace), and it triggers feedback stimuli, including flashes, optogenetic stimuli or virtual reality (VR) (red trace). b, Distribution of neuronal ensembles generated by k-means clustering of neuronal activities. Scale bar, 100 μm. c, The weighted average of neuronal activities in a selected seed cluster (cluster 9) triggers flash stimulation when it surpasses a pre-set threshold. The red dashed line indicates the flash onset, and the green dashed line represents the triggering threshold. Note that a minimum interval of 1 min is enforced between consecutive flash stimuli. d, Heatmap illustrating flash-evoked neuronal activities across the brain. e,f, Flash-evoked trial-averaged responses of the seed cluster (cluster 9, e) and other representative clusters (f). The green dashed line represents the triggering threshold, the red dashed lines indicate the onset and offset times of the flash, and the grey dashed lines indicate the standard error of the mean (s.e.m.).
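The triggering rule described in panel c amounts to a thresholded, weighted average with a refractory period. The C++ sketch below illustrates this logic; the class name, the weight normalization and the time bookkeeping are illustrative assumptions rather than the authors' LabVIEW/GPU implementation. For the experiment in c, the minimum interval would be set to 60 s.

```cpp
// Minimal sketch of ensemble activity-triggered feedback (illustrative; names
// and normalization are assumptions, not the authors' implementation). The
// ensemble activity is the weighted average of the seed-cluster neurons, and
// a stimulus is triggered when it crosses a pre-set threshold, subject to a
// minimum interval (refractory period) between stimuli.
#include <cstddef>
#include <utility>
#include <vector>

class EnsembleTrigger {
public:
    EnsembleTrigger(std::vector<double> weights, double threshold,
                    double minIntervalSec)
        : weights_(std::move(weights)),
          threshold_(threshold),
          minIntervalSec_(minIntervalSec) {}

    // Weighted average of the latest activity samples of the seed-cluster
    // neurons; activities[i] corresponds to weights_[i].
    double ensembleActivity(const std::vector<double>& activities) const {
        double num = 0.0, den = 0.0;
        for (std::size_t i = 0; i < weights_.size(); ++i) {
            num += weights_[i] * activities[i];
            den += weights_[i];
        }
        return den > 0.0 ? num / den : 0.0;
    }

    // Returns true if a feedback stimulus (e.g. a flash) should fire now.
    bool update(const std::vector<double>& activities, double timeSec) {
        const double act = ensembleActivity(activities);
        if (act > threshold_ && timeSec - lastTriggerSec_ >= minIntervalSec_) {
            lastTriggerSec_ = timeSec;
            return true;
        }
        return false;
    }

private:
    std::vector<double> weights_;
    double threshold_;
    double minIntervalSec_;
    double lastTriggerSec_ = -1e12;  // allows the very first crossing to fire
};
```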

Extended Data Fig. 4 Real-time calculation of the synchronous activity of LC-NE neurons.

Activities of the LC-NE neuronal population were extracted from pre-determined ROIs in real time (top). The inter-neuronal variance was calculated in real time (middle), similar to the variance calculated offline (bottom).
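As an illustration, the per-frame inter-neuronal variance can be computed as the variance of the current activity values across the pre-determined ROIs. The C++ sketch below shows this calculation under the assumption that a plain population variance is used; the exact measure in the released code may differ.

```cpp
// Minimal sketch of the per-frame inter-neuronal variance used as a readout
// of LC-NE population synchrony (illustrative; a plain population variance
// across ROIs is an assumption about the exact measure).
#include <cstddef>
#include <vector>

// roiActivities holds the latest activity value of each pre-determined ROI.
double interNeuronalVariance(const std::vector<double>& roiActivities) {
    const std::size_t n = roiActivities.size();
    if (n < 2) return 0.0;
    double mean = 0.0;
    for (double a : roiActivities) mean += a;
    mean /= static_cast<double>(n);
    double var = 0.0;
    for (double a : roiActivities) var += (a - mean) * (a - mean);
    return var / static_cast<double>(n);
}
```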

Extended Data Fig. 5 Real-time extraction of brain-wide neuronal activities and ensemble-triggered virtual reality experiments.

a, Experimental design of ensemble activity-triggered feedback control. A reference frame was generated by averaging 50 frames, and regions of interest (ROIs) corresponding to neurons across the imaged plane were segmented using a watershed algorithm. The larva then experienced locomotor-triggered virtual reality, while activities were extracted in real time (RT) for each ROI. K-means clustering was performed to group the neurons into distinct clusters within 10 min. Different weights were assigned to each neuron based on the similarity of its activity to the cluster mean, and the weighted average across all neurons was calculated to represent the ensemble activity. A seed cluster, whose ensemble activity correlated with swimming, was selected to trigger VR (red dashed box). In the ensemble activity-triggered virtual reality experiment, the ensemble activity (red trace) directly controlled the grating velocity presented to the zebrafish. Neuronal activities from all the clusters were displayed and updated in real time during the experiment (inset). OMR, optomotor response. b, Anatomical locations of the neurons composing each cluster are shown in the plane in focus. The cluster identity is color-coded. Scale bar, 100 μm. c, Activities of all neurons across the imaging plane during VR, ordered by cluster ID. d, Anatomical locations of the neurons composing the motor-related seed cluster (located in the cerebellum, CB, and hindbrain, HB) and of a randomly selected neuronal ensemble with the same number of neurons as the seed cluster, shown in different colors. Scale bar, 100 μm. e,f, Gain adaptation under closed-loop control by the population neuronal activity of the seed cluster (e) and by the randomly selected ensemble (f). The frequency and amplitude of the seed-cluster neuronal activities (upper and middle panels) in the low-gain condition were significantly larger than those in the high-gain condition (e). In contrast, this modulation was absent under closed-loop control by the randomly selected ensemble (f). g,h, There was no significant difference in the modulation of activity frequency (frequency ratio between low and high gain, g) or grating velocity (grating velocity ratio between high and low gain, h) between the neuronal ensemble activity-triggered (“Neuron”) and locomotor-triggered (“Locomotor”) closed-loop controls. Two-sided paired t-test, n = 9 fish. In the box plots, the central lines mark the medians, the box limits mark the upper and lower quartiles, and the whiskers mark the ±1.5× interquartile range. Data from each individual fish are marked with filled circles and connected with a line.
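As a rough sketch of the closed-loop mapping in panel a, the grating velocity presented to the fish can be expressed as a baseline visual flow minus a gain-scaled ensemble activity (computed as in the trigger sketch above). The linear control law, the sign convention and the parameter names below are assumptions for illustration, not the authors' exact VR controller.

```cpp
// Minimal sketch of the closed-loop VR mapping (illustrative; the linear
// control law, the sign convention and the parameter names are assumptions,
// not the authors' exact controller). The seed-cluster ensemble activity,
// computed as in the trigger sketch above, sets the grating velocity.
struct VRController {
    double forwardVelocity = 1.0;  // baseline grating drift (arbitrary units)
    double gain = 1.0;             // alternates between high and low gain

    // Stronger seed-cluster activity reduces the visual flow, mimicking the
    // closed-loop relationship between swimming and the grating.
    double gratingVelocity(double ensembleActivity) const {
        return forwardVelocity - gain * ensembleActivity;
    }
};
```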

Extended Data Fig. 6 Comparison of real-time registration and non-rigid ANTs registration.

a, Example frames showing misalignment caused by motor movements without registration. Scale bar, 100 μm. b, Frames during motor movements compared across three conditions: without registration, with real-time registration and with offline non-rigid ANTs registration. Scale bar, 50 μm. c, Quantification of image similarity with the reference image. The real-time registration and the non-rigid ANTs registration enhanced the image similarity from 0.93 ± 0.01 to 0.96 ± 0.01 and 0.98 ± 0.001, respectively (mean ± s.e.m.). *P = 0.03, two-sided Wilcoxon signed-rank test, n = 6 registered images. In the box plots, the central line, the box limits and the whiskers mark the median, the upper and lower quartiles, and the ±1.5× interquartile range, respectively. Similarity quantified for each individual registration is marked with filled circles, and results on the same reference image are connected with a line. d, Evaluation of movement-induced artifacts without or with real-time registration.
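For panel c, one common way to quantify image similarity with a reference is the Pearson correlation between pixel intensities; the C++ sketch below uses this metric as an assumption, and the exact metric used in the paper may differ.

```cpp
// Minimal sketch of an image-similarity metric for panel c (illustrative; the
// Pearson correlation between pixel intensities is an assumption). img and
// ref are assumed to be the same size.
#include <cmath>
#include <cstddef>
#include <vector>

double imageSimilarity(const std::vector<double>& img,
                       const std::vector<double>& ref) {
    const std::size_t n = img.size();
    if (n == 0) return 0.0;
    double meanImg = 0.0, meanRef = 0.0;
    for (std::size_t i = 0; i < n; ++i) { meanImg += img[i]; meanRef += ref[i]; }
    meanImg /= static_cast<double>(n);
    meanRef /= static_cast<double>(n);
    double cov = 0.0, varImg = 0.0, varRef = 0.0;
    for (std::size_t i = 0; i < n; ++i) {
        const double a = img[i] - meanImg;
        const double b = ref[i] - meanRef;
        cov += a * b;
        varImg += a * a;
        varRef += b * b;
    }
    const double denom = std::sqrt(varImg * varRef);
    return denom > 0.0 ? cov / denom : 0.0;
}
```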

Supplementary information

Supplementary Information

Supplementary Tables 1–5.

Reporting Summary

Supplementary Video 1

Display of real-time extracted whole-brain neuronal activities during ensemble activity-triggered virtual reality. Neurons across the brain were grouped into 12 clusters using k-means clustering based on their activities during locomotor drive-triggered virtual reality (bottom left). Under ‘neuron-control’, the population activity of a swimming-relevant ensemble (Brain Area 11, seed cluster) directly controlled the grating velocity presented to the zebrafish (top left). Neuronal activities from all the clusters were displayed and updated in real time during the experiment (right). In the low-gain condition, both the frequency and amplitude of the seed cluster neuronal activities were higher than those in the high-gain condition. The gain levels alternated between high and low every 30 s. The video is played at normal speed (1×).

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Shang, C.-F., Wang, Y.-F., Zhao, M.-T. et al. Real-time analysis of large-scale neuronal imaging enables closed-loop investigation of neural dynamics. Nat. Neurosci. (2024). https://doi.org/10.1038/s41593-024-01595-6
