BigNeuron: a resource to benchmark and predict performance of algorithms for automated tracing of neurons in light microscopy datasets

Abstract

BigNeuron is an open community bench-testing platform with the goal of setting open standards for accurate and fast automatic neuron tracing. We gathered a diverse set of image volumes across several species that is representative of the data obtained in many neuroscience laboratories interested in neuron tracing. Here, we report the generation of gold standard manual annotations for a subset of the available imaging datasets and quantify tracing quality for 35 automatic tracing algorithms. The goal of generating such a hand-curated diverse dataset is to advance the development of tracing algorithms and enable generalizable benchmarking. Together with image quality features, we pooled the data in an interactive web application that enables users and developers to perform principal component analysis, t-distributed stochastic neighbor embedding, correlation and clustering, visualization of imaging and tracing data, and benchmarking of automatic tracing algorithms in user-defined data subsets. The image quality metrics explain most of the variance in the data, followed by neuromorphological features related to neuron size. We observed that diverse algorithms can provide complementary information to obtain accurate results and developed a method to iteratively combine methods and generate consensus reconstructions. The consensus trees obtained provide estimates of the neuron structure ground truth that typically outperform single algorithms in noisy datasets. However, specific algorithms may outperform the consensus tree strategy in specific imaging conditions. Finally, to aid users in predicting the most accurate automatic tracing results without manual annotations for comparison, we used support vector machine regression to predict reconstruction quality given an image volume and a set of automatic tracings.
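To make the final step concrete, the sketch below shows one way a support vector machine regression could map image and tracing features to a reconstruction-quality score using scikit-learn; the feature matrix, target values and hyperparameters are placeholders, not the pipeline used in the study.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))   # stand-in for image-quality and tracing features
y = rng.normal(size=200)         # stand-in for measured reconstruction quality

# Standardize features, then fit an RBF support vector regressor.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1.0, epsilon=0.1))
scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error")
print("cross-validated MAE:", -scores.mean())
```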


Fig. 1: Overview of the BigNeuron project and how the community can interact with it.
Fig. 2: Variance in the datasets is explained by both image quality and tree morphology features.
Fig. 3: Consensus provides best estimates of neuronal structure.
Fig. 4: Support vector machine regression for predicting best algorithm performance.
Fig. 5: Showcase of best algorithm prediction in fMOST data.


Data availability

3D image volumes of the Gold166 dataset and gold standard reconstructions are available at http://web.bii.a-star.edu.sg/bigneuron/gold166.zip. Bench-testing automated reconstructions can be downloaded from https://github.com/BigNeuron/Data/releases/tag/gold166_bt_v1.0. The fMOST showcase datasets can be found at https://zenodo.org/record/7556104 (ref. 93). The complete set of image volumes gathered throughout the project, amounting to ~4 TB of data, is available upon request. Databases of the Allen Mouse and Human Cell Types projects (http://celltypes.brain-map.org/), Taiwan FlyCircuit (http://www.flycircuit.tw/) and Janelia FlyLight (https://www.janelia.org/project-team/flylight) can be accessed at the given links. Source data are provided with this paper.
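As a quick-start illustration (not part of the original release), the following sketch fetches the Gold166 archive and parses one reconstruction in the standard SWC format; the internal layout of the archive is an assumption.

```python
import io
import urllib.request
import zipfile

import numpy as np

URL = "http://web.bii.a-star.edu.sg/bigneuron/gold166.zip"

# Download the archive into memory and list the SWC reconstructions it contains.
with urllib.request.urlopen(URL) as resp:
    archive = zipfile.ZipFile(io.BytesIO(resp.read()))

swc_names = [n for n in archive.namelist() if n.lower().endswith(".swc")]

# Parse the first reconstruction: standard SWC columns are
# id, type, x, y, z, radius, parent.
with archive.open(swc_names[0]) as f:
    rows = [ln.split() for ln in f.read().decode("utf-8", "ignore").splitlines()
            if ln.strip() and not ln.lstrip().startswith("#")]
nodes = np.array(rows, dtype=float)
print(swc_names[0], nodes.shape)
```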

Code availability

The source code developed is released as open source and is available at https://github.com/lmanubens/BigNeuron. The Shiny web app can be used at https://linusmg.shinyapps.io/BigNeuron_Gold166/ and https://neuroxiv.net/bigneuron/. The Shiny app source code (ref. 94) can be found at https://github.com/lmanubens/BigNeuron/tree/main/shiny_app and is released under a slightly revised MIT license (see the BigNeuron Shiny app license in the Supplementary Note). Source code of the automated reconstruction algorithms developed throughout the project can be found at https://github.com/Vaa3D/vaa3d_tools/tree/master/released_plugins/v3d_plugins. The Vaa3D plugins license is also a slightly revised MIT license and can be found at https://github.com/Vaa3D/vaa3d_tools/blob/master/LICENSE. The source code for the consensus tree algorithm, licensed as a Vaa3D plugin, is publicly available at https://github.com/Vaa3D/vaa3d_tools/tree/master/hackathon/xiaoxiaol/consensus_skeleton_2.
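For intuition only, the sketch below illustrates a toy voting step over node positions; it is not the consensus_skeleton_2 algorithm, which additionally builds a consensus tree topology from the merged reconstructions.

```python
import numpy as np
from scipy.spatial import cKDTree

def consensus_nodes(reconstructions, tol=2.0, min_votes=None):
    """Keep nodes of the first reconstruction that lie within `tol` of a node
    in at least `min_votes` of the reconstructions (the first one included)."""
    if min_votes is None:
        min_votes = len(reconstructions) // 2 + 1      # simple majority
    ref_xyz = reconstructions[0][:, 2:5]               # x, y, z columns of an SWC array
    trees = [cKDTree(r[:, 2:5]) for r in reconstructions]
    votes = sum((t.query(ref_xyz, distance_upper_bound=tol)[0] < tol).astype(int)
                for t in trees)
    return reconstructions[0][votes >= min_votes]

# Toy usage: three jittered "reconstructions" of 50 SWC-like rows each.
rng = np.random.default_rng(0)
recs = [np.column_stack([np.arange(50), np.ones(50),
                         rng.normal(scale=0.5, size=(50, 3)) + np.arange(50)[:, None],
                         np.ones(50), np.arange(50) - 1]) for _ in range(3)]
print(consensus_nodes(recs).shape)
```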

References

  1. Meijering, E. Neuron tracing in perspective. Cytometry A 77, 693–704 (2010).
  2. Parekh, R. & Ascoli, G. A. Neuronal morphology goes digital: a research hub for cellular and system neuroscience. Neuron 77, 1017–1038 (2013).
  3. Capowski, J. J. An automatic neuron reconstruction system. J. Neurosci. Methods 8, 353–364 (1983).
  4. Senft, S. L. A brief history of neuronal reconstruction. Neuroinformatics 9, 119–128 (2011).
  5. Cai, D., Cohen, K. B., Luo, T., Lichtman, J. W. & Sanes, J. R. Improved tools for the Brainbow toolbox. Nat. Methods 10, 540–547 (2013).
  6. Nern, A., Pfeiffer, B. D. & Rubin, G. M. Optimized tools for multicolor stochastic labeling reveal diverse stereotyped cell arrangements in the fly visual system. Proc. Natl Acad. Sci. USA 112, E2967–E2976 (2015).
  7. Daigle, T. L. et al. A suite of transgenic driver and reporter mouse lines with enhanced brain-cell-type targeting and functionality. Cell 174, 465–480 (2018).
  8. Chung, K. & Deisseroth, K. CLARITY for mapping the nervous system. Nat. Methods 10, 508–513 (2013).
  9. Hama, H. et al. Scale: a chemical approach for fluorescence imaging and reconstruction of transparent mouse brain. Nat. Neurosci. 14, 1481–1488 (2011).
  10. Huisken, J., Swoger, J., Del Bene, F., Wittbrodt, J. & Stelzer, E. H. K. Optical sectioning deep inside live embryos by selective plane illumination microscopy. Science 305, 1007–1009 (2004).
  11. Verveer, P. J. et al. High-resolution three-dimensional imaging of large specimens with light sheet-based microscopy. Nat. Methods 4, 311–313 (2007).
  12. Li, A. et al. Micro-optical sectioning tomography to obtain a high-resolution atlas of the mouse brain. Science 330, 1404–1408 (2010).
  13. Chiang, A.-S. et al. Three-dimensional reconstruction of brain-wide wiring networks in Drosophila at single-cell resolution. Curr. Biol. 21, 1–11 (2011).
  14. Meissner, G. W. et al. A searchable image resource of Drosophila GAL4-driver expression patterns with single neuron resolution. eLife 12, e80660 (2023).
  15. Markram, H. The blue brain project. Nat. Rev. Neurosci. 7, 153–160 (2006).
  16. Ecker, J. R. et al. The BRAIN Initiative Cell Census Consortium: lessons learned toward generating a comprehensive brain cell atlas. Neuron 96, 542–557 (2017).
  17. Stockton, D. B. & Santamaria, F. Integrating the Allen Brain Institute Cell Types Database into automated neuroscience workflow. Neuroinformatics 15, 333–342 (2017).
  18. Yamasaki, T., Isokawa, T., Matsui, N., Ikeno, H. & Kanzaki, R. Reconstruction and simulation for three-dimensional morphological structure of insect neurons. Neurocomputing 69, 1043–1047 (2006).
  19. Wang, Y., Narayanaswamy, A., Tsai, C.-L. & Roysam, B. A broadly applicable 3-D neuron tracing method based on open-curve snake. Neuroinformatics 9, 193–217 (2011).
  20. Zhao, T. et al. Automated reconstruction of neuronal morphology based on local geometrical and global structural models. Neuroinformatics 9, 247–261 (2011).
  21. Xiao, H. & Peng, H. APP2: automatic tracing of 3D neuron morphology based on hierarchical pruning of a gray-weighted image distance-tree. Bioinformatics 29, 1448–1454 (2013).
  22. Santamaría-Pang, A., Hernandez-Herrera, P., Papadakis, M., Saggau, P. & Kakadiaris, I. A. Automatic morphological reconstruction of neurons from multiphoton and confocal microscopy images using 3D tubular models. Neuroinformatics 13, 297–320 (2015).
  23. Peng, H. et al. Automatic tracing of ultra-volumes of neuronal images. Nat. Methods 14, 332–333 (2017).
  24. Winnubst, J. et al. Reconstruction of 1,000 projection neurons reveals new cell types and organization of long-range connectivity in the mouse brain. Cell 179, 268–281 (2019).
  25. Peng, H. et al. Morphological diversity of single neurons in molecularly defined cell types. Nature 598, 174–181 (2021).
  26. Zhong, Q. et al. High-definition imaging using line-illumination modulation microscopy. Nat. Methods 18, 309–315 (2021).
  27. Shillcock, J. C., Hawrylycz, M., Hill, S. & Peng, H. Reconstructing the brain: from image stacks to neuron synthesis. Brain Inform. 3, 205–209 (2016).
  28. Peng, H., Meijering, E. & Ascoli, G. A. From DIADEM to BigNeuron. Neuroinformatics 13, 259–260 (2015).
  29. Gillette, T. A., Brown, K. M. & Ascoli, G. A. The DIADEM metric: comparing multiple reconstructions of the same neuron. Neuroinformatics 9, 233–245 (2011).
  30. Peng, H. et al. BigNeuron: large-scale 3D neuron reconstruction from optical microscopy images. Neuron 87, 252–256 (2015).
  31. Stockley, E. W., Cole, H. M., Brown, A. D. & Wheal, H. V. A system for quantitative morphological measurement and electronic modelling of neurons: three-dimensional reconstruction. J. Neurosci. Methods 47, 39–51 (1993).
  32. Polavaram, S., Gillette, T. A., Parekh, R. & Ascoli, G. A. Statistical analysis and data mining of digital reconstructions of dendritic morphologies. Front. Neuroanat. 8, 138 (2014).
  33. Akram, M. A., Nanda, S., Maraver, P., Armañanzas, R. & Ascoli, G. A. An open repository for single-cell reconstructions of the brain forest. Sci. Data 5, 180006 (2018).
  34. Bird, A. D. & Cuntz, H. Dissecting Sholl analysis into its functional components. Cell Rep. 27, 3081–3096 (2019).
  35. Forbes, C., Evans, M., Hastings, N. & Peacock, B. Statistical Distributions (Wiley, 2011).
  36. Schwarz, G. Estimating the dimension of a model. Ann. Stat. 6, 461–464 (1978).
  37. Wang, C.-W., Lee, Y.-C., Pradana, H., Zhou, Z. & Peng, H. Ensemble neuron tracer for 3D neuron reconstruction. Neuroinformatics 15, 185–198 (2017).
  38. Muñoz-Castañeda, R. et al. Cellular anatomy of the mouse primary motor cortex. Nature 598, 159–166 (2021).
  39. Zheng, T. et al. Visualization of brain circuits using two-photon fluorescence micro-optical sectioning tomography. Opt. Express 21, 9839–9850 (2013).
  40. Jiang, S. et al. Petabyte-scale multi-morphometry of single neurons for whole brains. Neuroinformatics 20, 525–536 (2022).
  41. Alivisatos, A. P. et al. The Brain Activity Map Project and the challenge of functional connectomics. Neuron 74, 970–974 (2012).
  42. Kandel, E. R., Markram, H., Matthews, P. M., Yuste, R. & Koch, C. Neuroscience thinks big (and collaboratively). Nat. Rev. Neurosci. 14, 659–664 (2013).
  43. Costa, M., Manton, J. D., Ostrovsky, A. D., Prohaska, S. & Jefferis, G. NBLAST: rapid, sensitive comparison of neuronal structure and construction of neuron family databases. Neuron 91, 293–311 (2016).
  44. Kanari, L. et al. A topological representation of branching neuronal morphologies. Neuroinformatics 16, 3–13 (2018).
  45. Meijering, E., Carpenter, A. E., Peng, H., Hamprecht, F. A. & Olivo-Marin, J.-C. Imagining the future of bioimage analysis. Nat. Biotechnol. 34, 1250–1255 (2016).
  46. Zeng, H. & Sanes, J. R. Neuronal cell-type classification: challenges, opportunities and the path forward. Nat. Rev. Neurosci. 18, 530–546 (2017).
  47. Treweek, J. B. et al. Whole-body tissue stabilization and selective extractions via tissue-hydrogel hybrids for high-resolution intact circuit mapping and phenotyping. Nat. Protoc. 10, 1860–1896 (2015).
  48. Ke, M.-T. et al. Super-resolution mapping of neuronal circuitry with an index-optimized clearing agent. Cell Rep. 14, 2718–2732 (2016).
  49. Gong, H. et al. High-throughput dual-colour precision imaging for brain-wide connectome with cytoarchitectonic landmarks at the cellular level. Nat. Commun. 7, 12142 (2016).
  50. Li, Y., Wang, D., Ascoli, G. A., Mitra, P. & Wang, Y. Metrics for comparing neuronal tree shapes based on persistent homology. PLoS One 12, e0182184 (2017).
  51. Ljungquist, B., Akram, M. A. & Ascoli, G. A. Large scale similarity search across digital reconstructions of neural morphology. Neurosci. Res. 181, 39–45 (2022).
  52. Maier-Hein, L. et al. Why rankings of biomedical image analysis competitions should be interpreted with care. Nat. Commun. 9, 5217 (2018).
  53. Li, R. et al. Precise segmentation of densely interweaving neuron clusters using G-Cut. Nat. Commun. 10, 1549 (2019).
  54. Wang, Y. et al. TeraVR empowers precise reconstruction of complete 3-D neuronal morphology in the whole brain. Nat. Commun. 10, 3474 (2019).
  55. Li, R., Zeng, T., Peng, H. & Ji, S. Deep learning segmentation of optical microscopy images improves 3-D neuron reconstruction. IEEE Trans. Med. Imaging 36, 1533–1541 (2017).
  56. Liu, S., Zhang, D., Song, Y., Peng, H. & Cai, W. Automated 3-D neuron tracing with precise branch erasing and confidence controlled back tracking. IEEE Trans. Med. Imaging 37, 2441–2452 (2018).
  57. Gu, L. et al. Semi-supervised learning in medical images through graph-embedded random forest. Front. Neuroinform. 14, 601829 (2020).
  58. Radojević, M. & Meijering, E. Automated neuron reconstruction from 3D fluorescence microscopy images using sequential Monte Carlo estimation. Neuroinformatics 17, 423–442 (2019).
  59. Pfeiffer, B. D. et al. Tools for neuroanatomy and neurogenetics in Drosophila. Proc. Natl Acad. Sci. USA 105, 9715–9720 (2008).
  60. Nanda, S., Das, R., Bhattacharjee, S., Cox, D. N. & Ascoli, G. A. Morphological determinants of dendritic arborization neurons in Drosophila larva. Brain Struct. Funct. 223, 1107–1120 (2018).
  61. Ikeno, H. et al. Development of a scheme and tools to construct a standard moth brain for neural network simulations. Comput. Intell. Neurosci. 2012, e795291 (2012).
  62. Mumm, J. S. et al. In vivo imaging reveals dendritic targeting of laminated afferents by zebrafish retinal ganglion cells. Neuron 52, 609–621 (2006).
  63. Yoshimatsu, T. et al. Transmission from the dominant input shapes the stereotypic ratio of photoreceptor inputs onto horizontal cells. Nat. Commun. 5, 3699 (2014).
  64. Bleckert, A., Schwartz, G. W., Turner, M. H., Rieke, F. & Wong, R. O. L. Visual space is represented by nonmatching topographies of distinct mouse retinal ganglion cell types. Curr. Biol. 24, 310–315 (2014).
  65. Druckmann, S. et al. Structured synaptic connectivity between hippocampal regions. Neuron 81, 629–640 (2014).
  66. Prönneke, A. et al. Characterizing VIP neurons in the barrel cortex of VIPcre/tdTomato mice reveals layer-specific differences. Cereb. Cortex 25, 4854–4868 (2015).
  67. Peter, M. et al. Transgenic mouse models enabling photolabeling of individual neurons in vivo. PLoS One 8, e62132 (2013).
  68. Gao, Y., Liu, L., Li, Q. & Wang, Y. Differential alterations in the morphology and electrophysiology of layer II pyramidal cells in the primary visual cortex of a mouse model prenatally exposed to LPS. Neurosci. Lett. 591, 138–143 (2015).
  69. Chen, H. et al. Fast assembling of neuron fragments in serial 3D sections. Brain Inform. 4, 183–186 (2017).
  70. Brito, J. et al. Neuronize: a tool for building realistic neuronal cell morphologies. Front. Neuroanat. 7, 15 (2013).
  71. Shi, Y., Kirwan, P., Smith, J., Robinson, H. P. C. & Livesey, F. J. Human cerebral cortex development from pluripotent stem cells to functional excitatory synapses. Nat. Neurosci. 15, 477–486 (2012).
  72. Wang, Y. et al. Intense and specialized dendritic localization of the fragile X mental retardation protein in binaural brainstem neurons: a comparative study in the alligator, chicken, gerbil, and human. J. Comp. Neurol. 522, 2107–2128 (2014).
  73. He, H.-Y., Shen, W., Hiramoto, M. & Cline, H. T. Experience-dependent bimodal plasticity of inhibitory neurons in early development. Neuron 90, 1203–1214 (2016).
  74. Bray, M.-A. & Carpenter, A. E. Quality control for high-throughput imaging experiments using machine learning in CellProfiler. In High Content Screening: A Powerful Approach to Systems Cell Biology and Phenotypic Drug Discovery (eds Johnston, P. A. & Trask, O. J.) 89–112 (Springer, 2018).
  75. Homan, A. C., van Knippenberg, D., Van Kleef, G. A. & De Dreu, C. K. W. Bridging faultlines by valuing diversity: diversity beliefs, information elaboration, and performance in diverse work groups. J. Appl. Psychol. 92, 1189–1199 (2007).
  76. Peng, H., Long, F. & Myers, G. Automatic 3D neuron tracing using all-path pruning. Bioinformatics 27, i239–i247 (2011).
  77. Yang, J. et al. FMST: an automatic neuron tracing method based on fast marching and minimum spanning tree. Neuroinformatics 17, 185–196 (2019).
  78. Gu, L. & Cheng, L. Learning to boost filamentary structure segmentation. In 2015 IEEE International Conference on Computer Vision (ICCV) 639–647 (2015). https://doi.org/10.1109/ICCV.2015.80
  79. Wan, Z., He, Y., Hao, M., Yang, J. & Zhong, N. M-AMST: an automatic 3D neuron tracing method based on mean shift and adapted minimum spanning tree. BMC Bioinformatics 18, 197 (2017).
  80. Wu, J. et al. 3D BrainCV: simultaneous visualization and analysis of cells and capillaries in a whole mouse brain with one-micron voxel resolution. Neuroimage 87, 199–208 (2014).
  81. Lee, P.-C., Chuang, C.-C., Chiang, A.-S. & Ching, Y.-T. High-throughput computer method for 3D neuronal structure reconstruction from the image stack of the Drosophila brain and its applications. PLoS Comput. Biol. 8, e1002658 (2012).
  82. Quan, T. et al. NeuroGPS: automated localization of neurons for brain circuits using L1 minimization model. Sci. Rep. 3, 1414 (2013).
  83. Zhao, T., Olbris, D. J., Yu, Y. & Plaza, S. M. NeuTu: software for collaborative, large-scale, segmentation-based connectome reconstruction. Front. Neural Circuits 12, 101 (2018).
  84. Bas, E. & Erdogmus, D. Principal curves as skeletons of tubular objects: locally characterizing the structures of axons. Neuroinformatics 9, 181–191 (2011).
  85. Sironi, A., Turetken, E., Lepetit, V. & Fua, P. Multiscale centerline detection. IEEE Trans. Pattern Anal. Mach. Intell. 38, 1327–1341 (2016).
  86. Peng, H., Ruan, Z., Atasoy, D. & Sternson, S. Automatic reconstruction of 3D neuron structures using a graph-augmented deformable model. Bioinformatics 26, i38–i46 (2010).
  87. Liu, S. et al. Rivulet: 3D neuron morphology tracing with iterative back-tracking. Neuroinformatics 14, 387–401 (2016).
  88. Minemoto, T. et al. SIGEN: system for reconstructing three-dimensional structure of insect neurons. In Proceedings of the Asia Simulation Conference, JSST2009, CDROM 1–6 (2009).
  89. Yang, J., Gonzalez-Bellido, P. T. & Peng, H. A distance-field based automatic neuron tracing method. BMC Bioinformatics 14, 93 (2013).
  90. Chen, H., Xiao, H., Liu, T. & Peng, H. SmartTracing: self-learning-based neuron reconstruction. Brain Inform. 2, 135–144 (2015).
  91. Zhou, Z., Liu, X., Long, B. & Peng, H. TReMAP: automatic 3D neuron reconstruction based on tracing, reverse mapping and assembling of 2D projections. Neuroinformatics 14, 41–50 (2016).
  92. Prim, R. C. Shortest connection networks and some generalizations. Bell Syst. Tech. J. 36, 1389–1401 (1957).
  93. Manubens-Gil, L. BigNeuron fMOST showcase image data. https://doi.org/10.5281/zenodo.7556104 (2023).
  94. Manubens-Gil, L. lmanubens/BigNeuron: BigNeuron Shiny app v1.0.0 code base. https://doi.org/10.5281/zenodo.7556112 (2023).


Acknowledgements

This project was supported by the Allen Institute for initialization and a series of events, and by a DOE Oak Ridge National Laboratory Leadership Computing award to H.P.; the cross-platform bench-test was also supported intensively at Lawrence Berkeley National Lab, the Allen Institute for Brain Science, the Blue Brain Project at EPFL, Southeast University at Nanjing, and various other facilities. This project also used large-scale display wall and immersive virtual reality facilities at Imperial College London, Oak Ridge National Laboratory, the Janelia Research Campus of HHMI, and Southeast University. This project was also supported by Tencent Inc. and Southeast University for interactive online analysis. Much of the work of this project reported here is a joint community effort with many sponsors over 7 years. The authors acknowledge Prabhat, Z. Wan, J. Yang, H. Zhou, A. Narayanaswamy, S. Zeng, P. Glowacki, D. Jin, Z. Zheng, P. Hong, T. Zeng, R. Li, H. Ikeno, Y.-T. Ching, T. Quan, J.-F. Evers, C. Murtin, S. Gao, Y. Zhu, Y. Yang, H. Ai, S. Ishii, E. Hottendorf, T. Kawase, H. Dong and a number of other colleagues for assistance in developing and porting the automated reconstruction algorithms, and for discussion. The authors thank S. Sorensen for assistance in discussing and tracing some gold standard neuron reconstructions, and R. Kanzaki (University of Tokyo), D. Miyamoto (University of Tokyo), R. Wong (University of Washington), Y. Wang (Allen Institute for Brain Science), E. Lein (Allen Institute for Brain Science), C. Bass (King’s College London), S. Danzer (Cincinnati Children’s Hospital) and many other colleagues for providing neuron image datasets. The authors also acknowledge A. Jones for suggesting the project name and for support throughout the project, and thank R. Yuste and D. Van Essen for discussion; J. Isaac and K. Moses for support from the Wellcome Trust to organize the University of Cambridge hackathon; G. Rubin and N. Spruston for support, data sharing and event organization at the Janelia Research Campus of HHMI; INCF for financial support for the organization of a series of meetings; and Beijing University of Technology, Imperial College London and Southeast University for support of hackathons and workshops. The authors also thank X. Zhao for assistance in hosting the Shiny app on the neuroXiv website; P. Qian, Z. Zhao and X. Chen for assistance in formatting the manuscript; and A. Carpenter for discussion in organizing a tracing hackathon. The authors acknowledge the support of the National Science and Technology Innovation 2030 – ‘Brain Science and Brain-Inspired Research’ Program of China (Grants 2021ZD0204002 and 2022ZD0205200 to H.P., L.M.-G., Y. Liu and Z.R.). B.Y. was supported by NIH grant R01 EB028159. G.A.A. was supported by NIH grants R01 NS39600, RF1 MH128693 and R01 NS86082. A.-S.C. was supported by the Higher Education Sprout Project co-funded by the Ministry of Education and the Ministry of Science and Technology in Taiwan. M.S. was supported by the European programs JPND TransPathND (ANR-17-JPCD-0002), EuroNanoMed III MoDiaNo (ANR-18-ENM3-0002) and CNES. L.G. was supported by JST Moonshot R&D grant JPMJMS2011, Japan. The authors acknowledge the National Center for High-Performance Computing in Taiwan for managing the FlyCircuit data. The funders had no role in the study design, data collection and analysis, decision to publish or preparation of the manuscript.

Author information

Authors and Affiliations

Authors

Contributions

H.P. conceived and conceptualized this project. H.P., G.A.A. and E.M. formed the managing team and communicated with other key investigators, including A.R., J. Yao, J.R., K.I., J.K., G.J., P.G.-B., Y.G., N.Z., G.T., S.H., M.H. and C.K., to secure key resources to execute the project. H.P. envisioned and led the development of the computational and data analysis platform. L.M.-G. developed the data analysis for this study. H.C. assisted in hosting the Shiny app on the neuroXiv website. Z.Z., with the help of H.P., bench-tested neuron reconstruction algorithms and generated the automatic morphology reconstructions. X. Liu contributed to the consensus algorithm. Y. Liu bench-tested neuron reconstruction algorithms on the fMOST datasets. H.C. and H.P. color-separated single neurons from fruit fly brain images generated by A.N. and colleagues in the Gerald Rubin lab and the Janelia FlyLight Project Team. H.C., S.N., A.N. and C.D. participated in tracing hackathons and contributed to the generation of the gold standard test dataset. Y.W., E.R., L.M., B.Y., H.Z., H.T.C., A.-S.C., J.F.S., M.P., R.L., D.N.C., H.-Y.H., M.S., K.I., J.K., G.J., P.G.-B. and M.O. contributed neuron images for testing. A.B., T.G., Z.R., X. Li, Y. Li, K.E.B., L.G., L.C., J. Yang, L.Q., S.L., H.Y., W.C., S.J., B.R., C.-W.W., A.S., P.F., M.R., T.Z., D.M.I., J.Z., T.L., E.B., E.C.-S., P.A. and J.S. contributed reconstruction algorithms. L.M.-G., G.A.A., E.M. and H.P. wrote the manuscript with assistance from co-authors.

Corresponding authors

Correspondence to Erik Meijering, Giorgio A. Ascoli or Hanchuan Peng.

Ethics declarations

Competing interests

Z.Z. is employed by Microsoft. The company did not influence the research. H.C. and J. Yao are employed by Tencent AI Lab. Tencent provides the initial hosting of the Shiny website. The company does not expect to benefit financially. X. Liu is employed by Kaya Medical. The company did not influence the research. M.R. is employed by Nuctech. The company did not influence the research. S.L. is employed by Paige AI. The company did not influence the research. A.S. is employed by PROPHESEE. The company did not influence the research. E.B. is employed by AWS AI. The company did not influence the research. All other authors have no competing interests.

Peer review

Peer review information

Nature Methods thanks Chao Chen and the other, anonymous, reviewers for their contribution to the peer review of this work. Primary Handling Editor: Nina Vogt, in collaboration with the Nature Methods team.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Extended data

Extended Data Fig. 1 A web app to allow interactive navigation of heterogeneous bench-testing results.

Visualization of the Shiny interactive web app (https://linusmg.shinyapps.io/BigNeuron_Gold166/ and https://neuroxiv.net/bigneuron/). The data loaded into the app includes the dataset images, gold standard annotations, automatic reconstructions, and metadata associated with each dataset. Users can interactively choose the image quality and tree morphology metrics used for dimensionality reduction and cluster analysis, and perform reconstruction quality benchmarking. Documentation for the usage of the app can be found at https://github.com/lmanubens/BigNeuron.
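The sketch below reproduces, in outline, the kind of analysis the app exposes (standardization, PCA, t-SNE and clustering of per-volume metrics); the feature table here is randomly generated stand-in data, not the app's actual tables.

```python
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Stand-in for the table of image-quality and tree-morphology metrics per volume.
features = pd.DataFrame(rng.normal(size=(166, 10)),
                        columns=[f"metric_{i}" for i in range(10)])

X = StandardScaler().fit_transform(features)
pcs = PCA(n_components=2).fit_transform(X)                 # linear embedding
emb = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
print(pcs.shape, emb.shape, np.bincount(labels))
```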

Extended Data Fig. 2 Bayesian Information Criterion (BIC) for parametrized Gaussian Mixture models fitted by the expectation-maximization algorithm.

Each colored symbol indicates the BIC for a given mixture model with the number of components specified on the x axis. ‘EII’: spherical, equal volume; ‘VII’: spherical, unequal volume; ‘EEE’: ellipsoidal, equal volume, shape and orientation; ‘EEV’: ellipsoidal, equal volume and equal shape. The dashed light blue line indicates the maximum BIC. The Bayesian information criterion is a measure for comparative evaluation among a finite set of statistical models; it is based on maximizing the likelihood function while penalizing the number of model parameters (ref. 36).

Source data
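A minimal sketch of BIC-based model selection for Gaussian mixtures with scikit-learn follows; the covariance types only roughly correspond to the mclust model families (EII/VII/EEE/EEV) named above, and the data are stand-ins.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))            # stand-in for the feature matrix

bic = {}
for cov in ("spherical", "tied", "diag", "full"):
    for k in range(1, 10):
        gmm = GaussianMixture(n_components=k, covariance_type=cov,
                              random_state=0).fit(X)
        bic[(cov, k)] = gmm.bic(X)

# Note: scikit-learn's BIC is lower-is-better, whereas mclust (as used for the
# figure) reports BIC on a scale where the maximum indicates the best model.
best_cov, best_k = min(bic, key=bic.get)
print(best_cov, best_k)
```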

Extended Data Fig. 3 Overall benchmark of best-performing algorithms.

a Number of images in which specific automatic tracing algorithms outperform the others. For each image, the algorithm with the smallest bi-directional entire-structure average distance to the gold standard was considered the best. The number of times each algorithm was found to be best is shown as a bar plot. The number of times each algorithm produced a result in the full Gold166 dataset is indicated in parentheses in the labels. b Overall benchmark of all algorithms accounting for all measured distances to gold standards with an aggregated metric. Means ± standard errors are shown as bar plots. Each dot represents the distance quantification for one neuron. The number of times each algorithm produced a result in the full Gold166 dataset is indicated in parentheses in the labels.

Source data
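The per-image ranking described in panel a can be expressed compactly as a group-wise minimum over a long table of distances; in the sketch below the table and its column names are stand-ins.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
# Stand-in long table: one row per (image, algorithm) pair with the bi-directional
# average distance of that tracing to the gold standard.
df = pd.DataFrame({
    "image": np.repeat([f"img_{i:03d}" for i in range(166)], 35),
    "algorithm": np.tile([f"algo_{j:02d}" for j in range(35)], 166),
    "avg_distance": rng.gamma(2.0, 2.0, size=166 * 35),
})

# For each image, keep the row with the smallest distance, then count winners.
best = df.loc[df.groupby("image")["avg_distance"].idxmin()]
print(best["algorithm"].value_counts().head())
```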

Extended Data Fig. 4 Supplementary benchmarks of best-performing algorithms.

Benchmarks of the six overall best-performing algorithms from Extended Data Fig. 3a for subsets of Gold166 with different contrast-to-noise ratio (CNR) based on the Otsu threshold. Means ± standard errors are presented as bar plots. Each dot represents the distance quantification for one neuron. The number of times each algorithm produced a result in the full Gold166 dataset is indicated in parentheses in the labels.

Source data
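One plausible way to compute such an Otsu-threshold-based CNR is sketched below; the exact definition used in the paper may differ.

```python
import numpy as np
from skimage.filters import threshold_otsu

def cnr(volume):
    """Contrast-to-noise ratio: Otsu threshold splits foreground (signal)
    from background, and the mean difference is scaled by background noise."""
    thr = threshold_otsu(volume)
    fg, bg = volume[volume > thr], volume[volume <= thr]
    return (fg.mean() - bg.mean()) / (bg.std() + 1e-9)

# Stand-in 3D volume with Poisson-like noise.
vol = np.random.default_rng(0).poisson(5.0, size=(64, 128, 128)).astype(float)
print("CNR:", cnr(vol))
```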

Extended Data Fig. 5 Image quality metrics correlate with the accuracy of automated tracing.

Hierarchical clustering among image quality metrics, tree morphology features and reconstruction quality. Reconstruction quality correlates with a set of features, indicating that more in-focus images of large neurons tend to yield better automatic reconstruction results. a The heatmap shows color-coded pairwise Pearson correlations between metrics obtained for consensus tree reconstructions. b–d Correlation plots for image quality and dendritic tree morphology features (b: focus score at SWC nodes, c: parent–daughter ratio, d: remote bifurcation angle) against consensus reconstruction quality (percentage of different structure). P values indicate the result of two-sided tests for correlation.

Source data
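The correlation matrix and its clustered ordering can be reproduced in outline as follows; the feature table is stand-in data and the column names are placeholders.

```python
import numpy as np
import pandas as pd
from scipy.cluster.hierarchy import leaves_list, linkage
from scipy.spatial.distance import squareform
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
# Stand-in table: image-quality metrics, morphology features and a quality score.
metrics = pd.DataFrame(rng.normal(size=(166, 8)),
                       columns=[f"feature_{i}" for i in range(8)])

corr = metrics.corr(method="pearson")                 # pairwise Pearson r
dist = 1.0 - corr.abs()                               # similarity -> distance
order = leaves_list(linkage(squareform(dist.values, checks=False),
                            method="average"))        # heatmap row/column order
corr_clustered = corr.iloc[order, order]

r, p = pearsonr(metrics["feature_0"], metrics["feature_1"])   # two-sided P value
print(corr_clustered.round(2))
print(f"r={r:.2f}, p={p:.3f}")
```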

Extended Data Fig. 6 Gold166 subset that most closely resembles fMOST data according to image quality features.

a Principal component analysis of gold standard datasets accounting for their image quality metrics. Each point is one gold standard image volume, and the color indicates the dataset it comes from. Arrows represent the direction of each variable in the PCA space; longer arrows correspond to variables that are well represented by the two principal components. Given that 68% of the density of a multivariate normal distribution lies within a Mahalanobis distance of 1 from the mean, 68% confidence normal-data ellipses for each group are drawn with solid lines. b Percentage of different structure between automatic reconstructions and gold standard trees for the zebrafish larval RGC neurons. Means ± standard errors of the percentage of different structure are shown as bar plots.

Source data
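A sketch of the PCA scores and a Mahalanobis-distance-1 ('68% confidence') ellipse for one group is given below, using stand-in data.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 6))                # stand-in image-quality metrics for one dataset
scores = PCA(n_components=2).fit_transform(X)

# Ellipse at Mahalanobis distance 1 from the group mean in PC space:
# points mean + V * sqrt(Lambda) * u for unit vectors u on the circle.
mean = scores.mean(axis=0)
cov = np.cov(scores, rowvar=False)
vals, vecs = np.linalg.eigh(cov)             # eigendecomposition of the covariance
theta = np.linspace(0.0, 2.0 * np.pi, 200)
unit_circle = np.stack([np.cos(theta), np.sin(theta)])
ellipse = mean + (vecs @ (np.sqrt(vals)[:, None] * unit_circle)).T
print(scores.shape, ellipse.shape)
```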

Supplementary information

Supplementary Information

Supplementary Notes, Supplementary Figs. 1, 2

Reporting Summary

Supplementary Tables

Supplementary Table 1. Datasets included in the gold standard Gold166. Descriptive metadata for imaging datasets in Gold166. Supplementary Table 2. Reconstruction algorithms used in BigNeuron. The binary release v3.200 of Vaa3D contains functional binaries of the plugins. The source code is available in the Vaa3D GitHub repository. Supplementary Table 3. Summary of neuromorphological, reconstruction quality and image quality features used. The table includes a brief description and their units when applicable.

Source data

Source Data Fig. 2

Statistical Source data

Source Data Fig. 3

Statistical Source data

Source Data Fig. 4

Statistical Source data

Source Data Fig. 5

Statistical Source data

Source Data Extended Data Fig. 2

Statistical Source data

Source Data Extended Data Fig. 3

Statistical Source data

Source Data Extended Data Fig. 4

Statistical Source data

Source Data Extended Data Fig. 5

Statistical Source data

Source Data Extended Data Fig. 6

Statistical Source data

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Manubens-Gil, L., Zhou, Z., Chen, H. et al. BigNeuron: a resource to benchmark and predict performance of algorithms for automated tracing of neurons in light microscopy datasets. Nat Methods 20, 824–835 (2023). https://doi.org/10.1038/s41592-023-01848-5

