  • Technical Report

Dense neuronal reconstruction through X-ray holographic nano-tomography

Abstract

Imaging neuronal networks provides a foundation for understanding the nervous system, but resolving dense nanometer-scale structures over large volumes remains challenging for light microscopy (LM) and electron microscopy (EM). Here we show that X-ray holographic nano-tomography (XNH) can image millimeter-scale volumes with sub-100-nm resolution, enabling reconstruction of dense wiring in Drosophila melanogaster and mouse nervous tissue. We performed correlative XNH and EM to reconstruct hundreds of cortical pyramidal cells and show that more superficial cells receive stronger synaptic inhibition on their apical dendrites. By combining multiple XNH scans, we imaged an adult Drosophila leg with sufficient resolution to comprehensively catalog mechanosensory neurons and trace individual motor axons from muscles to the central nervous system. To accelerate neuronal reconstructions, we trained a convolutional neural network to automatically segment neurons from XNH volumes. Thus, XNH bridges a key gap between LM and EM, providing a new avenue for neural circuit discovery.


Fig. 1: XNH technique and characterization.
Fig. 2: Correlative XNH–EM analysis of the connectivity statistics of pyramidal apical dendrites in the posterior parietal cortex.
Fig. 3: Millimeter-scale imaging of a Drosophila leg at single-neuron resolution.
Fig. 4: Automated segmentation of neuronal morphologies using CNNs.

Data and materials availability

Raw XNH image data from this study are available in the following publicly accessible repositories:

1. BossDB (https://bossdb.org/)

https://bossdb.org/project/kuan_phelps2020

2. WebKnossos (https://webknossos.org/)

https://wklink.org/8122 (XNH_ESRF_mouseCortex_30nm)

https://wklink.org/7283 (XNH_ESRF_mouseCortex_40nm)

https://wklink.org/9034 (XNH_ESRF_drosophilaBrain_120nm)

https://wklink.org/6724 (XNH_ESRF_drosophilaVNC_50nm)

https://wklink.org/8452 (XNH_ESRF_drosophilaLeg_75nm)

3. ESRF (https://data.esrf.fr/public/10.15151/ESRF-DC-217728238) (anonymous login)

DOI: doi.esrf.fr/10.15151/ESRF-DC-217728238

Source data for Fig. 2e,f,h,i are provided with the paper.

See https://lee.hms.harvard.edu/resources for access to skeleton reconstructions via CATMAID.

Other datasets, as well as the fly reporter line for nuclear-targeted APEX2 expression, are available from the corresponding authors upon reasonable request. Please contact joitapac@esrf.eu or wei-chung_lee@hms.harvard.edu.

Software and code availability

Code is available as described below or from the corresponding authors upon reasonable request. Please contact joitapac@esrf.eu or wei-chung_lee@hms.harvard.edu.

Data collection

X-ray holographic nano-tomography data were acquired using custom code based on TANGO (https://www.tango-controls.org/about-us/#mission) and SPEC (https://www.certif.com/content/spec/) software packages.
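The acquisition-control code itself is not part of the public release. Purely for orientation, the sketch below shows how a TANGO-based acquisition loop can look from Python using PyTango; the device names, attribute names and the "Snap" command are invented placeholders and do not correspond to the actual ID16A device servers.

```python
# Hypothetical sketch of TANGO-based acquisition control (PyTango).
# Device names, attributes and commands below are placeholders,
# not the real ESRF ID16A device servers.
import time
import numpy as np
import tango  # PyTango client library

rotation = tango.DeviceProxy("id16a/rotation/theta")   # hypothetical motor device
detector = tango.DeviceProxy("id16a/frelon/detector")  # hypothetical camera device

n_projections = 2000
angles = np.linspace(0.0, 180.0, n_projections, endpoint=False)

for angle in angles:
    rotation.write_attribute("position", float(angle))  # move to the next projection angle
    while rotation.state() == tango.DevState.MOVING:     # wait for the motor to settle
        time.sleep(0.05)
    detector.command_inout("Snap")                        # trigger one hologram acquisition
```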

Data pre-processing

X-ray holographic nano-tomography data were reconstructed using custom code written in Octave (https://www.gnu.org/software/octave/) and the PyHST2 software package (https://software.pan-data.eu/software/74/pyhst2). Stitching of XNH image volumes was performed using ImageJ (https://imagej.net, v. 1.52p) with the BigWarp and BigStitcher plugins. Alignment of serial EM images was performed with AlignTK (https://mmbios.pitt.edu/software#aligntk, v. 1.0.2). XNH and EM data were aligned to each other using BigWarp (https://imagej.net/BigWarp).
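The multi-distance phase retrieval and tomographic reconstruction are performed by the custom Octave code and PyHST2 named above. As a simplified, self-contained illustration of propagation-based phase retrieval, the sketch below applies a single-distance Paganin-style filter (ref. 61) in NumPy; the parameter values are placeholders, not the acquisition settings of this study.

```python
# Minimal single-distance Paganin-style phase retrieval sketch (NumPy only).
# The real pipeline combines four propagation distances and uses PyHST2;
# energy, distance and delta/beta below are illustrative placeholders.
import numpy as np

def paganin_phase_retrieval(flat_corrected, pixel_size, distance, wavelength, delta_beta):
    """Recover a projected-thickness-like map from one flat-field-corrected hologram."""
    ny, nx = flat_corrected.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)  # spatial frequencies (cycles / m)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    f2 = fy[:, None] ** 2 + fx[None, :] ** 2

    # Paganin (2002) low-pass filter: 1 / (1 + pi * lambda * z * (delta/beta) * |f|^2)
    filt = 1.0 / (1.0 + np.pi * wavelength * distance * delta_beta * f2)
    filtered = np.real(np.fft.ifft2(np.fft.fft2(flat_corrected) * filt))
    return -np.log(np.clip(filtered, 1e-6, None))  # proportional to projected thickness

# Illustrative usage with a synthetic hologram:
hologram = np.random.rand(256, 256) * 0.1 + 0.9   # stand-in for I / I_flat
phase_map = paganin_phase_retrieval(hologram, pixel_size=30e-9,
                                    distance=0.01, wavelength=7.3e-11,
                                    delta_beta=100.0)
```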

Data analysis

FSC analysis was performed using custom code (https://github.com/jcesardasilva/toupy/tree/master/toupy/resolution). Manual data segmentation of XNH and EM images was performed using ITK-SNAP (www.itksnap.org, v. 3.6.0). Manual data annotation (tracing) was performed using CATMAID (https://catmaid.readthedocs.io, v. 2018.11.09-682-g811c25a) and queried using the pyMaid API (https://pymaid.readthedocs.io, https://github.com/schlegelp/pyMaid, v. 1.1.2). Neuron segmentation was performed using a custom CNN pipeline (see Methods). Ground truth training data were prepared using Armitage (Google internal development software) and CATMAID. Neurons were reconstructed from segmentation data using Neuroglancer (https://github.com/google/neuroglancer, v. 1.1.5). Source data are provided with this paper.
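As an example of how the CATMAID annotations can be queried programmatically with the pyMaid API, the short sketch below fetches skeletons by annotation and prints simple per-neuron statistics; the server URL, API token and annotation name are hypothetical placeholders rather than the published resources.

```python
# Hypothetical pyMaid query sketch; the server URL, API token and annotation
# name are placeholders, not part of the published resources.
import pymaid

rm = pymaid.CatmaidInstance(server="https://catmaid.example.org",
                            api_token="YOUR_TOKEN")

# Fetch all skeletons carrying a (hypothetical) annotation and summarize them.
skids = pymaid.get_skids_by_annotation("leg motor neuron")
neurons = pymaid.get_neuron(skids)

for n in neurons:
    # cable_length is returned in project units (nanometers); convert to micrometers.
    print(f"{n.neuron_name}: {n.n_nodes} nodes, {n.cable_length / 1e3:.1f} um of cable")
```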

References

  1. Denk, W., Briggman, K. L. & Helmstaedter, M. Structural neurobiology: missing link to a mechanistic understanding of neural computation. Nat. Rev. Neurosci. 13, 351–358 (2012).

  2. Xu, C. S. et al. Enhanced FIB-SEM systems for large-volume 3D imaging. eLife 6, e25916 (2017).

  3. Lee, W.-C. A. et al. Anatomy and function of an excitatory network in the visual cortex. Nature 532, 1–18 (2016).

  4. Zheng, Z. et al. A complete electron microscopy volume of the brain of adult Drosophila melanogaster. Cell 174, 730–743 (2018).

  5. Hell, S. W. Far-field optical nanoscopy. Science 316, 1153–1158 (2007).

  6. Huang, B., Bates, M. & Zhuang, X. Super-resolution fluorescence microscopy. Annu. Rev. Biochem. 78, 993–1016 (2009).

  7. Chen, F., Tillberg, P. W. & Boyden, E. S. Expansion microscopy. Science 347, 543–548 (2015).

  8. Gao, R. et al. Cortical column and whole-brain imaging with molecular contrast and nanoscale resolution. Science 363, eaau8302 (2019).

  9. Mizutani, R., Saiga, R., Takeuchi, A., Uesugi, K. & Suzuki, Y. Three-dimensional network of Drosophila brain hemisphere. J. Struct. Biol. 184, 271–279 (2013).

  10. Schulz, G. et al. High-resolution tomographic imaging of a human cerebellum: comparison of absorption and grating-based phase contrast. J. R. Soc. Interface 7, 1665–1676 (2010).

  11. Weitkamp, T. et al. X-ray phase imaging with a grating interferometer. Opt. Express 13, 6296 (2005).

  12. Pfeiffer, F. et al. High-resolution brain tumor visualization using three-dimensional X-ray phase contrast tomography. Phys. Med. Biol. 52, 6923–6930 (2007).

  13. Shahmoradian, S. H. et al. Three-dimensional imaging of biological tissue by cryo X-ray ptychography. Sci. Rep. 7, 6291 (2017).

  14. Dierolf, M. et al. Ptychographic X-ray computed tomography at the nanoscale. Nature 467, 436–439 (2010).

  15. Dyer, E. L. et al. Quantifying mesoscale neuroanatomy using X-ray microtomography. eNeuro 4, ENEURO.0195-17.2017 (2017).

  16. Fonseca, M. et al. High-resolution synchrotron-based X-ray microtomography as a tool to unveil the three-dimensional neuronal architecture of the brain. Sci. Rep. 8, 12074 (2018).

  17. Khimchenko, A. et al. Hard X-ray nanoholotomography: large-scale, label-free, 3D neuroimaging beyond optical limit. Adv. Sci. 5, 1700694 (2018).

  18. Massimi, L. et al. Exploring Alzheimer’s disease mouse brain through X-ray phase contrast tomography: from the cell to the organ. Neuroimage 184, 490–495 (2019).

  19. Töpperwien, M., van der Meer, F., Stadelmann, C. & Salditt, T. Three-dimensional virtual histology of human cerebellum by X-ray phase-contrast tomography. Proc. Natl Acad. Sci. USA 115, 6940–6945 (2018).

  20. Cedola, A. et al. X-ray phase contrast tomography reveals early vascular alterations and neuronal loss in a multiple sclerosis model. Sci. Rep. 7, 5890 (2017).

  21. Cloetens, P. et al. Holotomography: quantitative phase tomography with micrometer resolution using hard synchrotron radiation x rays. Appl. Phys. Lett. 75, 2912–2914 (1999).

  22. Ng, J. et al. Genetically targeted 3D visualisation of Drosophila neurons under electron microscopy and X-ray microscopy using miniSOG. Sci. Rep. 6, 38863 (2016).

  23. da Silva, J. C. et al. Efficient concentration of high-energy x-rays for diffraction-limited imaging resolution. Optica 4, 492 (2017).

  24. Hubert, M. et al. Efficient correction of wavefront inhomogeneities in X-ray holographic nanotomography by random sample displacement. Appl. Phys. Lett. 112, 203704 (2018).

  25. Harauz, G. & van Heel, M. Exact filters for general geometry three dimensional reconstruction. Optik 73, 146–156 (1986).

  26. Spruston, N. Pyramidal neurons: dendritic structure and synaptic integration. Nat. Rev. Neurosci. 9, 206–221 (2008).

  27. Karimi, A., Odenthal, J., Drawitsch, F., Boergens, K. M. & Helmstaedter, M. Cell-type specific innervation of cortical pyramidal cells at their apical dendrites. eLife 9, e46876 (2020).

  28. Maniates-Selvin, J. T. et al. Reconstruction of motor control circuits in adult Drosophila using automated transmission electron microscopy. Preprint at https://www.biorxiv.org/content/10.1101/2020.01.10.902478v1 (2020).

  29. Harvey, C. D., Coen, P. & Tank, D. W. Choice-specific sequences in parietal cortex during a virtual-navigation decision task. Nature 484, 62–68 (2012).

  30. Peters, A., Palay, S. L. & Webster, H. deF. The Fine Structure of the Nervous System: Neurons and Their Supporting Cells (Oxford University Press, 1991).

  31. Mamiya, A., Gurung, P. & Tuthill, J. C. Neural coding of leg proprioception in Drosophila. Neuron 100, 636–650 (2018).

  32. Tuthill, J. C. & Wilson, R. I. Mechanosensation and adaptive motor control in insects. Curr. Biol. 26, R1022–R1038 (2016).

  33. Tuthill, J. C. & Wilson, R. I. Parallel transformation of tactile signals in central circuits of Drosophila. Cell 164, 1046–1059 (2016).

  34. Merritt, D. J. & Murphey, R. K. Projections of leg proprioceptors within the CNS of the fly Phormia in relation to the generalized insect ganglion. J. Comp. Neurol. 322, 16–34 (1992).

  35. Desai, B. S., Chadha, A. & Cook, B. The stum gene is essential for mechanical sensing in proprioceptive neurons. Science 343, 1256–1259 (2014).

  36. Perge, J. A., Niven, J. E., Mugnaini, E., Balasubramanian, V. & Sterling, P. Why do axons differ in caliber? J. Neurosci. 32, 626–638 (2012).

  37. Faisal, A. A. & Laughlin, S. B. Stochastic simulations on the reliability of action potential propagation in thin axons. PLoS Comput. Biol. 3, 0783–0795 (2007).

  38. Soler, C., Daczewska, M., Da Ponte, J. P., Dastugue, B. & Jagla, K. Coordinated development of muscles and tendons of the Drosophila leg. Development 131, 6041–6051 (2004).

  39. Azevedo, A. W. et al. A size principle for recruitment of Drosophila leg motor neurons. eLife 9, e56754 (2020).

  40. Baek, M. & Mann, R. S. Lineage and birth date specify motor neuron targeting and dendritic architecture in adult Drosophila. J. Neurosci. 29, 6904–6916 (2009).

  41. Funke, J. et al. Large scale image segmentation with structured loss based deep learning for connectome reconstruction. IEEE Trans. Pattern Anal. Mach. Intell. 41, 1669–1680 (2019).

  42. Brierley, D. J., Rathore, K., VijayRaghavan, K. & Williams, D. W. Developmental origins and architecture of Drosophila leg motoneurons. J. Comp. Neurol. 520, 1629–1649 (2012).

  43. Tsubouchi, A. et al. Topological and modality-specific representation of somatosensory information in the fly brain. Science 358, 615–623 (2017).

  44. Januszewski, M. & Jain, V. Segmentation-enhanced CycleGAN. Preprint at bioRxiv https://www.biorxiv.org/content/10.1101/548081v1 (2019).

  45. Schneider-Mizell, C. M. et al. Quantitative neuroanatomy for connectomics in Drosophila. eLife 5, 1133–1145 (2016).

  46. Chiang, A.-S. et al. Three-dimensional reconstruction of brain-wide wiring networks in Drosophila at single-cell resolution. Curr. Biol. 21, 1–11 (2011).

  47. Hildebrand, D. G. C. et al. Whole-brain serial-section electron microscopy in larval zebrafish. Nature 545, 345–349 (2017).

  48. Momose, A., Takeda, T., Itai, Y. & Hirano, K. Phase-contrast X-ray computed tomography for observing biological soft tissues. Nat. Med. 2, 473–475 (1996).

  49. Denk, W. & Horstmann, H. Serial block-face scanning electron microscopy to reconstruct three-dimensional tissue nanostructure. PLoS Biol. 2, e329 (2004).

  50. Diaz, A. et al. Three-dimensional mass density mapping of cellular ultrastructure by ptychographic X-ray nanotomography. J. Struct. Biol. 192, 461–469 (2015).

  51. Hua, Y., Laserstein, P. & Helmstaedter, M. Large-volume en-bloc staining for electron microscopy-based connectomics. Nat. Commun. 6, 7923 (2015).

  52. Zhang, Q., Lee, W.-C. A., Paul, D. L. & Ginty, D. D. Multiplexed peroxidase-based electron microscopy labeling enables simultaneous visualization of multiple cell types. Nat. Neurosci. 22, 828–839 (2019).

  53. Du, M. & Jacobsen, C. Relative merits and limiting factors for x-ray and electron microscopy of thick, hydrated organic materials. Ultramicroscopy 184, 293–309 (2018).

  54. Lam, S. S. et al. Directed evolution of APEX2 for electron microscopy and proximity labeling. Nat. Methods 12, 51–54 (2015).

  55. Berg, S. et al. ilastik: interactive machine learning for (bio)image analysis. Nat. Methods 16, 1226–1232 (2019).

  56. Villar, F. et al. Nanopositioning for the ESRF ID16A nano-imaging beamline. Synchrotron Radiat. N. 31, 9–14 (2018).

  57. Labiche, J.-C. et al. Invited article: the fast readout low noise camera as a versatile x-ray detector for time resolved dispersive extended x-ray absorption fine structure and diffraction studies of dynamic problems in materials science, chemistry, and catalysis. Rev. Sci. Instrum. 78, 091301 (2007).

  58. Mokso, R., Cloetens, P., Maire, E., Ludwig, W. & Buffière, J.-Y. Nanoscale zoom tomography with hard x rays using Kirkpatrick–Baez optics. Appl. Phys. Lett. 90, 144104 (2007).

  59. Yu, B. et al. Evaluation of phase retrieval approaches in magnified X-ray phase nano computerized tomography applied to bone tissue. Opt. Express 26, 11110 (2018).

  60. Cloetens, P., Barrett, R., Baruchel, J. E., Guigay, J.-P. & Schlenker, M. Phase objects in synchrotron radiation hard x-ray imaging. J. Phys. D: Appl. Phys. 29, 133–146 (1996).

  61. Paganin, D., Mayo, S. C., Gureyev, T. E., Miller, P. R. & Wilkins, S. W. Simultaneous phase and amplitude extraction from a single defocused image of a homogeneous object. J. Microsc. 206, 33–40 (2002).

  62. Mirone, A., Brun, E., Gouillart, E., Tafforeau, P. & Kieffer, J. The PyHST2 hybrid distributed code for high speed tomographic reconstruction with iterative reconstruction and a priori knowledge capabilities. Nucl. Instrum. Methods Phys. Res. Sect. B Beam Interact. Mater. At. 324, 41–48 (2014).

  63. van Heel, M. & Schatz, M. Fourier shell correlation threshold criteria. J. Struct. Biol. 151, 250–262 (2005).

  64. Saalfeld, S., Cardona, A., Hartenstein, V. & Tomancak, P. CATMAID: collaborative annotation toolkit for massive amounts of image data. Bioinformatics 25, 1984–1986 (2009).

  65. Yushkevich, P. A. et al. User-guided 3D active contour segmentation of anatomical structures: significantly improved efficiency and reliability. Neuroimage 31, 1116–1128 (2006).

  66. Graham, B. J. et al. High-throughput transmission electron microscopy with automated serial sectioning. Preprint at bioRxiv https://www.biorxiv.org/content/10.1101/657346v1.full (2019).

  67. Bogovic, J. A., Hanslovsky, P., Wong, A. & Saalfeld, S. Robust registration of calcium images by learned contrast synthesis. In Proc. 13th International Symposium on Biomedical Imaging 1123–1126 (Institute of Electrical and Electronics Engineers, 2016).

  68. Hörl, D. et al. BigStitcher: reconstructing high-resolution image datasets of cleared and expanded samples. Nat. Methods 16, 870–874 (2019).

  69. Peters, A. & Kara, D. A. The neuronal composition of area 17 of rat visual cortex. I. The pyramidal cells. J. Comp. Neurol. 234, 218–241 (1985).

  70. Peters, A. & Kara, D. A. The neuronal composition of area 17 of rat visual cortex. II. The nonpyramidal cells. J. Comp. Neurol. 234, 242–263 (1985).

Acknowledgements

The authors thank T. Pedersen for pre-processing and alignment of image data; R. Xu, T. Pedersen and L. DeCoursey for neuron tracing; J. Shin, W.-W. Lou, M. Liu, Y. Hu and R. Xu for manual annotation of ground truth segmentation for CNN training; J. da Silva for providing code and assistance for FSC measurements; N. Perrimon, M. Pecot and H. Lacin for providing fly lines; R. Fetter and A. Thompson for sample preparation advice; P. Li and V. Jain for access to and support with Armitage software; Y. Hu and M. Osman for contributing code for segmentation accuracy measurements; R. Wilson and H. Somhegyi for discussion and advice; C. Harvey, R. Born, J. Moffitt and R. Wilson for comments on the manuscript; and BossDB (W. Gray-Roncal, D. Xenes and B. Webster) and WebKnossos (N. Rzepka) for hosting image data on public repositories. The authors acknowledge ESRF for granting beamtime for the experiments: LS2705, LS2845, IHLS2928, IHLS3121, IHHC3498, IHMA7 and IHLS3004. This work was supported by the National Institutes of Health (R01NS108410) and awards from the Edward R. and Anne G. Lefler Center and the Goldenson Family to W.C.A.L. A.P. has received funding from the European Research Council under the European Union’s Horizon 2020 Research and Innovation Programme (grant no. 852455).

Author information

Authors and Affiliations

Authors

Contributions

A.T.K., J.S.P., J.C.T., A.P. and W.C.A.L. conceptualized the project and designed experiments. J.S.P. and A.T.K. prepared samples. C.-L.C. built transgenic fly lines. P.C. and A.P. designed and built instrumentation and image reconstruction methods. A.P. optimized X-ray imaging and contributed beamtime. A.P., J.S.P. and A.T.K. performed X-ray imaging. A.P., A.T.K., J.S.P. and P.C. performed phase retrieval, tomographic reconstruction and data processing. J.S.P. and A.T.K. performed electron microscopy. A.T.K., J.S.P. and J.H. performed and managed tracing and annotation. J.F., T.M.N., L.A.T., A.T.K. and J.H. adapted and performed automated segmentation and error analyses. A.W.A. and J.C.T. provided light microscopy data of fly neurons. A.T.K., J.S.P., A.P., L.A.T., A.W.A. and J.C.T. analyzed the data. A.T.K., J.S.P., A.P. and W.C.A.L. wrote the paper. All authors assisted in reviewing and revising the manuscript.

Corresponding authors

Correspondence to Alexandra Pacureanu or Wei-Chung Allen Lee.

Ethics declarations

Competing interests

W.C.A.L. declares the following competing interest: Harvard University filed a patent application regarding GridTape (WO2017184621A1) on behalf of the inventors, including W.C.A.L., and negotiated licensing agreements with interested partners. GridTape EM imaging was used for results in Fig. 2.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Extended data

Extended Data Fig. 1 X-ray Holographic Nano-Tomography (XNH) Technique and Characterization.

a, Overview of XNH imaging and preprocessing. Left: Holographic projections of the sample (a result of free-space propagation of the coherent X-ray beam) are recorded for each angle as the sample is rotated over 180°, then normalized with the incoming beam. Center left: Phase projections are calculated by computationally combining four normalized holograms recorded with the sample placed at different distances from the beam focus. Center right: Virtual slices through the 3D image volumes are calculated using tomographic reconstruction. Right: The resulting XNH image volume can be rendered in 3D and analyzed to reveal neuronal morphologies. b, Quantification of the resolution of XNH scans measured using Fourier shell correlation (FSC), normalized by each scan’s pixel size. At larger pixel sizes, the resolution per pixel improves, though the resolution itself is worse (see Fig. 1h). Data points and error bars show mean ± IQR of subvolumes sampled from each XNH scan. The number of subvolumes used for each scan is shown in Supplementary Data Table 1. c, Representative FSC curve shown with the half-bit threshold. The intersection between the FSC curve and the threshold is the measured resolution. d, Quantification of resolution within the 30 nm mouse cortex scan. Each dot represents an FSC measurement of a 100 voxel³ cube. Blue line and shaded band represent binned averages and standard deviation, respectively. The x axis is the radial position of the center of the cube (distance from the axis of rotation). The red dotted line indicates the boundary of the scan; data points to the right of the line are from extended field of view regions (Methods). e, f, Edge-fitting measurements of spatial resolution. Although FSC is commonly used to quantify resolution in many imaging modalities including X-ray imaging, its implementation is somewhat controversial63. To ensure that FSC measurements were accurate, we also used an independent measure of resolution based on fitting sharp edges in the images (see Methods), which produced values consistent with those measured via FSC. Left: Example features used for edge-fitting resolution measurement. For both (e) mouse cortex and (f) fly central nervous system, mitochondria were primarily selected because they have dark contrast and sharp boundaries. Center: Example line scan (image intensity values along the orange lines in the feature images). The measured resolution is parameterized from a best fit to a sigmoid function (Methods). Right: Distribution of edge-fitting resolution measurements for many features distributed throughout the image volumes. n = 30 features measured as shown; boxes show median and IQR and whiskers show range excluding outliers beyond 1.5 IQR from the median. The median resolution measured via FSC is shown for comparison. g, Comparison of edge-fitting resolution measurements for two XNH scans and high-resolution transmission EM images. EM data were acquired from a ~40 nm thick section of Drosophila VNC tissue, imaged with 4 nm pixels. Resolution is plotted in units of pixels. n = 30 features for each dataset; boxes show median and IQR and whiskers show range excluding outliers beyond 1.5 IQR from the median. h, Comparison of XNH images acquired from the same FOV in the same sample (fly leg) at different voxel sizes. Within this range, the resolution improves monotonically, but not linearly, with voxel size. i, Comparison of XNH and EM segmentations. The XNH and EM images shown in Fig. 1i were independently segmented. Colored patches in the left two images represent different neurons in the segmentation. The EM segmentation was taken as ground truth, and the XNH segmentation for each neuron was evaluated against it. The right-most image shows correctly and incorrectly segmented neurons. j, Quantification of XNH segmentation accuracy. The proportion of correctly segmented neurons is plotted as a function of neuron size. Neurons larger than 200 nm in diameter were segmented correctly more than 50% of the time. Note that this analysis used only 2D image data; additional 3D information would likely improve performance. In addition to the size of the neurons, membrane contrast is also an important factor in accurately segmenting neurons in XNH. In a few cases, membranes between two axons were not clearly visible in XNH, causing them to be erroneously merged (i). Motor neurons in the leg nerve were also challenging to segment because they contain complex glial wrappings that are not always clearly resolved in XNH (i, right side of images).
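For readers who want to reproduce a comparable resolution estimate, the sketch below computes a Fourier shell correlation between two independently reconstructed sub-volumes together with the half-bit threshold of van Heel & Schatz63. It is a simplified NumPy-only illustration, not the toupy-based code used for the published values, and the number of shells is an arbitrary choice.

```python
# Simplified Fourier shell correlation (FSC) sketch between two sub-volumes.
# The published analysis used the toupy package; this NumPy-only version
# illustrates the computation and the half-bit threshold (van Heel & Schatz, 2005).
import numpy as np

def fsc(vol1, vol2, n_bins=64):
    f1, f2 = np.fft.fftn(vol1), np.fft.fftn(vol2)
    # Radial spatial-frequency coordinate (cycles/voxel) for each Fourier voxel.
    freqs = np.meshgrid(*[np.fft.fftfreq(s) for s in vol1.shape], indexing="ij")
    radius = np.sqrt(sum(f ** 2 for f in freqs))
    bins = np.linspace(0.0, 0.5, n_bins + 1)
    shell = np.digitize(radius.ravel(), bins) - 1

    num = np.bincount(shell, weights=(f1 * np.conj(f2)).real.ravel(), minlength=n_bins)
    d1 = np.bincount(shell, weights=np.abs(f1.ravel()) ** 2, minlength=n_bins)
    d2 = np.bincount(shell, weights=np.abs(f2.ravel()) ** 2, minlength=n_bins)
    n_vox = np.bincount(shell, minlength=n_bins)

    fsc_curve = num[:n_bins] / np.sqrt(d1[:n_bins] * d2[:n_bins] + 1e-12)
    # Half-bit threshold as a function of the number of voxels per shell.
    sqrt_n = np.sqrt(np.maximum(n_vox[:n_bins], 1))
    half_bit = (0.2071 + 1.9102 / sqrt_n) / (1.2071 + 0.9102 / sqrt_n)
    return bins[:-1], fsc_curve, half_bit

# The resolution estimate is the frequency where the FSC curve first drops
# below the half-bit threshold; resolution = pixel_size / crossing_frequency.
```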

Extended Data Fig. 2 Correlative XNH - EM analysis of the connectivity statistics of pyramidal apical dendrites in the posterior parietal cortex (PPC).

a, 3D rendering of two aligned and stitched XNH datasets in the mouse PPC. Cell somata are colored in green (based on voxel brightness). Magenta plane indicates the location of the serial EM dataset. b, Aligned XNH virtual slice (left) and EM image (right) of the same region of cortical tissue (horizontal section). After XNH imaging and thin sectioning, the ultrastructure of the tissue remains well preserved, allowing identification of synapses (inset right, arrows). The EM images also showed small cracks (orange arrows) and bubbles (inset, pink arrows), which may have been caused by XNH imaging. c, Examples of pyramidal neurons (top), inhibitory interneurons (middle) and glia (bottom) from the XNH data. Cell types were identified by classic ultrastructural features30,69,70. Pyramidal cells were identified by their prominent apical dendrites, while glia were identified from the relative lack of cytoplasm in the somata and the presence of multiple darkly stained chromatin bundles near the edges of the nuclei. Images are 40 × 40 µm virtual coronal slices (100 nm thick). d, Histological slice of a Nissl-stained coronal section including the posterior parietal cortex from the Allen Brain Atlas (http://atlas.brain-map.org/). A higher density of cells is evident at the top of layer II (consistent with Fig. 2e). e, Rendering of cells included in the connectivity analysis. Apical dendrites were traced in the XNH data (yellow) from somata (colored spheres) up to the layer I/II boundary, where we collected an EM volume (cyan). Although the EM volume only contains short (<50 µm) fragments of each apical dendrite, combining data across hundreds of neurons allowed us to map synaptic I/E balance over hundreds of micrometers of path length (Fig. 2h-i). f, Histogram of locations (cortical depth) of traced cells used for analysis of synaptic inputs onto apical dendrites. g, Synapse densities (excitatory in blue, inhibitory in red) plotted as a function of path length to the initial bifurcation (as opposed to the cell soma in Fig. 2i-j). Each marker corresponds to one dendrite fragment 10 µm long. Lines and shaded areas indicate binned average (20 µm bins) and interquartile range (mean ± SE). h, Inhibitory synapse fraction plotted as a function of path length to the initial bifurcation. Each marker corresponds to one dendrite fragment, colored based on the soma type. Lines and shaded areas indicate binned average and interquartile range (mean ± SE) for each soma type individually.
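As an illustration of the binned analysis described in panels g and h, the following sketch computes a binned inhibitory synapse fraction along dendritic path length with pandas; the data-frame columns and the synthetic values are hypothetical placeholders, not the layout of the paper's source data.

```python
# Hypothetical sketch of the binned inhibitory-fraction analysis; the column
# names and the synthetic data frame are placeholders, not the paper's
# actual source-data format.
import numpy as np
import pandas as pd

# One row per 10-um apical-dendrite fragment (hypothetical layout).
fragments = pd.DataFrame({
    "path_length_um": np.random.uniform(0, 300, 500),   # distance along the dendrite
    "n_excitatory": np.random.poisson(8, 500),
    "n_inhibitory": np.random.poisson(2, 500),
})

fragments["inhib_fraction"] = (fragments["n_inhibitory"]
                               / (fragments["n_excitatory"] + fragments["n_inhibitory"]))

# Bin fragments into 20-um bins of path length and average the fraction.
bins = np.arange(0, 320, 20)
fragments["bin"] = pd.cut(fragments["path_length_um"], bins)
binned = fragments.groupby("bin", observed=True)["inhib_fraction"].agg(["mean", "sem"])
print(binned.head())
```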

Extended Data Fig. 3 Millimeter-scale imaging of a Drosophila leg at single-neuron resolution.

a, 3D rendering of the dataset after individual scans were stitched together to form a continuous volume. b, The image volume was computationally unfolded (ImageJ) to reveal the entire 1.4 mm length of the main leg nerve. c, Volume rendering of the three hair plates that sense the thorax-coxa joint. The clusters are positioned differently within the joint, implying that they are sensitive to different joint angle ranges. d, Cross-section through the group of eight campaniform sensilla on the trochanter, revealing the underlying sensory neurons and their axons (blue, see Fig. 3c). e-g, Locations of sensory receptors in the leg. See also Supplementary Data Table 3. (e) Anterior view of external sensory structures. TiCSv1 and TiCSv2 are on the reverse (ventral) side of the tibia. (f) Posterior view of the trochanter, where a large number of external mechanosensory structures reside. (g) Partially transparent view of the leg revealing internal sensory structures (see Supplementary Data Table 3). Coxal stretch receptor: a previous report identified stretch receptor neurons in each of the distal leg segments (femur, tibia and tarsus) that sense joint angles and are required for proper walking coordination35. We identified a neuron in the coxa whose morphology is consistent with the other stretch receptors and was possibly missed previously due to incomplete fluorescent labeling. This demonstrates that each major joint in the fly leg, and not only the distal joints, is monitored by a single stretch receptor neuron. Coxal strand receptor: we identified a single strand receptor in the coxa, innervated by a single sensory axon for which no cell body was visible in the leg. Strand receptor neurons are unique sensory neurons that have a cell body in the VNC instead of the leg71, but this type of neuron has previously been identified only in locusts and other orthopteran insects72. In this reconstruction, the strand receptor neuron’s axon enters the VNC through the accessory nerve, but could not be reconstructed back to its cell body in the VNC. h-k, Axons of some sensory neurons were large enough to reconstruct at the 150–200 nm resolution achieved here. Sensory neurons innervating coxal hair plates (cyan) and some trochanteral campaniform sensilla (blue) had axons with large diameters, similar in size to motor neuron axons (yellow). In contrast, axons of all chordotonal and bristle neurons were narrower. (h) Cross-section through the main leg nerve at the location indicated in (b). Axons from different sensory clusters bundle together. Two TrCS8 neurons have unusually large diameters (1050 nm and 850 nm, white circles; see Fig. 3c for full reconstruction of these axons). The remaining TrCS neurons have axon diameters of 430 ± 140 nm. Motor neurons (yellow) have diameters of 1–2 µm. The unresolved axons (areas indicated by red arrows) are chordotonal neurons and bristle neurons. (i) Cross-section through the ventral prothoracic nerve at the location indicated in (e). Axons from CoHP8 sensory neurons (blue, axon diameters of 1030 ± 90 nm) travel in this nerve, which also contains seven motor neuron axons (yellow), five of which innervate muscles in the coxa (left, axon diameters of 1140 ± 130 nm), and two of which innervate muscles in the thorax (left, axon diameters of 1880 and 2150 nm). The unresolved axons (red arrows) are likely bristle neurons. (j) Cross-section through the prothoracic accessory nerve at the location indicated in (e). Axons from CoHP4 sensory neurons (cyan, axon diameters of 1140 ± 240 nm) travel in this nerve. Shown here is a cross-section through one of two major branches of the prothoracic accessory nerve. This branch also contains five motor neuron axons (yellow, axon diameters 1610 ± 240 nm). (k) Cross-section through the dorsal prothoracic nerve at the location indicated in (e). Axons from CoHP3 sensory neurons (cyan, axon diameters of 1380 ± 20 nm) enter the VNC through this nerve. Shown here is the branch of the dorsal prothoracic nerve containing only the CoHP3 axons. Panels (h-k) are slices through reconstructed XNH volumes with 75 nm pixel size, subsequently Gaussian blurred with a 0.3 pixel radius. Axon diameters are reported as mean ± SD. l, Cross-section through the tibia. The nerve is substantially smaller than in Fig. 3d-g because only a subset of leg neurons extends into the tibia. m, Top: Morphology of a single motor neuron axon (green dye fill) innervating muscle fibers (red phalloidin stain) in the femur (image from Azevedo et al.39). Each fly has a single motor neuron with this recognizable morphology39,40. Bottom: XNH reconstruction of a motor neuron axon having the same recognizable morphology as the neuron shown in the top panel. Red cylinders represent individual muscle fibers. n, Left: Morphology of the motor neuron LinB-Tr2 (image from Baek & Mann 200940, Copyright 1999 Society for Neuroscience). This motor neuron is born from Lineage B, the second largest lineage of motor neurons. Right: XNH reconstruction of a motor neuron axon having the same recognizable morphology as the neuron shown in the left panel. The thin terminal branches were not resolved in the XNH reconstruction. References: 71 – Bräunig, P. & Hustert, R. Proprioceptors with central cell bodies in insects. Nature 283, 768–770 (1980). 72 – Bräunig, P. Strand receptors with central cell bodies in the proximal leg joints of orthopterous insects. Cell Tissue Res. 222, 647–654 (1982).
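The axon diameters quoted above (mean ± SD) can, in principle, be reproduced from labeled nerve cross-sections. The sketch below computes an equivalent (area-matched) diameter per labeled axon profile; the toy label image and pixel size stand in for real data, and the exact measurement procedure used in the paper may differ.

```python
# Minimal sketch for estimating equivalent axon diameters from a labeled
# cross-section; the label image and pixel size here are illustrative only.
import numpy as np

def equivalent_diameters(label_image, pixel_size_nm):
    """Diameter of the circle with the same area as each labeled axon profile."""
    diameters = {}
    for label in np.unique(label_image):
        if label == 0:          # skip background
            continue
        area_nm2 = np.count_nonzero(label_image == label) * pixel_size_nm ** 2
        diameters[label] = 2.0 * np.sqrt(area_nm2 / np.pi)
    return diameters

# Example with a toy two-axon cross-section at 75-nm pixels:
toy = np.zeros((100, 100), dtype=int)
toy[10:20, 10:20] = 1
toy[50:70, 50:70] = 2
d = equivalent_diameters(toy, pixel_size_nm=75)
mean, sd = np.mean(list(d.values())), np.std(list(d.values()))
print(f"axon diameters: {mean:.0f} ± {sd:.0f} nm (mean ± SD)")
```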

Extended Data Fig. 4 Automated Segmentation of Neuronal Morphologies using Convolutional Neural Networks (CNNs).

a, Overview of XNH image volume encompassing the anterior half of the VNC and the first segment of a front leg of an adult Drosophila (200 nm voxels). A smaller, higher-resolution (50 nm voxels) volume centered on the prothoracic (T1) neuromere of the VNC and including the initial segment of the leg nerve was used for automatic segmentation. b, Schematic of the U-Net CNN architecture used for automated segmentation (adapted from ref. 41). Each blue arrow represents two successive convolutions. c, Morphological comparison of the motor neuron with the largest-diameter branches out of all front leg motor neurons, reconstructed from three different flies using different modalities. Arrows indicate the largest-diameter branches, which match well across the three reconstructions. Left: Reconstruction using automated segmentation of XNH images. Gray segment indicates a merge error that was corrected during proofreading (Methods). Middle: Reconstruction from LM images of a dye-filled motor neuron labeled by 81A07-Gal4. This motor neuron controls the tibia flexor muscle and produces the largest amount of force of any fly leg motor neuron yet identified39. Adapted from Azevedo et al.39. Right: Skeleton reconstruction from EM images. Adapted from Maniates-Selvin et al.28. d, Population of 90 neurons used for evaluating segmentation error rates. Skeletons were categorized based on their morphologies (as in Fig. 4f)40,42,43. White circle indicates the boundary of the T1 neuropil. A, anterior; P, posterior. e, Examples of merge and split errors. True membrane locations are shown in black. Errors usually result from incorrect prediction of which voxels correspond to membranes. f, Average error rates of segmentation for the 90 neurons shown in (d). Automated segmentation is parametrized by an agglomeration threshold that amounts to a trade-off between split and merge errors. Data points indicate split and merge error rates for different agglomeration thresholds (Methods). The ideal segmentation minimizes the time needed to identify and fix split and merge errors during proofreading (red arrow). Merge error calculations based on comparisons to sparse manual tracing are likely an underestimate of the true number of merge errors. Note that the human-annotated, ground truth segmentation of XNH data excludes some areas where features are too small to resolve; thus, these error metrics for XNH segmentation may not be directly comparable to what has been reported for EM. g-j, Automated segmentation of XNH data in mouse cortex (primary somatosensory, layer 5, 30 nm voxels). (g) Raw data. (h) Affinities (zyx corresponding to RGB colors). (i) Selected segmentation labels corresponding to (g). (j) Selected 3D renderings of segmented neuron fragments. k, Large-FOV segmentation of myelinated axons in the white matter below mouse parietal cortex. Segmentation of such myelinated axons can enable tracing of long-range inputs between brain areas at single-cell resolution.
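To make the agglomeration-threshold trade-off in panel f concrete, the sketch below merges segmentation fragments whose connecting affinity scores exceed a chosen threshold using a small union-find structure. The fragment graph is a toy, hypothetical example; the published pipeline instead follows the structured-loss affinity approach of ref. 41.

```python
# Minimal sketch of threshold-based agglomeration of segmentation fragments.
# The actual pipeline follows the structured-loss U-Net approach of ref. 41;
# the fragment graph below is a toy, hypothetical example.
import numpy as np

class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))
    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x
    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

def agglomerate(n_fragments, edges, threshold):
    """Merge fragments whose connecting affinity score exceeds the threshold."""
    uf = UnionFind(n_fragments)
    for a, b, score in edges:
        if score >= threshold:
            uf.union(a, b)
    return np.array([uf.find(i) for i in range(n_fragments)])

# Toy fragment graph: (fragment_a, fragment_b, affinity score in [0, 1]).
edges = [(0, 1, 0.95), (1, 2, 0.40), (2, 3, 0.85)]
for threshold in (0.3, 0.6, 0.9):
    labels = agglomerate(4, edges, threshold)
    # Lower thresholds merge more aggressively (fewer splits, more merge errors).
    print(threshold, labels)
```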

Extended Data Fig. 5 Additional staining approaches for XNH imaging.

a, Top: Photograph of a fly brain with GABAergic nuclei labeled with APEX2 (arrows). Middle: XNH images (120 nm pixels, 15 µm thick minimum intensity projection) of the same fly brain after heavy metal staining, showing clusters of dark, APEX2-labeled GABAergic cell nuclei (arrows). Bottom: XNH virtual slice (120 nm thick) and output from an automated Random Forest image classifier trained to detect labeled cells (green). b, XNH data (105 nm voxels) of a Drosophila brain that did not undergo heavy metal staining. Even in unstained soft tissue, phase-contrast imaging provides enough signal that single neurons can still be resolved. FOV encompasses the optic lobe and half of the central brain. See Supplementary Video 6 and Methods.
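For orientation, a pixel classifier in the spirit of the Random Forest detector mentioned in panel a could be trained as sketched below with scikit-learn; the features, labels and parameters shown are illustrative assumptions rather than those used for the published analysis.

```python
# Minimal pixel-classification sketch in the spirit of the Random Forest
# classifier used to detect APEX2-labeled nuclei; features, labels and
# parameters here are illustrative, not those of the published analysis.
import numpy as np
from scipy import ndimage
from sklearn.ensemble import RandomForestClassifier

def pixel_features(image):
    """Stack a few smoothed-intensity and edge features per pixel."""
    feats = [ndimage.gaussian_filter(image, s) for s in (1, 2, 4)]
    feats.append(ndimage.gaussian_gradient_magnitude(image, 2))
    return np.stack([f.ravel() for f in feats], axis=1)

# Toy image: dark "nuclei" blobs on a brighter background.
rng = np.random.default_rng(0)
image = rng.normal(0.8, 0.05, (128, 128))
image[40:60, 40:60] = rng.normal(0.3, 0.05, (20, 20))

# Sparse hypothetical training labels: 1 = labeled nucleus, 0 = background, -1 = unannotated.
labels = np.full(image.shape, -1)
labels[45:55, 45:55] = 1
labels[0:10, 0:10] = 0

X = pixel_features(image)
y = labels.ravel()
mask = y >= 0                                    # train only on annotated pixels
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X[mask], y[mask])
prediction = clf.predict(X).reshape(image.shape)  # dense per-pixel prediction
```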

Supplementary information

Supplementary Information

Supplementary Tables 1–3 and descriptions of Supplementary Videos 1–6

Reporting Summary

Supplementary Video 1

An XNH scan of an adult Drosophila brain at a voxel size of 120 nm and a measured resolution of 183 nm. At this resolution, many individual neurons can be tracked as they travel between different brain regions. The FOV encompasses the entire central brain and part of both optic lobes, allowing brain-wide projections to be mapped.

Supplementary Video 2

An XNH scan of mouse somatosensory cortex (layer V) at a voxel size of 30 nm and a measured resolution of 87 nm. At this resolution, all myelinated axons and most unmyelinated axons and dendrites are resolved. Many subcellular features, most notably mitochondria and endoplasmic reticulum, are resolved.

Supplementary Video 3

An XNH scan of the adult Drosophila VNC, encompassing the T1 neuromere that controls movements of a front leg. This single scan at 50-nm voxels captures most of the T1 neuromere with sufficient resolution (139 nm) to reconstruct single neurons and identify different types of sensory and motor neurons (see Fig. 4). Body wall muscles surrounding the neuromere, as well as the motor neurons innervating those muscles, are also visible.

Supplementary Video 4

An XNH scan of part of the adult Drosophila leg at a voxel size of 75 nm and a measured resolution of 198 nm. The inset shows a zoom-in on the leg nerve, which contains motor neurons (large-diameter axons) traveling out to muscles, as well as sensory neurons (variable diameter depending on type; see Extended Data Fig. 3) bringing mechanosensory information to the central nervous system. The structural arrangement of muscles, fat, tendons and joints is visible (see Fig. 3).

Supplementary Video 5

Automated segmentation of a portion of the adult Drosophila VNC encompassing the T1 neuropil (same data as Supplementary Video 3 and Fig. 4). Each neuron is rendered in a color corresponding to neuron types determined based on 3D morphology (as in Fig. 4c). Blue, motor neurons; orange, campaniform sensillum neurons; purple, hair plate neurons; red, chordotonal organ neurons; green, bristle neurons.

Supplementary Video 6

An XNH scan at 105-nm voxel size of an adult Drosophila brain prepared without any heavy metal staining. Phase-contrast imaging enables this type of soft tissue to be imaged with reasonable contrast. As with stained samples (Videos 1–4), many individual neurons can be reconstructed traveling within and between brain regions.

Source data

Source Data Fig. 2

Statistical source data for Fig. 2e,f,h,i.

Cite this article

Kuan, A.T., Phelps, J.S., Thomas, L.A. et al. Dense neuronal reconstruction through X-ray holographic nano-tomography. Nat Neurosci 23, 1637–1643 (2020). https://doi.org/10.1038/s41593-020-0704-9
