Sensorimotor experience remaps visual input to a heading-direction network

Abstract

In the Drosophila brain, ‘compass’ neurons track the orientation of the body and head (the fly’s heading) during navigation 1,2. In the absence of visual cues, the compass neuron network estimates heading by integrating self-movement signals over time3,4. When a visual cue is present, the estimate of the network is more accurate1,3. Visual inputs to compass neurons are thought to originate from inhibitory neurons called R neurons (also known as ring neurons); the receptive fields of R neurons tile visual space5. The axon of each R neuron overlaps with the dendrites of every compass neuron6, raising the question of how visual cues are integrated into the compass. Here, using in vivo whole-cell recordings, we show that a visual cue can evoke synaptic inhibition in compass neurons and that R neurons mediate this inhibition. Each compass neuron is inhibited only by specific visual cue positions, indicating that many potential connections from R neurons onto compass neurons are actually weak or silent. We also show that the pattern of visually evoked inhibition can reorganize over minutes as the fly explores an altered virtual-reality environment. Using ensemble calcium imaging, we demonstrate that this reorganization causes persistent changes in the compass coordinate frame. Taken together, our data suggest a model in which correlated pre- and postsynaptic activity triggers associative long-term synaptic depression of visually evoked inhibition in compass neurons. Our findings provide evidence for the theoretical proposal that associative plasticity of sensory inputs, when combined with attractor dynamics, can reconcile self-movement information with changing external cues to generate a coherent sense of direction7,8,9,10,11,12.
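The model summarized above combines two ingredients that are straightforward to prototype: a ring attractor over compass neurons and associative depression of inhibitory visual inputs. The sketch below is only an illustration of that idea, not the authors' published model; the network size, rate dynamics, learning rule details and all constants are arbitrary assumptions.

```python
# A minimal toy illustration (not the authors' model) of the idea in the
# abstract: compass neurons form a ring attractor, every visual channel
# starts out weakly inhibiting every compass neuron, and inhibition is
# depressed wherever presynaptic (visual) and postsynaptic (compass)
# activity coincide. All sizes and constants below are assumptions.
import numpy as np

N = 32                                            # compass neurons on the ring
theta = np.linspace(0, 2 * np.pi, N, endpoint=False)

# Ring-attractor recurrence: local excitation plus uniform inhibition.
J = 0.12 * np.cos(theta[:, None] - theta[None, :]) - 0.05

# Inhibitory visual weights: one column per possible cue azimuth,
# initially uniform ("all-to-all" R-neuron-like input).
W = 0.3 * np.ones((N, N))

def step(r, cue, dt=0.1):
    """One Euler step of rectified rate dynamics with visual inhibition."""
    drive = J @ r - W[:, cue] + 1.0               # recurrent + visual + tonic
    return r + dt * (np.maximum(drive, 0.0) - r)

def depress(r, cue, lr=0.02):
    """Associative depression: weaken inhibition onto active compass cells."""
    W[:, cue] = np.maximum(W[:, cue] - lr * r, 0.0)

# Hold an activity bump at one heading while the cue sits at one azimuth;
# that cue position becomes selectively disinhibited at the bump location,
# i.e. the cue is "mapped" onto that heading.
r = np.exp(np.cos(theta - np.pi))                 # initial bump near 180 deg
cue = 8
for _ in range(300):
    r = step(r, cue)
    depress(r, cue)
print("compass cell least inhibited by this cue:", int(W[:, cue].argmin()))
```

In this toy, the column of visual weights for the trained cue ends up weakest at the compass cells that were active, mirroring the kind of receptive-field remapping described in Fig. 5.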

Fig. 1: E–PG neurons are inhibited by visual cues at specific positions.
Fig. 2: Visual receptive fields of E–PG neurons align with heading tuning.
Fig. 3: R neurons drive visually evoked inhibition in E–PG neurons.
Fig. 4: Visuomotor experience can persistently change E–PG ensemble representations of heading direction.
Fig. 5: Visuomotor experience can remap visual input to E–PG neurons contingent on postsynaptic activity.

Data availability

The datasets generated during and/or analysed during the current study are available from the corresponding author on reasonable request.

Code availability

Analysis code is available at https://github.com/wilson-lab/FisherLuDAlessandroWilson_AnalysisCode.

References

  1. Seelig, J. D. & Jayaraman, V. Neural dynamics for landmark orientation and angular path integration. Nature 521, 186–191 (2015).

  2. Kim, S. S., Rouault, H., Druckmann, S. & Jayaraman, V. Ring attractor dynamics in the Drosophila central brain. Science 356, 849–853 (2017).

  3. Green, J. et al. A neural circuit architecture for angular integration in Drosophila. Nature 546, 101–106 (2017).

  4. Turner-Evans, D. et al. Angular velocity integration in a fly heading circuit. eLife 6, e23496 (2017).

  5. Seelig, J. D. & Jayaraman, V. Feature detection and orientation tuning in the Drosophila central complex. Nature 503, 262–266 (2013).

  6. Omoto, J. J. et al. Neuronal constituents and putative interactions within the Drosophila ellipsoid body neuropil. Front. Neural Circuits 12, 103 (2018).

  7. Skaggs, W. E., Knierim, J. J., Kudrimoti, H. S. & McNaughton, B. L. A model of the neural basis of the rat’s sense of direction. Adv. Neural Inf. Process. Syst. 7, 173–180 (1995).

  8. Milford, M. J., Wyeth, G. F. & Prasser, D. RatSLAM: A hippocampal model for simultaneous localization and mapping. In Proc. International Conference on Robotics and Automation 403–408 (2004).

  9. Mulas, M., Waniek, N. & Conradt, J. Hebbian plasticity realigns grid cell activity with external sensory cues in continuous attractor models. Front. Comput. Neurosci. 10, 13 (2016).

  10. Cope, A. J., Sabo, C., Vasilaki, E., Barron, A. B. & Marshall, J. A. A computational model of the integration of landmarks and motion in the insect central complex. PLoS ONE 12, e0172325 (2017).

  11. Keinath, A. T., Epstein, R. A. & Balasubramanian, V. Environmental deformations dynamically shift the grid cell spatial metric. eLife 7, e38169 (2018).

  12. Ocko, S. A., Hardcastle, K., Giocomo, L. M. & Ganguli, S. Emergent elasticity in the neural code for space. Proc. Natl Acad. Sci. USA 115, E11798–E11806 (2018).

  13. Taube, J. S., Muller, R. U. & Ranck, J. B. Jr. Head-direction cells recorded from the postsubiculum in freely moving rats. I. Description and quantitative analysis. J. Neurosci. 10, 420–435 (1990).

  14. Taube, J. S., Muller, R. U. & Ranck, J. B. Jr. Head-direction cells recorded from the postsubiculum in freely moving rats. II. Effects of environmental manipulations. J. Neurosci. 10, 436–447 (1990).

  15. Mizumori, S. J. & Williams, J. D. Directionally selective mnemonic properties of neurons in the lateral dorsal nucleus of the thalamus of rats. J. Neurosci. 13, 4015–4028 (1993).

  16. Knierim, J. J., Kudrimoti, H. S. & McNaughton, B. L. Place cells, head direction cells, and the learning of landmark stability. J. Neurosci. 15, 1648–1659 (1995).

  17. Zhang, K. Representation of spatial orientation by the intrinsic dynamics of the head-direction cell ensemble: a theory. J. Neurosci. 16, 2112–2126 (1996).

  18. Xie, X., Hahnloser, R. H. & Seung, H. S. Double-ring network model of the head-direction system. Phys. Rev. E 66, 041902 (2002).

  19. Hanesch, U., Fischbach, K. F. & Heisenberg, M. Neuronal architecture of the central complex in Drosophila melanogaster. Cell Tissue Res. 257, 343–366 (1989).

  20. Zhang, Z., Li, X., Guo, J., Li, Y. & Guo, A. Two clusters of GABAergic ellipsoid body neurons modulate olfactory labile memory in Drosophila. J. Neurosci. 33, 5175–5181 (2013).

  21. Jacob, P. Y. et al. An independent, landmark-dominated head-direction signal in dysgranular retrosplenial cortex. Nat. Neurosci. 20, 173–175 (2017).

  22. Kim, S. S., Hermundstad, A. M., Romani, S., Abbott, L. F. & Jayaraman, V. Generation of stable heading representations in diverse visual scenes. Nature https://doi.org/10.1038/s41586-019-1767-1 (2019).

  23. Cadena, C. et al. Past, present, and future of simultaneous localization and mapping: toward the robust-perception age. IEEE Trans. Robot. 32, 1309–1332 (2016).

  24. Sun, Y. et al. Neural signatures of dynamic stimulus selection in Drosophila. Nat. Neurosci. 20, 1104–1113 (2017).

  25. Wehner, R. Astronavigation in insects. Annu. Rev. Entomol. 29, 277–298 (1984).

  26. Wehner, R. & Müller, M. The significance of direct sunlight and polarized skylight in the ant’s celestial system of navigation. Proc. Natl Acad. Sci. USA 103, 12575–12579 (2006).

  27. el Jundi, B., Smolka, J., Baird, E., Byrne, M. J. & Dacke, M. Diurnal dung beetles use the intensity gradient and the polarization pattern of the sky for orientation. J. Exp. Biol. 217, 2422–2429 (2014).

  28. el Jundi, B., Foster, J. J., Byrne, M. J., Baird, E. & Dacke, M. Spectral information as an orientation cue in dung beetles. Biol. Lett. 11, 20150656 (2015).

  29. Bell, W. J., Tobin, T. R. & Sorensen, K. A. Orientation responses of individual larder beetles, Dermestes ater (Coleoptera, Dermestidae), to directional shifts in wind stimuli. J. Insect Behav. 2, 787–801 (1989).

  30. Heinzel, H.-G. & Böhm, H. The wind-orientation of walking carrion beetles. J. Comp. Physiol. A 164, 775–786 (1989).

  31. el Jundi, B. et al. Neural coding underlying the cue preference for celestial orientation. Proc. Natl Acad. Sci. USA 112, 11395–11400 (2015).

  32. Pegel, U., Pfeiffer, K. & Homberg, U. Integration of celestial compass cues in the central complex of the locust brain. J. Exp. Biol. 221, jeb171207 (2018).

  33. Müller, M. & Wehner, R. Wind and sky as compass cues in desert ant navigation. Naturwissenschaften 94, 589–594 (2007).

  34. el Jundi, B. et al. A snapshot-based mechanism for celestial orientation. Curr. Biol. 26, 1456–1462 (2016).

  35. Dacke, M. et al. Multimodal cue integration in the dung beetle compass. Proc. Natl Acad. Sci. USA 116, 14248–14253 (2019).

  36. Jammalamadaka, S. R. & SenGupta, A. Topics in Circular Statistics (World Scientific, 2001).

  37. Jenett, A. et al. A GAL4-driver line resource for Drosophila neurobiology. Cell Rep. 2, 991–1001 (2012).

  38. Wang, J., Zugates, C. T., Liang, I. H., Lee, C. H. & Lee, T. Drosophila Dscam is required for divergent segregation of sister branches and suppresses ectopic bifurcation of axons. Neuron 33, 559–571 (2002).

  39. Pfeiffer, B. D. et al. Refinement of tools for targeted gene expression in Drosophila. Genetics 186, 735–755 (2010).

  40. Hoopfer, E. D., Jung, Y., Inagaki, H. K., Rubin, G. M. & Anderson, D. J. P1 interneurons promote a persistent internal state that enhances inter-male aggression in Drosophila. eLife 4, e11346 (2015).

  41. Hardie, R. C. et al. Calcium influx via TRP channels is required to maintain PIP2 levels in Drosophila photoreceptors. Neuron 30, 149–159 (2001).

  42. Chen, T. W. et al. Ultrasensitive fluorescent proteins for imaging neuronal activity. Nature 499, 295–300 (2013).

  43. Nern, A., Pfeiffer, B. D. & Rubin, G. M. Optimized tools for multicolor stochastic labeling reveal diverse stereotyped cell arrangements in the fly visual system. Proc. Natl Acad. Sci. USA 112, E2967–E2976 (2015).

  44. Goodman, M. B. & Lockery, S. R. Pressure polishing: a method for re-shaping patch pipettes during fire polishing. J. Neurosci. Methods 100, 13–15 (2000).

  45. Green, J., Vijayan, V., Mussells Pires, P., Adachi, A. & Maimon, G. Walking Drosophila aim to maintain a neural heading estimate at an internal goal angle. Preprint at https://doi.org/10.1101/315796 (2018).

  46. Gouwens, N. W. & Wilson, R. I. Signal propagation in Drosophila central neurons. J. Neurosci. 29, 6239–6249 (2009).

  47. Wolff, T., Iyer, N. A. & Rubin, G. M. Neuroarchitecture and neuroanatomy of the Drosophila central complex: a GAL4-based dissection of protocerebral bridge neurons and circuits. J. Comp. Neurol. 523, 997–1037 (2015).

  48. Moore, R. J. et al. FicTrac: a visual method for tracking spherical motion and generating fictive animal paths. J. Neurosci. Methods 225, 106–119 (2014).

  49. Reiser, M. B. & Dickinson, M. H. A modular display system for insect behavioral neuroscience. J. Neurosci. Methods 167, 127–139 (2008).

  50. Klapoetke, N. C. et al. Independent optical excitation of distinct neural populations. Nat. Methods 11, 338–346 (2014).

  51. Schindelin, J. et al. Fiji: an open-source platform for biological-image analysis. Nat. Methods 9, 676–682 (2012).

  52. Buchanan, S. M., Kain, J. S. & de Bivort, B. L. Neuronal control of locomotor handedness in Drosophila. Proc. Natl Acad. Sci. USA 112, 6700–6705 (2015).

  53. Pnevmatikakis, E. A. & Giovannucci, A. NoRMCorre: an online algorithm for piecewise rigid motion correction of calcium imaging data. J. Neurosci. Methods 291, 83–94 (2017).

Acknowledgements

We thank D. Anderson, T. Clandinin, B. Pfeiffer, G. Rubin and J. Tuthill for providing fly stocks; T. Clandinin, B. Bean, J. Drugowitsch, D. Ginty and members of the Wilson laboratory for providing feedback on the manuscript and J. Drugowitsch for providing advice on data analysis; G. Maimon for sharing designs for a fly holder and modified FicTrac software; O. Mazor and P. Gorelik at the Harvard Medical School Research Instrumentation Core (NEI Core Grant for Vision Research EY012196) for their help constructing the virtual-reality systems. This work was supported by the Harvard Neurobiology Imaging Facility (NINDS P30 NS072030). This work was funded by NIH awards U19NS104655, F30DC017698 (to J.L.) and T32GM007753 (to J.L.). Y.E.F. is supported by a HHMI Hanna H. Gray Fellowship. R.I.W. is an HHMI Investigator.

Author information

Contributions

Y.E.F., J.L. and R.I.W. designed the study. Y.E.F. performed and analysed electrophysiology experiments. J.L. performed and analysed two-photon calcium-imaging experiments. I.D. performed and analysed confocal-microscopy imaging experiments. Y.E.F. and R.I.W. wrote the manuscript with input from J.L. and I.D.

Corresponding author

Correspondence to Rachel I. Wilson.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Peer review information Nature thanks Lisa Giocomo and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Extended data figures and tables

Extended Data Fig. 1 Measuring behaviour and E–PG visual responses.

a, Side view of a fly walking on an air-cushioned ball during an electrophysiology experiment. b, Image of the ball and plastic holder. Air flows up through the holder and out the semi-spherical depression that cradles the ball. c, Schematic of the experimental set-up viewed from above. The fly is secured in an aperture in the centre of a horizontal platform. The platform is surrounded by a circular panorama. The panorama is composed of square LED arrays49 (2 squares vertically × 12 squares horizontally). The ball is illuminated by an infrared (IR) LED, which is visible as a red spot in b. A camera captures an image of the ball to enable tracking using FicTrac48. Inset shows FicTrac view. Camera and infrared LED are not drawn to scale. d, The yaw velocity of the fly compared to the cue position. This is the dataset that is the basis for Fig. 1f, but here broken down into averages for each individual fly, and with right (+) and left (−) cue positions kept separate. Positive velocities are right turns, and negative velocities are left turns. No tests showed a statistically significant yaw velocity (P < 0.05, two-sided comparison to bootstrap distribution) for any individual fly at any cue position. For details of analysis, see Methods, ‘Yaw during open-loop epochs’. e, Yaw velocity in response to the visual cue presentation. This analysis is the same as that shown in Fig. 1f, but here yaw velocity is plotted against the distance of the cue jump between consecutive trials. As in Fig. 1f, we show mean (black) ± 1 s.d. (grey) across experiments (73 experiments in 68 flies). Magenta lines show the bootstrapped 95% confidence interval of the mean across flies after randomizing cue positions, Bonferroni-corrected for multiple comparisons. Because the mean lies within these bounds, it is not significantly different from random. This analysis further supports the conclusion that there is no systematic yaw response to the random flashes of the vertical bar. For details of analysis, see Methods, ‘Yaw during open-loop epochs’. f. The visual receptive field of an example cell measured multiple times over the course of a 40-min recording. Each row shows data from a separate visual mapping epoch. Data from this example cell are also shown in Fig. 1e. Note the stability of the visual receptive field over this time period. For experiments shown in this figure, we used UAS-mCD8::GFP/UAS-mCD8::GFP; R60D05-Gal4/R60D05-Gal4 flies.
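For reference, the randomization test described in d and e can be sketched as follows: the mean yaw velocity observed at each cue position is compared against a bootstrap distribution built by randomly reassigning trials, with Bonferroni-corrected confidence bounds. The trial-table format, variable names and resampling details are assumptions; the authors' actual analysis is in the linked code repository.

```python
# Sketch of a cue-position randomization test with Bonferroni-corrected
# bootstrap confidence intervals (assumed data layout, not the published code).
import numpy as np

def yaw_vs_cue_bootstrap(yaw, cue_pos, positions, n_boot=10000, alpha=0.05):
    """yaw, cue_pos: one value per trial; positions: cue positions to test."""
    rng = np.random.default_rng(0)
    alpha_c = alpha / len(positions)              # Bonferroni correction
    results = {}
    for p in positions:
        n = int(np.sum(cue_pos == p))
        observed = yaw[cue_pos == p].mean()
        # Null distribution: trials assigned to this cue position at random.
        null = np.array([rng.choice(yaw, size=n, replace=False).mean()
                         for _ in range(n_boot)])
        lo, hi = np.quantile(null, [alpha_c / 2, 1 - alpha_c / 2])
        results[p] = (observed, lo, hi, not lo <= observed <= hi)
    return results   # per position: mean, CI bounds, "significant" flag
```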

Extended Data Fig. 2 Visually evoked hyperpolarization and depolarization, during and after cue presentation.

a, Example voltage responses of the same E–PG neuron to two cue positions. Dashed lines indicate the mean baseline voltage before the cue. This neuron is hyperpolarized by the cue at 90° and depolarized by the cue at −97°. Note that hyperpolarization decays more rapidly than depolarization. In b, to quantify visual receptive fields, we measured the change in voltage during cue presentation and after cue removal in the 250-ms windows marked in a with brackets, in both cases relative to baseline. b, Summary of E–PG visual receptive fields measured during cue presentation. Cells are sorted by the cue position that evokes maximal hyperpolarization. The histogram shows the number of E–PG neurons with maximal hyperpolarization at each cue position (73 E–PG neurons in 68 flies). c, Summary of E–PG visual receptive fields measured after cue removal. Cell order is the same as in b. Note that hyperpolarizing responses tend to decay, whereas depolarizing responses tend to persist; this is consistent with the hypothesis that the hyperpolarization during cue presentation is due to direct synaptic inhibition from R neurons, whereas depolarization is polysynaptic and caused by withdrawal of tonic synaptic inhibition. The histogram shows the number of E–PG neurons with maximal hyperpolarization after cue removal for each cue position. d, Same as b, but sorted by the cue position that evoked maximal depolarization (minimal hyperpolarization), as in Fig. 1g. e, Same as c, but with the cell order as in d. f, Summed response across all neurons measured during (left) and after (right) the cue. The left curve has a pair of minima around ±100°; this bias is probably inherited from R neuron receptive fields, which are biased towards positions offset from the visual midline5. By contrast, the right curve is relatively flat. g, Visual cue position eliciting maximal depolarization (minimum hyperpolarization), plotted versus E–PG neuron location, for the 21 recorded E–PG neurons that were filled. No significant correlation was observed (circular correlation coefficient = −0.15, P = 0.49)36. For experiments shown in this figure, we used UAS-mCD8::GFP/UAS-mCD8::GFP; R60D05-Gal4/R60D05-Gal4 flies.
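The window-based quantification described above can be sketched as follows; the sampling rate, exact window placement and data layout are assumptions, given only that each response is the voltage change in a 250-ms window relative to the pre-cue baseline.

```python
# Sketch of receptive-field quantification: voltage change in a 250-ms
# window during the cue and in a 250-ms window after cue removal, both
# relative to the pre-cue baseline (sampling rate and windows are assumed).
import numpy as np

FS = 10_000                      # samples per second (assumption)
WIN = int(0.25 * FS)             # 250-ms analysis window

def cue_response(vm, cue_on, cue_off):
    """vm: one trial's membrane voltage; cue_on/cue_off: sample indices."""
    baseline = vm[cue_on - WIN:cue_on].mean()
    during = vm[cue_off - WIN:cue_off].mean() - baseline    # during the cue
    after = vm[cue_off:cue_off + WIN].mean() - baseline     # after cue removal
    return during, after

def receptive_fields(trials_by_position):
    """dict: cue position -> list of (vm, cue_on, cue_off) trials."""
    rf_during, rf_after = {}, {}
    for pos, trials in trials_by_position.items():
        resp = np.array([cue_response(*t) for t in trials])
        rf_during[pos], rf_after[pos] = resp.mean(axis=0)
    return rf_during, rf_after
```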

Extended Data Fig. 3 E–PG neuron pairs recorded sequentially from the same brain.

a, Two biocytin-filled dendrites (green) from sequentially recorded E–PG neurons that innervate adjacent wedges within the ellipsoid body. Neuropil reference marker is shown in grey (anti-nc82 antibody). Images are maximum intensity z-projections. Scale bar, 10 μm. The schematic shows the approximate position of the ellipsoid body and E–PG dendrites from a coronal view of the fly brain. b, c, Heading tuning (red, measured in VR) and visual receptive field (blue, measured with random flashes) from sequentially recorded E–PG pairs from two example flies. Dendritic locations of the recorded neurons are green in the ellipsoid body schematic above each set of plots. In both cases, by chance, the two dendrites were physically adjacent. In both cases, adjacent E–PG neurons from the same fly exhibited similar visual receptive fields and heading tuning curves, supporting the conclusion that adjacent E–PG cells typically receive inhibition from adjacent regions of visual space and represent adjacent heading directions. Comparing the visual receptive field and the heading tuning curve for each neuron yielded correlation coefficients (Pearson’s) of 0.76 (fly 1 neuron 1), 0.90 (fly 1 neuron 2), 0.95 (fly 2 neuron 1) and 0.65 (fly 2 neuron 2). For experiments shown in this figure, we used UAS-mCD8::GFP/UAS-mCD8::GFP; R60D05-Gal4/R60D05-Gal4 flies.

Extended Data Fig. 4 Visual receptive fields and heading tuning of E–PG neurons.

Heading tuning (red, closed-loop mode) and visual receptive fields (blue, open-loop mode) for all 40 recorded E–PG neurons (from 39 flies). For each neuron, the correlation coefficient (Pearson’s) is reported for the comparison between the visual receptive field and the heading tuning curve. Asterisks denote data also shown in Fig. 2. For experiments shown in this figure, we used UAS-mCD8::GFP/UAS-mCD8::GFP; R60D05-Gal4/R60D05-Gal4 flies.
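The per-neuron correlation values reported here and in Extended Data Fig. 3 compare two tuning curves sampled over angle. A minimal sketch, assuming both curves are first interpolated onto a common angular grid before computing Pearson's r:

```python
# Sketch: Pearson correlation between a heading tuning curve and a visual
# receptive field after interpolating both onto the same angular bins.
# The grid spacing and use of periodic interpolation are assumptions.
import numpy as np

def tuning_rf_correlation(head_deg, heading_tuning, cue_deg, visual_rf,
                          n_bins=36):
    grid = np.linspace(-180.0, 180.0, n_bins, endpoint=False)
    ht = np.interp(grid, head_deg, heading_tuning, period=360.0)
    rf = np.interp(grid, cue_deg, visual_rf, period=360.0)
    return float(np.corrcoef(ht, rf)[0, 1])
```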

Extended Data Fig. 5 R neuron types labelled by R20A02-Gal4 and R54E12-Gal4, described by MCFO.

a, Observed numbers of R neurons belonging to each type from a dataset of n = 78 single-neuron MCFO clones43 from the R20A02-Gal4 line. R neuron types were classified according to previously published methods6. b, Same as in a but for the R54E12-Gal4 line (n = 61 single-neuron MCFO clones). c–h, Examples of single R neuron MCFO clones. Images are maximum intensity z-projections. Background labelling was manually removed to improve clarity of specific neuronal morphologies. i, Multiple R neuron MCFO clones labelled in different colours using the R20A02-Gal4 line. Image is a maximum-intensity z-projection. Scale bars, 20 μm. For experiments shown in this figure, we used R57C10-FLPG5.PEST; UAS(FRT.stop)myr::smGdP-HA, UAS(FRT.stop)myr::smGdP-V5, UAS(FRT.stop)myr::smGdP-Flag/R20A02-Gal4, R57C10-FLPG5.PEST; UAS(FRT.stop)myr::smGdP-HA, UAS(FRT.stop)myr::smGdP-V5, UAS(FRT.stop)myr::smGdP-Flag/R54E12-Gal4 flies.

Extended Data Fig. 6 Suppressing R neuron activity with two independent driver lines reduces visually evoked hyperpolarization in E–PG neurons.

a, Same as Fig. 3c, except instead of measuring peak visually evoked hyperpolarization, we measured mean visually evoked hyperpolarization (by zeroing all non-negative visual responses and then averaging visual responses across all cue positions). From left to right: n = 8, 10, 12, 10, 9. Both Kir2.1 means are significantly different from corresponding genetic controls using two-sided Wilcoxon rank-sum tests. R20A02 Kir2.1 versus R20A02/+ and UAS/+ (P = 0.0013 and P = 0.0003, respectively); R54E12 Kir2.1 versus R54E12/+ and UAS/+ (P = 0.005 and P = 0.0025, respectively). b, R neuron population labelled by Kir2.1::eGFP. Images are maximum intensity z-projections. c, Numbers of R neurons per hemisphere expressing Kir2.1::eGFP in each experimental genotype, n = 9 (R20A02) and n = 11 (R54E12) (horizontal lines are means). On the basis of the previously reported total number of R neurons of each type6 and our MCFO quantification of the R neuron types labelled by R20A02-Gal4 and R54E12-Gal4 (Extended Data Fig. 5), these cell counts suggest that R20A02-Gal4 targets approximately 20% of R2, 30% of R4m and all R4d neurons. These counts suggest that R54E12-Gal4 targets approximately 40% of R2 neurons and all R4m and R4d neurons. This incomplete targeting of outer R neurons may provide one explanation for the remaining visually evoked inhibition observed in some recordings (Fig. 3). Note that although both driver lines label other neurons in the central brain and visual system, R neurons appear to be the only cell type that is labelled by both lines. In the visual system, the driver line R20A02-Gal4 targets one medulla intrinsic neuron, probably Mi12, and one cell type that arborizes in approximately layers 4–6 of the lobula, whereas the driver line R54E12-Gal4 appears to target the medulla neuron Tm3. For experiments shown in this figure, we used +/w; R60D05-LexA/LexAop-mCD8::GFP; +/UAS-Kir2.1 (UAS-only control); +/w; R60D05-LexA/LexAop-mCD8::GFP; R20A02-Gal4/+ (R20A02 Gal4-only control); +/w; R60D05-LexA/LexAop-mCD8::GFP; R54E12-Gal4/+ (R54E12 Gal4-only control); +/w; R60D05-LexA/LexAop-mCD8::GFP; R20A02-Gal4/UAS-Kir2.1 (R20A02 Kir2.1); and +/w; R60D05-LexA/LexAop-mCD8::GFP; R54E12-Gal4/UAS-Kir2.1 (R54E12 Kir2.1) flies.
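The metric in a can be sketched as follows; the data layout is an assumption, and the genotype comparison uses a two-sided Wilcoxon rank-sum test as stated above.

```python
# Sketch of "mean visually evoked hyperpolarization": zero all non-negative
# responses in a cell's receptive field, average across cue positions, then
# compare genotypes with a two-sided Wilcoxon rank-sum test.
import numpy as np
from scipy.stats import ranksums

def mean_hyperpolarization(rf_mv):
    """rf_mv: voltage responses (mV), one value per cue position."""
    return float(np.minimum(rf_mv, 0.0).mean())

def compare_to_control(rfs_experimental, rfs_control):
    exp = [mean_hyperpolarization(rf) for rf in rfs_experimental]
    ctl = [mean_hyperpolarization(rf) for rf in rfs_control]
    return ranksums(exp, ctl).pvalue        # two-sided by default
```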

Extended Data Fig. 7 Offset probability histograms in training experiments.

Offset probability histograms during each segment of the training experiments shown in Fig. 4, for all 19 GCaMP imaging experiments (in 19 flies). As in Fig. 4, the circular mean during the pre-training period is defined as offset0 (here marked with an arrowhead), and for display purposes we horizontally aligned all of the offset0 values in different flies. Asterisks mark data shown in Fig. 4. For experiments shown in this figure, we used +/w; UAS-GCaMP6f/+; R60D05-Gal4/+ flies.
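A minimal sketch of the offset alignment used for display (wrapping convention, input format and bin count are assumptions): the bump-versus-cue offset is computed per frame, its circular mean over the pre-training period defines offset0, and each fly's histogram is rotated so that offset0 sits at zero.

```python
# Sketch of offset histograms aligned to the pre-training circular mean
# (offset0); wrapping convention and bin count are assumptions.
import numpy as np

def circular_mean_deg(angles_deg):
    a = np.deg2rad(angles_deg)
    return np.rad2deg(np.arctan2(np.sin(a).mean(), np.cos(a).mean()))

def aligned_offset_histogram(bump_deg, cue_deg, pretrain_mask, n_bins=24):
    offset = (bump_deg - cue_deg + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    offset0 = circular_mean_deg(offset[pretrain_mask])     # pre-training mean
    aligned = (offset - offset0 + 180.0) % 360.0 - 180.0   # put offset0 at 0
    counts, edges = np.histogram(aligned, bins=n_bins,
                                 range=(-180.0, 180.0), density=True)
    return counts, edges
```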

Extended Data Fig. 8 Heading tuning and visual receptive field measurements in training experiments.

Heading tuning curves and visual receptive fields for all additional 17 E–PG neurons (from 17 flies) from the training experiments in Fig. 5. As in Fig. 5, red solid curves are heading tuning. The red dashed curves are the change in heading tuning (training minus pre-training). Blue curves are visual receptive fields. The blue dashed curve is the change in the visual receptive field (second probe minus first probe). Seven neurons from this dataset are also shown in Figs. 1, 2.

Extended Data Fig. 9 Controls for remapping experiments.

a, Data reproduced from Fig. 5e. Absolute change in visual receptive fields. Control flies navigated in a one-cue world (rather than a two-cue world) during the waiting period between the open-loop epochs used to compute the change in visual responses. In some cases (matched control), flies received exactly the same protocol as the experimental condition except with one-cue closed-loop epochs during the training period; in other words, these matched controls received 12 consecutive minutes of one-cue (rather than two-cue) closed-loop epochs during the training period. In all other cases (control), flies received 4-min blocks of one-cue closed-loop epochs interleaved with 150-s open-loop epochs during the training period, which lasted 12 min or more. b. Visual receptive fields from control cells. Blue dashed curve is the change in visual receptive field (second probe minus first probe) over the control period. Typically, visual receptive fields were stable over time under control conditions (control neurons 2 and 3). On occasion, we observed spontaneous changes in the visual receptive field of an E–PG neuron during the control period (for example, control neuron 1), although these changes were not as large as the changes that we observed in many neurons in trained flies (see a). c, Heading tuning in the same three control cells. Note how the spontaneous changes in visual receptive fields seen in neuron 1 are accompanied by changes in heading tuning.
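One simple way to express an "absolute change in visual receptive fields" such as that plotted in a is the mean absolute difference between the two probe measurements across cue positions; this specific formula is an assumption made for illustration, not a statement of the authors' exact metric.

```python
# Sketch of an absolute receptive-field change metric: mean |second probe
# minus first probe| across cue positions (illustrative assumption).
import numpy as np

def rf_change(rf_probe1, rf_probe2):
    """Both inputs: arrays of responses (mV), one value per cue position."""
    return float(np.mean(np.abs(np.asarray(rf_probe2) - np.asarray(rf_probe1))))
```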

About this article

Cite this article

Fisher, Y.E., Lu, J., D’Alessandro, I. et al. Sensorimotor experience remaps visual input to a heading-direction network. Nature 576, 121–125 (2019). https://doi.org/10.1038/s41586-019-1772-4
