Abstract
Vision plays a crucial role in instructing the brain’s spatial navigation systems. However, little is known about how vision loss affects the neuronal encoding of spatial information. Here, recording from head direction (HD) cells in the anterior dorsal nucleus of the thalamus in mice, we find stable and robust HD tuning in rd1 mice, a model of photoreceptor degeneration in which animals go blind by approximately one month of age. In contrast, placing sighted animals in darkness significantly impairs HD cell tuning. We find that blind mice use olfactory cues to maintain stable HD tuning and that prior visual experience leads to refined HD cell tuning in blind rd1 adult mice compared to congenitally blind animals. Finally, in the absence of both visual and olfactory cues, the HD attractor network remains intact but the preferred firing direction of HD cells drifts over time. These findings demonstrate flexibility in how the brain uses diverse sensory information to generate a stable directional representation of space.
Introduction
Our visual system provides critical and up-to-date information about the world around us, facilitating navigation through the environment and enabling quick reactions to dynamic events. Vision facilitates navigation-related tasks, including landmarking, obstacle avoidance, and the generation of an internal cognitive spatial map1,2,3,4. In the absence of vision, other sensory modalities need to fill the previously dominant role of vision in generating spatial awareness and guiding navigation. In humans, while it is clear that visually impaired individuals can successfully form spatial maps and navigate in many environments, behavioral studies have noted differences in various aspects of spatial awareness and navigation between sighted and visually impaired individuals4,5,6. However, little is known about how the brain’s spatial navigation systems, which have been best examined in freely moving rodent studies7, adapt following vision loss.
The brain has dedicated systems for generating spatial awareness and guiding spatial navigation. In rodents, some of the most studied components of the brain’s spatial awareness system include place cells2, grid cells8, and head direction cells9—although other complementary spatially-tuned cells also exist7. To examine the effect of vision loss on part of the brain’s spatial navigation system, we decided to focus on head direction (HD) cells in the anterior dorsal nucleus (ADn) of the thalamus, where the spike rate of a majority of neurons is modulated by head direction10,11, but the stability of the system following vision loss has not been explored.
The head direction system encompasses a network of interconnected brain regions that incorporate angular head velocity signals with additional sensory information about environmental landmarks in order to generate cells with highly tuned and stable preferences for head direction12. Ablation/silencing studies have shown that intact HD cells are required for proper spatial navigation13,14,15 and accurate spatial representation within the brain’s other spatial navigation systems16,17. The HD system appears to be organized as an attractor network11,18,19,20, such that pairs of simultaneously recorded HD cells maintain a similar angular difference between their preferred firing directions across exposures to different rooms and following environmental manipulations that alter tuning preferences21. As such, cells with similar HD tuning preferences tend to fire coherently even when tuning preferences are unstable22 or during sleep11,19.
Although vestibular inputs are critical for the HD signal23,24,25, vision also appears to play an important role in anchoring and providing stability to the HD system. For example, horizontally displacing a visual cue in an environment reliably causes a concomitant shift in the preferred firing direction (PFD) of HD cells21. However, many studies have reported that HD cell responses are relatively stable in the absence of visual inputs or visual cues14,26,27,28. Furthermore, recordings from HD cells in young rodents just before eye-opening revealed tuned HD cells, although tuning curves were found to be broader than following eye-opening22,29,30. It has thus been argued that the HD system only requires idiothetic inputs (i.e., internally generated sensory signals, such as vestibular, proprioceptive, or motor efference copy signals), with vision only providing a refining allothetic (i.e., externally generated) sensory input12,31,32. In contrast, other studies have reported highly unstable HD cell responses following the removal of visual inputs in adult rodents33,34, arguing for the possible requirement of external sensory inputs in stabilizing HD tuning. Additionally, while it is clear that vision can exert a strong effect on HD cell tuning preferences, the extent to which other sensory systems can provide landmarking cues and stabilize HD cell tuning remains unclear. While rodents have been found capable of using both auditory and olfactory cues to guide spatial learning and navigation35,36,37, previous studies with HD cells have suggested that auditory, olfactory, and vibrissal systems are relatively ineffective in modulating HD cell tuning31,38. Thus, it remains uncertain what effect vision loss might have on HD cell responses. 
Here, to examine the effect of vision loss on HD cells in adult mice, we recorded HD cells in ADn of both sighted and blind animals (including blind adult animals both with and without prior visual experience) and explored the extent to which HD cell responses were altered following vision loss, and whether other sensory systems could be leveraged to tune the HD system.
Results
Robust head direction tuning in blind mice
Is HD cell tuning stable following vision loss? To examine this, we recorded from neurons in ADn of rd1 mice39, a rodent model of retinitis pigmentosa in which mice are born with normal vision but go blind by ~1 month of age due to photoreceptor degeneration40. For all experiments, recording probes were implanted in ADn of adult mice (2–4 months old), and animals were placed in a circular open-field arena (Fig. 1a; 60 cm diameter; black walls with a single visual cue; see Methods). The animal’s position and head direction were tracked with a set of IR video cameras while spiking responses of HD cells were recorded (see Methods). Following experiments, spike sorting was performed to isolate individual units, and HD cells were then identified based on methods previously described for sighted animals (see Methods). Unless otherwise noted, HD cell responses were analyzed over the entirety of 10 min-long recording sessions (see Methods).
Upon placing blind adult rd1 mice in the open-field environment, we found robust HD cell tuning (Fig. 1b; n = 151 HD cells recorded from 13 animals, with 80.7% of recorded cells in ADn defined as HD cells; see Methods). Similar to what has been described for sighted animals, we found that rd1 mice evenly sampled all angular directions in the environment and possessed HD cells that exhibited preferred firing directions that spanned all head directions (Fig. 1c).
We next tested the stability of HD cells in blind mice and the extent to which their HD cell network was able to reliably encode head direction. First, we found that the preferred direction of HD cells remained stable over a single 10 min recording session (Fig. 1d), and across repeated exposure to the same arena (Fig. 1e), with levels of stability similar to what we found for sighted animals in the light (Supplementary Fig. 1). Next, we examined whether the HD cell network in ADn of blind mice was providing a reliable readout of the animal’s head direction by performing a decoding analysis41,42. We found that simultaneously recorded cells provided robust and stable decoding of HD (Fig. 1f). Therefore, in blind rd1 mice, HD cells are highly tuned and provide an accurate readout of head direction.
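For illustration, the logic of such a decoding analysis can be sketched with a toy population-vector decoder on synthetic data. The study cites dedicated decoding methods41,42; the sketch below is not that pipeline, and every parameter (40 cells, von Mises tuning with a 20 Hz peak, 100 ms bins) is invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
n_cells, n_t, dt = 40, 2000, 0.1           # illustrative values, not the paper's
pfds = rng.uniform(0, 2 * np.pi, n_cells)  # preferred firing directions (rad)
true_hd = np.cumsum(rng.normal(0, 0.1, n_t)) % (2 * np.pi)  # toy head trajectory

# Von Mises tuning curves (peak 20 Hz) -> Poisson spike counts per time bin
rates = 20.0 * np.exp(4.0 * (np.cos(true_hd[None, :] - pfds[:, None]) - 1))
spikes = rng.poisson(rates * dt)           # shape: cells x time bins

# Population-vector decode: spike-count-weighted circular mean of the PFDs
pv = (spikes * np.exp(1j * pfds[:, None])).sum(axis=0)
decoded_hd = np.angle(pv) % (2 * np.pi)

# Circular decoding error per bin, in radians
err = np.angle(np.exp(1j * (decoded_hd - true_hd)))
median_err = np.median(np.abs(err))
print(round(median_err, 3))
```

With a well-tuned population, the median circular error stays small; untuned or drifting populations would instead show errors approaching chance (π/2 for absolute circular error).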
To ensure that rd1 animals were indeed blind, we performed visual cue rotation experiments in the light. Upon rotating a visual cue, HD cells in rd1 mice failed to follow the cue, unlike HD cells from sighted animals in the light (Fig. 1g). Furthermore, various metrics that we computed for HD cells in rd1 mice (e.g., mean and peak firing rates, resultant vector length, mutual information (MI), and tuning width) were statistically similar in both light and dark environments (Fig. 1h). Thus, despite a total absence of rod- and cone-based visual inputs, HD cell tuning is robust and stable in blind rd1 animals, providing accurate information about head direction and indicating that blind mice can generate a stable allocentric map of space.
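Two of the metrics above, the resultant vector length and the preferred firing direction, have standard circular-statistics definitions that can be computed directly from a binned tuning curve. A minimal sketch on synthetic tuning curves (the 60-bin resolution and von Mises concentration are arbitrary choices for the example, not the paper's settings):

```python
import numpy as np

def hd_metrics(tuning_curve, bin_centers):
    """Resultant vector length (0 = untuned, 1 = perfectly sharp) and
    preferred firing direction (rad) of a binned HD tuning curve."""
    w = tuning_curve / tuning_curve.sum()          # firing-rate weights per bin
    resultant = np.sum(w * np.exp(1j * bin_centers))
    return np.abs(resultant), np.angle(resultant) % (2 * np.pi)

n_bins = 60                                        # 6-degree bins, arbitrary
bins = (np.arange(n_bins) + 0.5) * 2 * np.pi / n_bins
sharp = np.exp(4.0 * (np.cos(bins - np.pi) - 1))   # tuned cell, PFD at pi
flat = np.ones(n_bins)                             # untuned cell
r_sharp, pfd_sharp = hd_metrics(sharp, bins)
r_flat, _ = hd_metrics(flat, bins)
print(round(r_sharp, 3), round(r_flat, 6))
```

The tuned cell yields a vector length near 1 with its PFD recovered at π, while the flat tuning curve yields a vector length near 0, which is the contrast these analyses quantify.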
HD cell tuning is more robust in blind animals than in sighted animals placed in the dark
How do HD cell responses in blind animals compare to those of sighted animals? To test this, we recorded HD cell responses from sighted animals in the light (WTL) and in complete darkness (WTD; see Methods; Fig. 2a). A similar percentage of cells recorded in ADn passed the criteria to be designated as HD cells in rd1 and WTL mice (Fig. 2b; see Methods), whereas a significantly smaller percentage of ADn cells were designated as HD cells in WTD mice (Fig. 2b). Thus, HD cell tuning is significantly more robust in blind animals than in sighted animals placed in the dark, indicating that, in our experimental arena, blind animals have adopted an alternative, non-visual strategy for stabilizing HD cell tuning.
Next, we compared several metrics for HD cell responses between blind and sighted mice. For vector length, tuning width, and mutual information, each group of mice was statistically different from one another, demonstrating the following hierarchy in HD tuning refinement: WTL > rd1 > WTD (Fig. 2c; Supplementary Fig. 2a, b). Furthermore, if we limited our analysis in WT animals to HD cells that were recorded in both light and dark environments, and performed paired statistical tests, we found a similar relationship with HD tuning being significantly impaired for WT animals in the dark compared to the light (Supplementary Fig. 2c). Thus, sighted mice exhibit significant impairment in HD cell tuning when placed in the dark, with a level of impairment much more severe than is seen in blind animals, for whom HD cell tuning—although slightly less refined than in sighted animals in the light—is robust and stable.
Prior visual experience leads to refined HD cell tuning in blind adult mice
We next tested whether vision might be required during development for the proper maturation of HD cell tuning, even if mice subsequently go blind. For instance, in the superior colliculus, it has been shown that normal vision is required during development for the proper formation of auditory maps43,44. In mice, eye-opening occurs ~P10–1245,46, and rd1 mice have attenuated vision for a few days following eye-opening until they go fully blind around 1 month old40,47. To test whether vision around the time of eye-opening is required to refine and mature the tuning of the HD system, we performed experiments in Gnat2cpfl3 Gnat1irdr/Boc mice (subsequently referred to as Gnat1/2mut), who are congenitally blind due to dysfunctional rod and cone photoreceptors (see Methods). Recording from adult Gnat1/2mut mice (2–4 months old), we found many highly tuned HD cells (Fig. 3a; n = 128 HD cells recorded from 8 animals) with a similar percentage of cells in ADn designated as HD cells for Gnat1/2mut mice as we found for rd1 mice (Fig. 3b). Similar to rd1 mice, HD cells in Gnat1/2mut mice maintained stable preferred firing directions throughout a 10 min session, as well as across repeated exposure to the same environment (Supplementary Fig. 3a). Furthermore, similar to rd1 mice, responses and tuning of HD cells in Gnat1/2mut mice were statistically similar in both light and dark conditions (Supplementary Fig. 3b). Finally, at the population level, simultaneously recorded HD cells provided a reliable decoding of the animal’s head direction (Supplementary Fig. 3c). However, based on the metrics that we calculated from HD cell responses during exploration of the open-field environment, we found that HD cell tuning was statistically less refined in Gnat1/2mut mice compared to rd1 mice on several metrics tested (Fig. 3c; Supplementary Fig. 3d,e). 
These results indicate that while visual inputs are not required for generating stable HD tuning in adult blind mice—either in rd1 or Gnat1/2mut animals—normal visual inputs in the days after eye-opening appear to be required for refinement and maturation of vision-independent HD tuning in adult blind mice.
The vibrissal system is not required for head direction cell tuning in blind animals in the open-field environment
We next tested whether blind mice were using an alternative sensory modality to landmark their HD system within the open-field environment. As all our experiments were performed with a speaker playing white noise placed immediately underneath the center of the open-field arena, we deemed it unlikely that auditory inputs were contributing to HD cell tuning in our experiments (see also ref. 31). We first examined the vibrissal system, hypothesizing that a mouse might be surveying the walls and floor of the open-field arena with its whiskers to aid in spatial awareness. To test this, we shaved the whiskers from blind mice (see Methods) and examined the effect on HD cell responses (Fig. 4a). Whisker shaving did not result in any noticeable change in HD cell tuning, except for a reduction in the peak firing rate (Fig. 4a, b; we pooled together rd1 and Gnat1/2mut mice as HD cells in both were similarly unaffected by whisker shaving; Supplementary Fig. 4a, b). Whisker shaving similarly had no effect on HD cell tuning of sighted animals in the light (although, again, it led to a small reduction in peak firing rate; Supplementary Fig. 4c). Thus, in our experimental arena, the vibrissal system does not appear to play a major role in HD cell tuning in either blind or sighted animals.
The olfactory system is required for stabilizing head direction cell tuning in blind animals
Given the stability of preferred firing directions that we had previously noted for blind mice across repeated exposures, and since the floor was not cleaned between exposures, we suspected they might be using olfaction to anchor the HD system. To test this, we performed a floor rotation experiment: instead of rotating a visual cue on the wall as in our visual cue rotation experiments, we rotated the floor (i.e., the animal was removed from the room, the floor was rotated without being cleaned, and the animal was reintroduced to the room). In blind mice, floor rotation resulted in a concomitant shift in the preferred firing direction of HD cells (Fig. 5a; we pooled together rd1 and Gnat1/2mut mice as results were similar between these two blindness models; Supplementary Fig. 5a, b), suggesting that olfaction could modulate HD cell tuning. These shifts occurred at levels similar to those observed in sighted animals in the light following visual cue rotations (Supplementary Fig. 5).
To directly test whether olfactory inputs were required for stabilizing HD cell tuning in blind mice, we ablated olfactory sensory neurons (OSNs) and examined the effect on HD cell tuning. To ablate OSNs, we used a previously established chemical lesioning method48 (see Methods). To ensure that this method was effective at ablating OSNs and severely impairing olfaction, we developed an olfactory place avoidance task in which a mouse was placed in a two-chamber arena, with one of the chambers housing an aversive olfactory substance (3-methyl-1-butanethiol49) and the other chamber housing a neutral olfactory substance (distilled H2O; Fig. 5b; see Methods). Mice with intact OSNs completely avoided the chamber containing the aversive olfactory substance (Fig. 5b, c). In contrast, following OSN ablation, mice showed no preference between the two chambers (Fig. 5b, c). Having established a method to reliably ablate olfaction, we tested the effect of olfactory ablation on HD cell tuning in blind animals. Olfactory ablation resulted in complete loss of HD cell tuning in ADn of blind mice (Fig. 5d, e; we pooled together rd1 and Gnat1/2mut mice as OSN ablation affected both similarly (Supplementary Fig. 5c, d)). The percent of ADn cells that passed the criteria for being designated as HD cells drastically decreased for blind mice following olfactory ablation (Fig. 5e, Supplementary Fig. 5e, f). For some experiments, whiskers were intact during olfactory ablation, while in other animals, whiskers were ablated prior to olfactory ablation, but in both cases, olfactory ablation resulted in complete loss of HD cell tuning (Supplementary Fig. 5), again indicating that, at least in this experimental paradigm, the vibrissal system was not being used to tune HD cells. These results show that blind mice use olfactory cues to anchor their HD system.
We next examined the effect that olfactory ablation had on HD cells in blind animals. To do so, we used an alternative method to reliably identify HD cells in ADn regardless of whether they exhibited strong tuning preferences for head direction. Previous work50 noted that HD cells in ADn exhibit highly distinctive autocorrelograms. Similarly, we found that HD cells in ADn maintain their distinctive autocorrelograms in blind animals and following OSN ablation (Fig. 5f, Supplementary Fig. 5). Following Viejo and Peyrache50, we implemented a machine learning approach, using Extreme Gradient Boosting51 (XGB) to classify neurons as either HD or non-HD cells based on the shape of their autocorrelogram. First, we used our standard method (which relies on cells having highly selective preferred firing directions; see Methods) to define cells as either HD or non-HD cells, and trained the classifier using an equal number of HD and non-HD cells from WTL mice. To assess model generalizability, the classifier was validated on data from blind controls (pooled rd1 and Gnat1/2mut mice; Fig. 5g). Furthermore, the classifier characterized a similar percentage of cells in ADn in blind animals with intact OSNs and following OSN ablation (Fig. 5h; see Methods). We therefore used the classifier to define HD cells in ADn of blind mice following OSN ablation, allowing us to compare their response properties with those of HD cells in blind animals with intact OSNs. Using the same model to identify HD cells in blind controls and OSN-ablated animals, we found that OSN ablation led to a significant decrease in vector length (Fig. 5i). These results indicate that removing olfaction from blind animals results in untuned HD cells in ADn.
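The classifier's input feature, the spike-train autocorrelogram, can be illustrated with a short self-contained sketch. XGBoost itself is a third-party library, so only the feature-extraction step is shown here; the spike train is synthetic, and the 5 ms bin size and ±0.5 s window are invented for the example rather than taken from the paper's Methods:

```python
import numpy as np

def autocorrelogram(spike_times, bin_size=0.005, window=0.5):
    """Normalized spike-train autocorrelogram: histogram of all pairwise
    spike-time lags within +/- window, excluding the zero-lag self term."""
    edges = np.arange(-window, window + bin_size / 2, bin_size)
    counts = np.zeros(len(edges) - 1)
    for i, t in enumerate(spike_times):
        lags = np.delete(spike_times, i) - t       # lags to every other spike
        counts += np.histogram(lags[np.abs(lags) <= window], edges)[0]
    total = counts.sum()
    return counts / total if total else counts

rng = np.random.default_rng(3)
spike_times = np.sort(rng.uniform(0, 100, 400))    # toy Poisson-like train
acg = autocorrelogram(spike_times)                 # 200-bin feature vector
```

Vectors like `acg`, computed per neuron, would then serve as the feature matrix for a classifier such as XGB; an HD cell's characteristic temporal structure shapes this histogram independently of how sharply the cell is tuned at the moment of recording.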
Olfaction can modulate HD cell tuning in sighted animals
We next examined whether olfaction could also impact HD cell tuning in sighted animals, as we noted that HD cell tuning was more robust in sighted animals in the dark (WTD) than in blind animals following OSN ablation (Fig. 6a). We thus wondered whether the weak but residual tuning of HD cells in sighted animals in the absence of visual inputs could be arising from olfactory cues. To test this, we performed floor rotation experiments. Unlike for blind animals, HD cells in sighted animals in the dark did not exhibit a significant PFD shift towards the direction of floor rotation (Supplementary Fig. 6a). However, upon ablating OSNs in sighted animals, the percentage of cells recorded in ADn that passed the criteria to be designated as HD cells (using our standard method) significantly decreased in WTD (Fig. 6b), while the number of HD cells in ADn defined with an XGB model trained on blind data did not change (Fig. 6b), indicating that ablating olfaction significantly impaired the tuning of HD cells of WT animals in the dark. Ablating olfaction did not significantly affect the number of cells in ADn characterized as HD cells in WT animals in the light (Supplementary Fig. 6b), meaning that for these animals, turning the lights on and off toggled HD cell tuning from normal to completely untuned (Fig. 6c; Supplementary Fig. 6c). Thus, sighted animals can use both vision and olfaction to provide cue-anchoring information to the HD system, though in the open-field environment we used for experiments, vision appears to be more effective at tuning the HD system. Only in the absence of both visual and olfactory cues does the HD system become completely untuned.
Attractor dynamics in the absence of allothetic (external) sensory cues
We next tested how HD attractor dynamics were affected following the loss of cue anchoring visual and olfactory signals. We found that in the absence of both visual and olfactory cues, while HD cells became completely untuned, the activity of simultaneously recorded cells in ADn was well-described by a one-dimensional ring manifold generated with Isomap52 (Fig. 7a), similar to the HD network in blind control animals (Fig. 7a). A persistent homology topological analysis computed on the Isomap data19 was consistent with a one-dimensional ring architecture being preserved in the absence of vision, as well as in the absence of both vision and olfaction, even though in the absence of both vision and olfaction the population activity on the ring manifold representing the moment-to-moment head direction of the animal became decoupled from the actual head direction (Fig. 7a–c; Supplementary Fig. 7a–c; note that for the ‘No Vision and No Olfaction’ group, we pooled together blind animals following olfactory ablation and sighted animals in the dark (WTD) following olfaction ablation). For such a ring manifold to persist in the absence of visual and olfactory inputs, the relative spike timing of HD cells with respect to one another must be maintained as the animal moves around the environment, even though each cell no longer maintains a stable preferred firing direction over time. How could this arise? Previous work has shown that prior to eye-opening, while HD cell tuning is broad when measured over an extended period of time, if tuning curves are computed on short timescales then much more refined tuning curves can be measured22. Therefore, instead of computing tuning curves by averaging data across 10 min recording sessions as we have done until now, following the removal of both visual and olfactory inputs, we computed tuning curves each time an animal made an angular head rotation of 360° (which occurred on average every 25.4 ± 10.3 sec; see Methods). 
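As a rough, self-contained stand-in for the ring-manifold (Isomap) analysis described above, the following sketch embeds synthetic von Mises-tuned population activity with a minimal Isomap (kNN graph, Floyd-Warshall geodesic distances, classical MDS). All parameters are invented; the actual analysis used an established Isomap implementation52 plus persistent homology, which this sketch does not attempt:

```python
import numpy as np

def isomap_2d(X, k=8):
    """Minimal Isomap: kNN graph -> geodesic distances -> classical MDS to 2D."""
    n = len(X)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)  # pairwise dist
    D = np.full((n, n), np.inf)
    np.fill_diagonal(D, 0.0)
    nbrs = np.argsort(d, axis=1)[:, 1:k + 1]       # k nearest neighbors (no self)
    for i in range(n):
        D[i, nbrs[i]] = d[i, nbrs[i]]
        D[nbrs[i], i] = d[i, nbrs[i]]
    for m in range(n):                             # Floyd-Warshall shortest paths
        D = np.minimum(D, D[:, m:m + 1] + D[m:m + 1, :])
    J = np.eye(n) - 1.0 / n                        # double-centering matrix
    B = -0.5 * J @ (D ** 2) @ J                    # classical MDS Gram matrix
    vals, vecs = np.linalg.eigh(B)                 # eigenvalues ascending
    return vecs[:, -2:] * np.sqrt(np.maximum(vals[-2:], 0))

# Synthetic HD population: 30 von Mises-tuned cells as the head sweeps the circle
rng = np.random.default_rng(1)
n_cells, n_t = 30, 200
pfds = np.linspace(0, 2 * np.pi, n_cells, endpoint=False)
hd = np.linspace(0, 2 * np.pi, n_t, endpoint=False)
rates = np.exp(3.0 * (np.cos(hd[None, :] - pfds[:, None]) - 1)).T  # time x cells

emb = isomap_2d(rates + 0.02 * rng.normal(size=rates.shape))
emb -= emb.mean(axis=0)
radii = np.linalg.norm(emb, axis=1)
print(round(radii.std() / radii.mean(), 3))        # small ratio -> ring-like
```

Because each population state lies on a one-dimensional loop parameterized by head direction, the embedded points fall on a ring with nearly constant radius; a population lacking attractor structure would not produce this geometry.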
On these shorter intervals, we found that HD cells were highly tuned (Fig. 7c, d) and exhibited tuning curves similar to HD cells in control conditions (Supplementary Fig. 7d, e), but that their preferred firing directions continuously drifted over time, with all simultaneously recorded cells drifting in a coherent manner (Fig. 7c). Thus, in the absence of cue anchoring visual and olfactory signals, though HD cell tuning becomes unstable over long periods of time, the attractor network in ADn remains intact and HD cells exhibit sharp tuning curves over short time intervals.
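The 360°-rotation segmentation used for these short-timescale tuning curves can be sketched as follows, cutting the session each time cumulative unsigned head rotation reaches one full turn. The trajectory is a synthetic random walk, and the exact segmentation rule in the paper's Methods may differ in detail:

```python
import numpy as np

def rotation_segments(hd, full_turn=2 * np.pi):
    """Split sample indices into segments, starting a new segment each
    time cumulative unsigned head rotation reaches one full turn."""
    steps = np.abs(np.angle(np.exp(1j * np.diff(hd))))  # unsigned rotation/frame
    cum = np.cumsum(steps)
    n_full = int(cum[-1] // full_turn)
    cuts = np.searchsorted(cum, full_turn * np.arange(1, n_full + 1)) + 1
    return np.split(np.arange(len(hd)), cuts)

rng = np.random.default_rng(4)
hd = np.cumsum(rng.normal(0, 0.1, 6000)) % (2 * np.pi)  # toy head trajectory
segments = rotation_segments(hd)
print(len(segments))
```

A tuning curve computed within each segment samples every head direction roughly once, so a sharply tuned but slowly drifting cell still yields a sharp per-segment curve, with the PFD shifting from one segment to the next.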
We further examined the drift of the preferred firing direction of HD cells in the absence of both visual and olfactory inputs. To visualize this, we performed decoding using the Isomap analysis (Fig. 7f). We noticed that while the decoding became significantly impaired (Fig. 7g), the decoded head direction appeared to sometimes co-vary along with changes in actual head direction (Fig. 7f), though the total angular distance traveled by the animal was larger than the total decoded angular distance traveled (Fig. 7g). To test if a correlation existed between the animal’s head rotations and drift of preferred firing directions within the HD network, we compared angular head velocity signals (AHV; the vestibular signal provided to the head direction network) to angular drift velocity (ADV; a measure of the speed of the population drift of decoded head direction; see Methods). We found that in a subset of sessions, AHV and ADV were significantly correlated (Fig. 7i, j), suggesting that in at least some animals, the drift in preferred firing directions of HD cells was modulated by ongoing head rotations. Finally, it has been reported that in rodents before eye-opening, the broad tuning curves of HD cells arise in part from ‘under-signaling’ of AHV, with the error between actual and decoded AHV increasing as a function of actual AHV22. We therefore tested whether a similar phenomenon might underlie poor HD cell tuning in the absence of both vision and olfaction. Measuring the error between actual AHV and AHV decoded with the Isomap analysis, we found that in the ‘No vision, No olfaction’ group, decoded AHV undershot actual AHV, particularly at higher AHV values (Fig. 7k). Therefore, while the HD attractor network is intact in the absence of visual and olfactory inputs, either visual or olfactory landmarking cues are required to anchor the attractor network and enable stable HD cell tuning.
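The AHV-versus-ADV comparison can be illustrated with synthetic time series in which the decoded signal integrates only a fraction of each true head turn (an invented 'under-signaling' gain of 0.6); this is a toy model of the effect, not the paper's decoder:

```python
import numpy as np

def angular_velocity(theta, dt):
    """Wrapped frame-to-frame angular velocity (rad/s)."""
    return np.angle(np.exp(1j * np.diff(theta))) / dt

dt = 0.1
rng = np.random.default_rng(2)
steps = rng.normal(0, 0.15, 3000)                   # true head turns per frame
true_hd = np.cumsum(steps) % (2 * np.pi)
# Synthetic 'under-signaling': the network integrates only 60% of each turn
decoded_hd = np.cumsum(0.6 * steps + rng.normal(0, 0.05, 3000)) % (2 * np.pi)

ahv = angular_velocity(true_hd, dt)                 # actual head velocity
adv = angular_velocity(decoded_hd, dt)              # drift velocity of decode
r = np.corrcoef(ahv, adv)[0, 1]                     # AHV-ADV correlation
gain = np.sum(np.abs(adv)) / np.sum(np.abs(ahv))    # decoded vs actual distance
print(round(r, 2), round(gain, 2))
```

With this construction, AHV and ADV are strongly correlated while the total decoded angular distance falls short of the actual distance traveled, qualitatively matching the two signatures reported in the text.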
Without such allothetic external sensory cues, the preferred firing direction of HD cells becomes unanchored and drifts, such that the HD network in ADn no longer encodes the animal’s actual head direction. For some animals, the drift of preferred firing directions appears to be coupled to ongoing angular head velocity signals, and at least part of the error in properly decoding head direction appears to arise from the under-signaling of angular head velocity signals.
Discussion
Recording from head direction cells in ADn of both sighted and blind mice, we find stable and robust HD tuning in all animals. Remarkably, the HD system is flexible in which sensory system it can use for obtaining reliable anchoring to environmental landmarks: sighted animals predominantly use visual signals, whereas blind animals use olfactory signals. Additionally, there appears to be a critical period soon after eye-opening in which vision is required to fully refine and mature the HD system. Finally, in the absence of both visual and olfactory cues, the HD attractor network remains intact, but lacking any environmental anchors to lock onto, preferred firing directions of HD cells drift over time.
External (allothetic) sensory cues are required for stabilizing the head direction system
The essential input to the HD system is thought to be vestibular—neurotoxic lesions of the vestibular nerve abolished HD cell tuning in ADn23,25. In contrast, though vision is known to play a strong role in landmarking the HD cell system12, since many studies have found relatively stable HD cell tuning following the removal of familiar visual cues or when animals were placed in the dark14,26,27,28, it has been argued that animals only require self-generated idiothetic sensory cues for stabilizing HD cell tuning12,31,32. However, a purely idiothetic mechanism for HD cell stability appears unlikely, since, with an exclusively internal path-integration system, each error and imperfection would be left uncorrected, likely leading over time to an unstable system. Our data are consistent with a model in which idiothetic (internal) and allothetic (external) sensory signals play complementary roles in stabilizing HD cell tuning: both ablating olfaction in blind animals and placing sighted animals in the dark following olfactory ablation lead to a total loss of HD cell tuning stability. This means that animals require externally generated, allothetic environmental cues processed via either visual or olfactory systems, or both together, in order to stabilize the preferred firing direction of HD cells in ADn. Our results also indicate that idiothetic cues alone (for instance arising from vestibular, proprioceptive, or motor efference copy systems) are insufficient to stabilize HD cell tuning in ADn, at least over long periods of time. Furthermore, our results suggest that previous studies describing stable HD cell tuning in the absence of visual inputs likely arose from either insufficient removal of all visual cues or from the presence of olfactory cues. Thus, vestibular and visual/olfactory signals appear to play complementary roles in stabilizing HD cell tuning in ADn.
Aside from visual and olfactory inputs, our results rule out a contribution of vibrissal signals in stabilizing HD cell tuning within our specific experimental environment, consistent with recent findings from the somatosensory cortex38. Nonetheless, one can imagine that in a different environmental context (e.g., where the walls or object contours provide useful spatial information) that the vibrissal system could possibly affect HD cell tuning. Next, since our experiments were done in the presence of white noise, we did not directly test possible auditory control of HD cell tuning. As such, future studies will be required to test for the possible use of auditory cues in stabilizing HD cell tuning in blind mice, though previous work in blindfolded rats suggests that auditory cues do not exert a strong effect on HD cell tuning31. However, it should be noted that a recent study found that some mice can echolocate53.
Olfaction, HD cell tuning, and cognitive maps
Our results indicate that vision and olfaction can be used, either independently or in tandem, to stabilize HD cell tuning. Visual signals are believed to enter the HD system via inputs from the visual cortex to the post-subiculum and retrosplenial cortex12. With respect to olfactory inputs, the pathway into the HD system remains unclear, though direct projections from the olfactory bulb and piriform cortex to the entorhinal cortex54,55 could, in turn, reach ADn via post-subiculum, which is reciprocally connected to the entorhinal cortex and ADn56. Alternatively, olfactory signals could reach the HD system indirectly via the projection of the post-subiculum to the lateral mammillary bodies, which constitute the main subcortical input to the ADn12. Future studies examining these pathways in more detail in both sighted and blind mice will be required to develop a better understanding of the circuitry involved.
We find that, at least in our open-field environment, vision promotes the most refined HD cell tuning, with olfaction driving very robust but slightly less refined HD tuning in blind mice. Furthermore, blind mice appear more capable of using olfaction to stabilize HD cell tuning than sighted animals, which could relate to the finding that blind mice appear to have larger olfactory bulbs and perform better on some olfactory tests than sighted controls57. Overall, our results are consistent with a foraging study in rats that posited a sensory hierarchy in spatial navigation, with vision being at the top of the hierarchy, followed by olfaction58, though it is possible that such a hierarchy could change depending on the exact nature of the environment and which sensory system is most informative in a given condition.
Outside the HD cell system, a recent study has shown that spatial maps can be found in the piriform cortex during a spatial-olfactory learning task, with spatially encoding piriform cortex cells being linked to hippocampal theta37. Another recent study found that olfactory cues can modulate place cell responses in the hippocampus and aid in path integration36. Consistent with these findings, earlier studies in rats found intact place cells in blind animals59, as well as a contribution of olfactory inputs to place cell stability60. Therefore, seeing as olfactory inputs can enable robust and stable place cell and HD cell coding in the brain, olfaction needs to be considered an important sensory system for the generation of cognitive maps.
What olfactory cues do mice use to landmark their HD system? Previous work indicates that rodents can use a variety of odors, including self-generated odors, conspecific odors, and other non-animal odors, to guide spatial navigation36,37,61. In our experiments, for each mouse, its first open-field exposure began on a new floor, which had never been explored by another mouse, meaning the mice had to use cues intrinsic to the floor or that were self-generated.
How does a mouse use olfaction to guide spatial navigation? Odor cues that are located on the floor, which appear to be the cues the animals used in our experiments to guide spatial awareness, would likely smell the same when approached from every direction. Thus, the animal would likely need to consider more than a single odor cue in order to gain useful spatial information. In such a scenario, the animal would need to distinguish spatial location by the relative intensity of multiple distinct odor cues. Furthermore, as the intensity of an olfactory cue will change with the animal’s distance from the cue, it would be advantageous if the mouse were actively monitoring multiple odor gradients during navigation. Is such a strategy feasible for mice? First, rodents appear capable of smelling in stereo62, which would greatly enhance olfactory spatial resolution and olfactory navigation capabilities. Second, there is evidence from experiments in rodents that olfactory coding can function in ‘snapshots’—i.e., discrete olfactory coding bouts linked to each sniff63—with a single sniff being sufficient for fine odor discrimination64. Third, rodents can identify a specific odor within a mixture of odors64,65. As such, it appears that the rodent olfactory system functions in a manner that would enable it to effectively serve spatial navigation needs.
Vision around the time of eye-opening is required for refining HD cell tuning
Hubel and Wiesel discovered a critical period during development for ocular dominance plasticity in the visual cortex66, and similar critical periods have been shown to exist for many other sensory systems and behaviors67. Here, we provide evidence of a critical period in the refinement and maturation of the HD system in ADn that depends on visual inputs in the period shortly after eye-opening: rd1 mice, which have attenuated vision upon eye-opening before going blind around P3040,47, have more refined HD cell tuning as adults than congenitally blind Gnat1/2mut mice. Both rd1 and Gnat1/2mut mice go blind due to problems with retinal photoreceptors. rd1 mice, a commonly used mouse model of retinitis pigmentosa39, go blind as a result of photoreceptor degeneration caused by a mutation in the rod phosphodiesterase (a mutation of the Pde6B gene), which leads first to rod death and then to cone death. Gnat1/2mut mice are blind as a result of mutations in both the rod and cone forms of the alpha subunit of the G-protein transducin (mutations in both the Gnat1 and Gnat2 genes), resulting in nonfunctional rod and cone photoreceptors (see Methods). As the mutated genes in both mouse lines are predominantly expressed in photoreceptors, the difference in HD cell tuning between blind adult Gnat1/2mut and rd1 mice is likely a direct result of the timing of the onset of vision loss (congenital vs. ~P30). We thus propose that vision around the time of eye-opening allows the HD cell system to stabilize (i.e., upon eye-opening, vision enables HD cells to exhibit heightened stability in their preferred direction tuning29,30), and that this stability drives refinement and maturation of the HD network, such that although rd1 and Gnat1/2mut mice are equally blind as adults, rd1 mice have more refined HD cell tuning.
These findings are consistent with multisensory studies in the superior colliculus, which showed the importance of vision during development for enabling the generation of normal auditory space maps43,44,68.
Attractor dynamics in the presence and absence of anchoring external sensory inputs
The HD network is often modeled as a continuous ring attractor11,18,19,20. Evidence in favor of an attractor network is plentiful, from the finding that the relative difference in preferred directions between a given pair of HD cells is maintained across different environments and during visual cue rotation experiments21, to the finding that HD cells that fire coherently during spatial navigation continue to do so during sleep11, to the finding that before eye-opening—when HD cell tuning curves are broad—pairs of cells exhibit similar coherence during spatial navigation as is seen in the adult HD system22,29,30. Furthermore, a recent manifold analysis has validated a one-dimensional ring as a robust description of the rodent HD network19. Our results in blind mice are consistent with those of a continuous ring attractor. Performing a manifold analysis revealed a one-dimensional ring attractor similar to that found in normally sighted animals. Additionally, upon ablating OSNs in blind animals, while this completely removed stability of HD cell preferred direction, attractor dynamics and the ring architecture remained intact.
We find that—for normally sighted and blind animals—the removal of both visual and olfactory inputs causes the ‘hill of activity’ within the HD attractor network to drift independently of the animal’s true head direction (i.e., the preferred direction of all simultaneously recorded cells drifts coherently). For some animals, there was a significant correlation between angular head velocity (AHV) and the angular drift velocity of PFDs. Furthermore, we found that in the absence of vision and olfaction, the decoded AHV under-signaled the true AHV.
A similar drift in HD cell preferred firing direction appears to have been described in a previous study recording from HD cells in the lateral dorsal nucleus in rats placed in the dark, though those animals were placed in a radial maze, making it difficult to draw exact parallels to our results33. In a more recent study in mice placed in the dark, it was reported that ~40% of HD cells became unstable in the dark, with one example HD cell recording being shown where the instability resulted in the preferred firing direction exhibiting a drift similar to what we describe34. However, a follow-up paper from the same lab showed no effect of dark exposure on HD cell stability and instead found that optogenetic silencing of the nucleus prepositus hypoglossi (NPH)—which relays vestibular signals to the HD system—was required in tandem with dark exposure to cause the preferred direction of a subset of HD cells recorded in ADn to drift over time14. This latter study also found that preferred firing direction drift was sometimes correlated with the animal’s head turns14. Finally, at least some portion of the drift in HD preferred firing directions that we find appears to be related to the effect described for HD cells prior to eye-opening, where the broadness of tuning curves resulted in part from an under-signaling of angular head velocity22. Nonetheless, future work is required to better understand the circuit and synaptic basis whereby different sensory inputs enable tuning within the HD attractor network to stabilize, and how perturbations to different sensory modalities can alter the input to the HD network and result in drift.
Relating head direction cell recordings in blind rodents to spatial cognition in visually impaired humans
While vision is often considered the most important sensory system for guiding spatial cognition in humans, numerous studies have shown that visually impaired individuals maintain robust spatial cognition4,5,69,70. Unlike in blind mice—where olfaction appears to be the most important sensory system for guiding spatial awareness following vision loss—in visually impaired humans, the auditory system is generally thought to be a more important contributor to vision-independent spatial awareness, with blind humans appearing to exhibit enhanced sound localization compared to sighted subjects71 and performing echolocation via self-produced sounds72,73,74. However, there is emerging evidence that humans can also effectively use olfaction to inform spatial cognition75,76,77, even being able to navigate using stereo olfaction78, meaning that olfaction could be used following vision loss to aid in spatial awareness and navigation79. Relatedly, visually impaired humans appear to have enhanced olfactory abilities compared to sighted individuals80,81,82,83, which could further help visually impaired individuals use olfaction to guide spatial awareness and navigation. Next, while there is active debate surrounding the extent to which visual experience in humans is required for developing normal spatial cognitive abilities4,6, some studies have indicated that late-blind individuals perform better on certain spatial cognition tasks than congenitally blind individuals84,85,86,87, reminiscent of our findings regarding HD cell tuning being slightly more refined in rd1 mice compared to congenitally blind Gnat1/2mut mice. Thus, our findings related to HD cells in sighted and blind mice are likely highly relevant for understanding the neural underpinnings of spatial cognition in humans following vision loss.
Methods
Animals
All procedures were performed in accordance with the Canadian Council on Animal Care and approved by the Montreal Neurological Institute’s Animal Care Committee. Three strains of mice were used in this study: wild-type mice (C57Bl/6; Charles River strain code 027), rd1 mice (also known as C3H; The Jackson Laboratory #000661), and Gnat1/2mut mice (Gnat2cpfl3 Gnat1irdr/Boc mice; The Jackson Laboratory #033163). All mice were adults (2–4 months old; P75-125) weighing between 20 and 32 g. Mice of both sexes were included. Mice were maintained on a standard 12 h light, 12 h dark cycle, in ventilated and humidity-controlled racks, at standard room temperature.
Surgery and implantation
Mice were anesthetized with a cocktail containing fentanyl (0.05 mg/kg), medetomidine (0.5 mg/kg), and midazolam (5 mg/kg)88, and a craniotomy was performed above ADn for probe implantation. A conductive wire was inserted into the cerebellum to serve as a reference. After attaching the reference wire, a 4-shank silicon probe (NeuroNexus, Ann Arbor, MI; 200 µm inter-shank spacing) with 8 recording sites on each shank (probe model Buz32) was lowered into the brain towards ADn based on the following stereotactic coordinates: antero-posterior (AP) −0.4 mm; medio-lateral (ML) −0.76 mm; dorso-ventral (DV) 2.16 mm. The base of a movable drive holding the silicon probe was then fastened to the skull using dental acrylic cement and a light-cure adhesive (Kerr OptiBond Universal Unidose) to allow for stable recordings during awake open-field sessions.
Electrophysiological recordings
After recovery (5–7 days post-surgery), the probe was advanced daily during sleep in steps of <300 µm in the home cage until HD units were detected in ADn. Prior to open-field recording sessions, screenings were done during sleep in the home cage. Preliminary detection of putative HD cells was based on inspection of autocorrelograms of recorded units, as previously described50. Once putative HD units were detected, experiments were conducted by placing animals in a dark cylindrical open-field arena (60 cm in diameter) with a single white visual cue placed on the wall above the animal’s reach. Open-field recordings lasted 10 min per session. Neurophysiological signals were acquired at 20 kHz (16-bit) using an Intan RHD2000 Recording System. The raw neuronal signal was high-pass filtered and processed with an automated spike-sorting algorithm to extract single units (Kilosort289). Isolated units were manually curated in Klusters90 based on autocorrelograms and spike waveforms. For a given animal, after cells were first encountered with the recording electrode, the electrode was advanced ~100 µm per day, and cells recorded on different days were therefore considered to be different cells; thus, a single animal could contribute multiple ‘sessions’ with unique cells to a given experiment. The only exception was the recordings across days following OSN ablation shown in Fig. 5d, where cells were recorded over a 2-day period and the sessions were merged into a single continuous recording before spike sorting to ensure that the same units were maintained for further analysis.
Position tracking
For position tracking, an infrared-based (850 nm IR) camera recording system equipped with 8 cameras recording at 120 FPS was mounted above the recording arena to capture the movements of the animal (Optitrack Flex 13). Four reflective infrared markers were attached to the animal’s head-stage for tracking (6.4 mm Optitrack M3 Markers). After recording, Motive motion capture software (Optitrack) was used to extract both the direction and position of the head relative to the environment in three dimensions. It should be noted that for dark experiments, we found that sighted mice could see the visual cue in ‘the dark’ when we used the 850 nm IR illumination that came with the Optitrack cameras (i.e., HD cells followed the visual cue in ‘the dark’ when it was moved to a new location (Supplementary Fig. 1)). This is consistent with findings from a recent paper about red light perception in rats91. As such, for dark experiments, we installed a custom-made array of 940 nm infrared LEDs (Adafruit, Product # 388) that we used for illumination and marker detection. Importantly, under 940 nm IR illumination HD cells did not exhibit visual cue-controlled changes in their preferred firing direction.
Standard HD cell classification
To identify HD cells, as mentioned above, we first used autocorrelograms to localize the electrode in ADn and identify putative HD cells located in ADn50. Subsequently, for all cells recorded in ADn, tuning curves were constructed by aligning spike times with the closest head direction point in the horizontal plane. Spikes were then counted for bins of 6°. To correct for angular sampling bias, spike rates were normalized by the time spent in each angular bin, as previously described11. The resulting tuning curves representing the normalized firing rate of each cell as a function of head direction were then smoothed with a Gaussian kernel (s.d. = 3). Units that passed a Rayleigh test for significant non-uniform circular distribution (p < 0.0001), had a z-test score > 50, and a peak firing rate > 1 Hz, were classified as HD cells.
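The classification steps above can be sketched as follows. This is a simplified illustration (the function names are ours, the circular smoothing and Rayleigh z statistic are implemented directly, and the separate Rayleigh p-value criterion is omitted for brevity):

```python
import numpy as np

def hd_tuning_curve(spike_dirs, occ_dirs, dt, bin_deg=6, sd_bins=3.0):
    """Occupancy-normalized tuning curve from spike-time and tracking head
    directions (radians), smoothed with a circular Gaussian kernel."""
    edges = np.arange(0, 360 + bin_deg, bin_deg)
    spk, _ = np.histogram(np.degrees(spike_dirs) % 360, bins=edges)
    occ, _ = np.histogram(np.degrees(occ_dirs) % 360, bins=edges)
    rate = spk / np.maximum(occ * dt, 1e-9)          # Hz, corrected for angular sampling bias
    h = int(3 * sd_bins)
    k = np.exp(-0.5 * (np.arange(-h, h + 1) / sd_bins) ** 2)
    k /= k.sum()
    wrapped = np.r_[rate[-h:], rate, rate[:h]]       # wrap ends for circular smoothing
    return np.convolve(wrapped, k, mode='same')[h:-h]

def rayleigh_z(dirs):
    """Rayleigh z = n * R^2, testing non-uniformity of circular data."""
    R = np.abs(np.exp(1j * np.asarray(dirs)).mean())
    return len(dirs) * R ** 2

def is_hd_cell(spike_dirs, occ_dirs, dt):
    """HD if strongly non-uniform (z > 50) with a peak rate above 1 Hz."""
    tc = hd_tuning_curve(spike_dirs, occ_dirs, dt)
    return bool(rayleigh_z(spike_dirs) > 50 and tc.max() > 1.0)
```

A sharply tuned cell (e.g., von Mises-distributed spike directions) passes both criteria, while a cell firing uniformly across directions fails the Rayleigh test.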
Extreme gradient boosting (XGB) model for HD cell classification
Owing to the loss of stable HD cell tuning in blind animals following olfactory sensory neuron ablation, cells recorded in ADn in this condition were classified as HD/non-HD cells using an XGB model50,51, implemented in Python. In control animals (sighted animals in the light (WTL)) we used the standard HD cell classification method (see above) to define cells in ADn as either HD or non-HD cells. Next, we generated autocorrelograms from these HD and non-HD cells using 2 ms bins and a window of 400 ms. To capture potential variability in HD autocorrelogram shapes over the course of each session, autocorrelograms were generated from the 1st and 2nd half of each recording session before stacking them together. Since autocorrelograms are symmetrical, for further processing, we only used the halves corresponding to the positive lag. The following pre-processing steps were carried out prior to model training. First, we ensured that the classes (HD/non-HD) were equally represented in the training set. Second, each half of the stacked autocorrelograms was smoothed with a Gaussian kernel (s.d. = 1.5). Next, using default parameters, the XGB model was trained on all pre-defined (i.e., standard-method-labeled HD and non-HD cells) autocorrelograms, and then the performance was evaluated on the data from blind mice (pooled between rd1 and Gnat1/2mut mice). The model performance on the test data had an accuracy of 90% and an F1 score of 90.5% in correctly classifying cells recorded in ADn of blind animals (Fig. 5g). Next, the trained model was used to classify ADn cells recorded in blind animals following OSN ablation as either HD or non-HD cells based on their autocorrelograms (Fig. 5h, i). In addition, in order to use the XGB model to define HD cells in WTD following OSN ablation (Fig. 6b), we trained the model on HD cells recorded from blind animals to ensure no data leakage between the testing and training set.
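The feature-construction step for the classifier can be sketched as below; the function name and implementation details are ours, and the classifier fit itself (shown only as a comment) assumes the xgboost package with default parameters, as stated in the text:

```python
import numpy as np

def autocorr_feature(spike_times, t_half, bin_s=0.002, win_s=0.4, sd=1.5):
    """Positive-lag spike-time autocorrelograms from the 1st and 2nd half of a
    session (2 ms bins, 400 ms window), Gaussian-smoothed (s.d. = 1.5 bins)
    and stacked into a single feature vector."""
    feats = []
    for half in (spike_times[spike_times < t_half],
                 spike_times[spike_times >= t_half]):
        lags = half[None, :] - half[:, None]          # all pairwise lags
        lags = lags[(lags > 0) & (lags <= win_s)]     # keep positive lags only
        acg, _ = np.histogram(lags, bins=np.arange(0, win_s + bin_s, bin_s))
        k = np.exp(-0.5 * (np.arange(-5, 6) / sd) ** 2)
        k /= k.sum()
        feats.append(np.convolve(acg.astype(float), k, mode='same'))
    return np.concatenate(feats)

# Classifier fit (assumed usage; requires the xgboost package):
# from xgboost import XGBClassifier
# clf = XGBClassifier().fit(X_train, y_train)   # default parameters, as in the text
```

Each cell thus contributes a 400-dimensional feature vector (two 200-bin half-session autocorrelograms) to the balanced training set.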
Visual cue rotation
The extent to which the single white cue card on the arena wall controlled the preferred firing direction (PFD) of HD cells was tested by rotating the visual cue by either 90° or 180°. Following a standard session where the baseline PFDs for all simultaneously recorded HD cells was established, the animal was removed from the arena and disoriented, as previously described10. The animal was then reintroduced to the arena after the visual cue was rotated. For all visual cue rotation experiments, the floor of the arena was thoroughly cleaned with 70% ethanol to eliminate olfactory cues on the floor before the animal was reintroduced. The visual cue control measure (gain) shown in Fig. 1g was computed as follows:
$$\mathrm{gain}=\frac{\Delta \mathrm{PFD}}{\Delta \mathrm{cue}}\qquad(1)$$
where \(\Delta \mathrm{PFD}\) is defined as the change in the preferred firing direction (in degrees) for each HD cell following reintroduction into the arena, and \(\Delta \mathrm{cue}\) is defined as the extent (either 90° or 180°) that the visual cue was rotated.
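The gain computation can be sketched as below (our helper; the signed circular wrapping is an assumption, and note that for a 180° rotation the sign of a full shift is ambiguous, since +180° and −180° are equivalent):

```python
import numpy as np

def cue_control_gain(pfd_before_deg, pfd_after_deg, cue_rotation_deg):
    """Gain = (circular change in preferred firing direction) / (cue rotation).
    A gain of 1 means the PFD followed the cue exactly; 0 means no shift."""
    d = (np.asarray(pfd_after_deg, float)
         - np.asarray(pfd_before_deg, float) + 180.0) % 360.0 - 180.0
    return d / cue_rotation_deg
```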
Floor rotation
To assess floor-controlled changes in PFDs of HD cells, a protocol similar to the visual cue rotation described above was adopted, except where the floor of the arena was rotated. For these experiments, to avoid possible interference of odor markings across different animals, a brand new floor was used for each mouse. At the same time, to give mice the opportunity to use olfactory odors on the floor, the floor of the arena was not cleaned between exposures. The extent to which floor rotations influenced the PFD of HD cells across animals was computed using Eq. (1). For cue/floor rotation sessions, 10,000 shuffles were generated using re-exposure sessions where animals were re-exposed to the same environment in the absence of any manipulation. To generate the gain values expected by chance, the gain was computed after randomly applying the experimental cue/floor rotation angles to the re-exposure data.
Olfactory sensory neuron (OSN) ablation
To eliminate (or at least greatly reduce) the sense of smell, OSNs were chemically ablated based on established methods48,92. In brief, mice were anesthetized with isoflurane after which a blunted 33-gauge needle was used to administer 20 μl of ZnSO4 (10 % in sterile H2O) in both nostrils. Previous work has shown that this leads to the rapid death of OSNs48,92. After intranasally administering the chemical, mice were inverted to drain the excess fluid from the nasal cavity. Behavioral and electrophysiological recordings were carried out at least 24 h following OSN ablation.
2-Chamber olfactory test
To test the efficacy of OSN ablation, in some treated animals, we tested their sense of smell using a 2-chamber odor test. Animals were first habituated to the chambers for 5 min each day over a period of 3 days, with no odors added to the chambers. After habituation, testing began by placing two filter papers infused with 10 µl of 3-methyl-1-butanethiol (aversive odor49) in a randomly selected chamber, and two filter papers infused with 10 µl of distilled water (neutral odor) in the other chamber. All mice were placed in the neutral chamber at the start of the test and were monitored for 10 min using an overhead camera to record their baseline occupancies in both chambers. Post-OSN ablation, the test was repeated, and compared to the baseline. The chamber preference (CP) score in Fig. 5 was computed as follows:
$$\mathrm{CP}=\frac{t_{\mathrm{neut}}-t_{\mathrm{avers}}}{t_{\mathrm{neut}}+t_{\mathrm{avers}}}$$
where \(t_{\mathrm{neut}}\) and \(t_{\mathrm{avers}}\) refer to the time spent in the neutral and aversive chambers, respectively. Chamber preference scores of −1 and 1 represent a strong preference for the aversive and neutral sides, respectively, where the animal spent the entire time in one chamber.
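The chamber preference score reduces to a one-line function (illustrative):

```python
def chamber_preference(t_neut, t_avers):
    """Chamber preference in [-1, 1]: +1 means all time in the neutral
    chamber, -1 means all time in the aversive chamber."""
    return (t_neut - t_avers) / (t_neut + t_avers)
```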
Bayesian decoding
To predict the HD of the animal, given the spiking activity of HD cells within a given time window, a Bayesian decoding algorithm41 was used to compute the posterior probability distribution using the formula:
$$P(\mathrm{dir}\mid \mathrm{spk})=\frac{P(\mathrm{spk}\mid \mathrm{dir})\,P(\mathrm{dir})}{P(\mathrm{spk})}$$
where dir represents the set of possible angular head directions, and spk represents the spike counts of simultaneously recorded HD cells within a 200 ms time window in this instance. Bayesian decoding was used to predict the HD of animals in the second half of a recording session based on priors generated from tuning curves in the first half. The mean absolute decoding error was computed as the absolute difference between the decoded angle and the actual angle over the decoded time window.
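A minimal sketch of the decoding step, assuming independent Poisson spiking and a flat prior over directions (the paper's exact likelihood model follows ref. 41; the function below is ours):

```python
import numpy as np

def bayes_decode(tuning_curves, spike_counts, window_s=0.2):
    """Posterior over head-direction bins given a population spike-count
    vector, under independent Poisson spiking and a flat prior.

    tuning_curves : (n_cells, n_bins) mean firing rates in Hz
    spike_counts  : (n_cells,) spike counts in one 200 ms window
    """
    lam = tuning_curves * window_s                 # expected counts per direction bin
    log_post = (spike_counts[:, None] * np.log(lam + 1e-12) - lam).sum(axis=0)
    post = np.exp(log_post - log_post.max())       # subtract max for numerical stability
    return post / post.sum()
```

The decoded direction is then the posterior's argmax (or circular mean), and the decoding error is the circular distance to the tracked head direction.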
Isomap projection
Isomap analysis was performed on sessions with at least 10 HD cells based on a threshold previously used50. Similar to Viejo and Peyrache50, spike counts were binned (200 ms) and square-root transformed to normalize for variance in spike rates. The normalized rates from all simultaneously recorded units in ADn served as inputs to the Isomap algorithm52, implemented in Python93. The number of neighbors was set to 50. The resulting output of the algorithm formed the basis of the low dimensional ring manifolds shown in Fig. 7a and Supplementary Fig. 7a–c. The color code of the population vector on the ring manifold was mapped onto the corresponding actual HD for each corresponding time bin. Isomap decoded angular head velocity was generated by taking the angular difference between two consecutive points after computing the element-wise arc tangent of the point cloud. For Isomap shuffles, each cell’s binned spike counts were randomly shifted in time to generate a shuffled matrix before applying the Isomap algorithm (Fig. 7a, Supplementary Fig. 7a–c).
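The angle and angular-velocity readout from the ring manifold can be sketched as below; the embedding itself would come from an Isomap fit (e.g., sklearn.manifold.Isomap(n_neighbors=50, n_components=2), shown only as a comment), and the helper names are ours:

```python
import numpy as np

# Embedding step (assumed usage; requires scikit-learn):
# from sklearn.manifold import Isomap
# emb = Isomap(n_neighbors=50, n_components=2).fit_transform(np.sqrt(binned_counts))

def decode_ring_angles(embedding_2d):
    """Angle of each population-vector point on a 2-D ring embedding
    (e.g., the first two Isomap components), after centering the cloud."""
    c = embedding_2d - embedding_2d.mean(axis=0)
    return np.arctan2(c[:, 1], c[:, 0])

def decoded_ahv(angles, bin_s=0.2):
    """Angular velocity (rad/s) from the wrapped difference between
    consecutive decoded angles."""
    return np.angle(np.exp(1j * np.diff(angles))) / bin_s
```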
Drift analysis
Following the loss of vision and olfaction, the population drift was computed using the Isomap ring manifold to provide a fine temporal resolution. The drift was computed with the following steps. First, the actual HD was down-sampled to match the time bin (200 ms) used for computing the Isomap ring manifold. Following this, the formula below was used to compute the angular difference (Angdiff) between the actual and Isomap decoded HD.
$$\mathrm{Ang}_{\mathrm{diff}}=\mathrm{HD}_{\mathrm{actual}}-\mathrm{HD}_{\mathrm{iso}}$$
where \(\mathrm{HD}_{\mathrm{actual}}\) and \(\mathrm{HD}_{\mathrm{iso}}\) refer to the actual and Isomap-decoded HD, respectively. Next, the angular difference over time was unwrapped and smoothed with a Gaussian kernel (s.d. = 2), from which drift was computed as the absolute average rate of change (Fig. 7e). The shuffles shown in Fig. 7j were computed by correlating the AHV of each animal to the ADV of all other animals to generate a null distribution of AHV vs. ADV correlations.
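A sketch of the drift computation (our implementation; the edge padding used when smoothing is an assumption, since the boundary handling is not specified in the text):

```python
import numpy as np

def drift_rate(hd_actual, hd_iso, bin_s=0.2, sd_bins=2):
    """Absolute average rate of change (rad/s) of the unwrapped,
    Gaussian-smoothed angular difference between actual and decoded HD."""
    diff = np.angle(np.exp(1j * (np.asarray(hd_actual) - np.asarray(hd_iso))))
    u = np.unwrap(diff)                              # remove +/- pi discontinuities
    k = np.exp(-0.5 * (np.arange(-6, 7) / sd_bins) ** 2)
    k /= k.sum()
    h = len(k) // 2
    padded = np.r_[np.full(h, u[0]), u, np.full(h, u[-1])]   # edge-pad to limit border artifacts
    sm = np.convolve(padded, k, mode='same')[h:-h]
    return np.abs(np.diff(sm)).mean() / bin_s
```

When the decoded HD tracks the actual HD, the drift rate is zero; a steady offset that grows linearly in time yields a constant drift rate.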
Quantification of ADn units
To ensure that cells counted on recording shanks were recorded in ADn, the following was implemented: first, autocorrelograms were generated from all cells on shanks that picked up at least 1 HD cell. Second, a Fourier transform was applied to the autocorrelograms to identify and exclude theta (4–8 Hz) modulated cells. Although some studies have reported the presence of theta modulation in surrounding anterior thalamic structures94,95, a recent study found that cells in ADn are not theta modulated50. Hence, the total count of ADn cells was derived from all non-theta-modulated units. Finally, the proportion of ADn units characterized as HD cells was derived from the total number of units classified as HD cells (see Methods: HD classification) divided by the total number of cells recorded in ADn. In Figs. 2 and 3, the difference between proportions was tested using a 2-sample Z-test for proportions as previously described22.
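The theta-exclusion step can be sketched as below; the spectral peak-to-baseline ratio and its threshold are illustrative assumptions, as the text states only that a Fourier transform of the autocorrelograms was used to identify 4–8 Hz modulation:

```python
import numpy as np

def is_theta_modulated(acg, bin_s=0.002, band=(4.0, 8.0), ratio_thresh=2.0):
    """Flag an autocorrelogram whose power spectrum has a theta (4-8 Hz)
    peak well above the mean power at other (>1 Hz) frequencies.
    The ratio threshold is illustrative, not the paper's exact criterion."""
    acg = np.asarray(acg, float)
    acg = acg - acg.mean()                         # remove the DC component
    power = np.abs(np.fft.rfft(acg)) ** 2
    freqs = np.fft.rfftfreq(len(acg), d=bin_s)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    out_band = (freqs > 1.0) & ~in_band
    return bool(power[in_band].max() > ratio_thresh * power[out_band].mean())
```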
Analysis of head direction cells
Mean firing rate: The temporal average of spike counts. Peak firing rate: The maximum occupancy-normalized spike count per second. Mean vector length: The circular spread of spikes, with 0 and 1 representing uniform and strongly non-uniform circular spread, respectively. In Fig. 7d, for each cell, the mean vector length for the short intervals was computed by averaging the vector length values across all 360° head turns in a session. To control for biases introduced by relative differences in spike counts, for each epoch, a corresponding number of spikes was randomly selected from the full session before computing the vector length. Tuning width: The full width at half-max of a cell’s tuning curve. Mutual information: An estimate of the directional information relating the firing rate of a cell to a given direction. The formula96 is given as follows:
$$I=\sum_{j}P_{j}\,\frac{\lambda_{j}}{\lambda}\,\log_{2}\frac{\lambda_{j}}{\lambda}$$
where \(I\) refers to the information content in bits/spike, \(P_j\) is the probability of occupying bin j, \(\lambda_j\) is the mean firing rate in bin j, and \(\lambda\) is the in-session mean firing rate of the cell. Preferred firing direction (PFD): Similar to Lozano et al.97, PFD was defined as the angular bin with the highest normalized spike counts, whereas the mean PFD refers to the circular mean of angles. In Fig. 1e, stability was assessed based on the circular correlations of mean PFDs of exposure 1 and 2 across all recorded animals. Circular statistics were computed using the astropy module (v5.1) in Python.
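The information measure is a direct transcription of this definition, computed from a tuning curve and an occupancy distribution (our helper; zero-rate bins are excluded, since their contribution to the sum is zero):

```python
import numpy as np

def directional_information(tuning_curve, occupancy_prob):
    """Skaggs-style information (bits/spike): sum_j P_j (lam_j/lam) log2(lam_j/lam),
    where lam is the occupancy-weighted mean rate."""
    tuning_curve = np.asarray(tuning_curve, float)
    occupancy_prob = np.asarray(occupancy_prob, float)
    lam = np.sum(occupancy_prob * tuning_curve)      # overall mean firing rate
    ratio = tuning_curve / lam
    valid = ratio > 0                                # 0 * log(0) -> 0 by convention
    return np.sum(occupancy_prob[valid] * ratio[valid] * np.log2(ratio[valid]))
```

A perfectly uniform tuning curve carries zero information; a cell firing only in one of 60 equally occupied bins carries log2(60) bits/spike.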
Intraclass correlation (ICC)
We used ICC to assess the degree of correlation due to clustering of HD cells within an animal/session vs. across animals/sessions98. In this instance, each session from an animal within a given strain (e.g., WTL, rd1, etc.) is considered as a class where the population ICC metric evaluates the ratio of the between-class variance to the total strain variance with respect to measured metrics based on the formulation below:
$$\mathrm{ICC}=\frac{\sigma_{b}^{2}}{\sigma_{b}^{2}+\sigma_{e}^{2}}$$
where \({\sigma }_{b}^{2}\) denotes the between-class variance and \({\sigma }_{e}^{2}\) denotes the within-class variance.
Here, for each strain, ICC was used to determine whether cells from individual animals/sessions were more homogeneous than the strain group as a whole. ICC scores normally fall between 0 and 1. An ICC of 0 suggests the absence of correlation due to clustering, whereas an ICC of 1 suggests a strong correlation due to clustering, implying greater homogeneity among cells from an individual animal and greater heterogeneity between cells across animals; in that case, cells should not be treated as independent samples. Based on published guidelines99, ICC scores below 0.5 are considered acceptable for treating cells across animals as independent samples (related to Supplementary Figs. 2a and 3d). ICC analyses were performed in RStudio (2022.02.1 Build 461) using the ‘ICC’ package100.
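A one-way random-effects ICC can be estimated from ANOVA mean squares, as sketched below (the paper used the R ‘ICC’ package; this Python version is our approximation of the one-way formulation and is not the package's exact code):

```python
import numpy as np

def icc_oneway(groups):
    """One-way random-effects ICC: between-class variance over total
    (between + within) variance, estimated from ANOVA mean squares.
    `groups` is a list of 1-D arrays, one per animal/session (class)."""
    k = len(groups)
    n = np.array([len(g) for g in groups])
    grand = np.concatenate(groups).mean()
    means = np.array([g.mean() for g in groups])
    msb = np.sum(n * (means - grand) ** 2) / (k - 1)                 # between-class MS
    msw = np.sum([((g - g.mean()) ** 2).sum() for g in groups]) / (n.sum() - k)
    n0 = (n.sum() - (n ** 2).sum() / n.sum()) / (k - 1)              # effective class size
    sb2 = max((msb - msw) / n0, 0.0)                                 # between-class variance
    return sb2 / (sb2 + msw)
```

Well-separated, tight clusters give an ICC near 1 (cells are not independent); identical groups give an ICC of 0.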
Statistical analyses
The statistical test used for every specific comparison is directly stated in the text or figure legend. For hybrid violin/box plots, the violin plot shows the density distribution of all data points, whereas the box-plot shows the median, 25/75% distribution, and 5/95% distribution. All violin plots were scaled to have the same width of 0.6 using matplotlib’s width parameter. For statistical comparisons, *p < 0.05, **p < 0.01.
Reporting summary
Further information on research design is available in the Nature Research Reporting Summary linked to this article.
Data availability
Source data are provided with this paper. All data used for analyses can be found here: https://github.com/kadjitaa/Research/tree/main/HeadDirectionCells/Asumbisa_et_al_2022.
Code availability
Custom code can be found here: https://github.com/kadjitaa/Research/tree/main/HeadDirectionCells/Asumbisa_et_al_2022.
References
Tolman, E. C. Cognitive maps in rats and men. Psychol. Rev. 55, 189–208 (1948).
O’Keefe, J. A review of the hippocampal place cells. Prog. Neurobiol. 13, 419–439 (1979).
Klatzky, R. L. Allocentric and Egocentric Spatial Representations: Definitions, Distinctions, and Interconnections. in Spatial Cognition: An Interdisciplinary Approach to Representing and Processing Spatial Knowledge (eds. Freksa, C., Habel, C. & Wender, K. F.) 1–17 (Springer, 1998).
Schinazi, V. R., Thrash, T. & Chebat, D.-R. Spatial navigation by congenitally blind individuals. WIREs Cogn. Sci. 7, 37–58 (2016).
Thinus-Blanc, C. & Gaunet, F. Representation of space in blind persons: vision as a spatial sense? Psychol. Bull. 121, 20–42 (1997).
Pasqualotto, A. & Proulx, M. J. The role of visual experience for the neural basis of spatial cognition. Neurosci. Biobehav. Rev. 36, 1179–1187 (2012).
Moser, E. I., Moser, M.-B. & McNaughton, B. L. Spatial representation in the hippocampal formation: a history. Nat. Neurosci. 20, 1448–1464 (2017).
Hafting, T., Fyhn, M., Molden, S., Moser, M.-B. & Moser, E. I. Microstructure of a spatial map in the entorhinal cortex. Nature 436, 801–806 (2005).
Taube, J. S., Muller, R. U. & Ranck, J. B. Head-direction cells recorded from the postsubiculum in freely moving rats. I. Description and quantitative analysis. J. Neurosci. 10, 420–435 (1990).
Taube, J. S. Head direction cells recorded in the anterior thalamic nuclei of freely moving rats. J. Neurosci. 15, 70–86 (1995).
Peyrache, A., Lacroix, M. M., Petersen, P. C. & Buzsáki, G. Internally organized mechanisms of the head direction sense. Nat. Neurosci. 18, 569–575 (2015).
Taube, J. S. The head direction signal: origins and sensory-motor integration. Annu. Rev. Neurosci. 30, 181–207 (2007).
Wilton, L. A. K., Baird, A. L., Muir, J. L., Honey, R. C. & Aggleton, J. P. Loss of the thalamic nuclei for ‘head direction’ impairs performance on spatial memory tasks in rats. Behav. Neurosci. 115, 861–869 (2001).
Butler, W. N., Smith, K. S., van der Meer, M. A. A. & Taube, J. S. The head-direction signal plays a functional role as a neural compass during navigation. Curr. Biol. CB 27, 1259–1267 (2017).
Harvey, R. E., Thompson, S. M., Sanchez, L. M., Yoder, R. M. & Clark, B. J. Post-training inactivation of the anterior thalamic nuclei impairs spatial performance on the radial arm maze. Front. Neurosci. 11, https://doi.org/10.3389/fnins.2017.00094 (2017).
Winter, S. S., Clark, B. J. & Taube, J. S. Disruption of the head direction cell network impairs the parahippocampal grid cell signal. Science 347, 870–874 (2015).
Harland, B. et al. Lesions of the head direction cell system increase hippocampal place field repetition. Curr. Biol. 27, 2706–2712.e2 (2017).
Skaggs, W. E., Knierim, J. J., Kudrimoti, H. S. & McNaughton, B. L. A model of the neural basis of the rat’s sense of direction. Adv. Neural Inf. Process. Syst. 7, 173–180 (1995).
Chaudhuri, R., Gerçek, B., Pandey, B., Peyrache, A. & Fiete, I. The intrinsic attractor manifold and population dynamics of a canonical cognitive circuit across waking and sleep. Nat. Neurosci. 22, 1512–1520 (2019).
Angelaki, D. E. & Laurens, J. The head direction cell network: attractor dynamics, integration within the navigation system, and three-dimensional properties. Curr. Opin. Neurobiol. 60, 136–144 (2020).
Taube, J. S., Muller, R. U. & Ranck, J. B. Head-direction cells recorded from the postsubiculum in freely moving rats. II. Effects of environmental manipulations. J. Neurosci. 10, 436–447 (1990).
Bassett, J. P., Wills, T. J. & Cacucci, F. Self-organized attractor dynamics in the developing head direction circuit. Curr. Biol. 28, 609–615.e3 (2018).
Stackman, R. W. & Taube, J. S. Firing properties of head direction cells in the rat anterior thalamic nucleus: dependence on vestibular input. J. Neurosci. 17, 4349–4358 (1997).
Stackman, R. W., Clark, A. S. & Taube, J. S. Hippocampal spatial representations require vestibular input. Hippocampus 12, 291–303 (2002).
Muir, G. M. et al. Disruption of the head direction cell signal after occlusion of the semicircular canals in the freely moving chinchilla. J. Neurosci. 29, 14521–14533 (2009).
Blair, H. T. & Sharp, P. E. Visual and vestibular influences on head-direction cells in the anterior thalamus of the rat. Behav. Neurosci. 110, 643–660 (1996).
Chen, G., Manson, D., Cacucci, F. & Wills, T. J. Absence of visual input results in the disruption of grid cell firing in the mouse. Curr. Biol. 26, 2335–2342 (2016).
Dannenberg, H., Lazaro, H., Nambiar, P., Hoyland, A. & Hasselmo, M. E. Effects of visual inputs on neural dynamics for coding of location and running speed in medial entorhinal cortex. eLife 9, e62500 (2020).
Bjerknes, T. L., Langston, R. F., Kruge, I. U., Moser, E. I. & Moser, M.-B. Coherence among head direction cells before eye opening in rat pups. Curr. Biol. 25, 103–108 (2015).
Tan, H. M., Bassett, J. P., O’Keefe, J., Cacucci, F. & Wills, T. J. The development of the head direction system before eye opening in the rat. Curr. Biol. 25, 479–483 (2015).
Goodridge, J. P., Dudchenko, P. A., Worboys, K. A., Golob, E. J. & Taube, J. S. Cue control and head direction cells. Behav. Neurosci. 112, 749–761 (1998).
Stackman, R. W., Golob, E. J., Bassett, J. P. & Taube, J. S. Passive transport disrupts directional path integration by rat head direction cells. J. Neurophysiol. 90, 2862–2874 (2003).
Mizumori, S. J. & Williams, J. D. Directionally selective mnemonic properties of neurons in the lateral dorsal nucleus of the thalamus of rats. J. Neurosci. 13, 4015–4028 (1993).
Yoder, R. M. & Taube, J. S. Head direction cell activity in mice: robust directional signal depends on intact otolith organs. J. Neurosci. 29, 1061–1076 (2009).
Watanabe, S. & Yoshida, M. Auditory cued spatial learning in mice. Physiol. Behav. 92, 906–910 (2007).
Fischler-Ruiz, W. et al. Olfactory landmarks and path integration converge to form a cognitive spatial map. Neuron 109, 4036–4049.e5 (2021).
Poo, C., Agarwal, G., Bonacchi, N. & Mainen, Z. F. Spatial maps in piriform cortex during olfactory navigation. Preprint at bioRxiv https://doi.org/10.1101/2020.02.18.935494 (2021).
Long, X. & Zhang, S.-J. A novel somatosensory spatial navigation system outside the hippocampal formation. Cell Res. 31, 649–663 (2021).
Chang, B. et al. Retinal degeneration mutants in the mouse. Vis. Res. 42, 517–525 (2002).
Stasheff, S. F., Shankar, M. & Andrews, M. P. Developmental time course distinguishes changes in spontaneous and light-evoked retinal ganglion cell activity in rd1 and rd10 mice. J. Neurophysiol. 105, 3002–3009 (2011).
Zhang, K., Ginzburg, I., McNaughton, B. L. & Sejnowski, T. J. Interpreting neuronal population activity by reconstruction: unified framework with application to hippocampal place cells. J. Neurophysiol. 79, 1017–1044 (1998).
Johnson, A., Seeland, K. & Redish, A. D. Reconstruction of the postsubiculum head direction signal from neural ensembles. Hippocampus 15, 86–96 (2005).
King, A. J., Hutchings, M. E., Moore, D. R. & Blakemore, C. Developmental plasticity in the visual and auditory representations in the mammalian superior colliculus. Nature 332, 73–76 (1988).
King, A. J. & Carlile, S. Changes induced in the representation of auditory space in the superior colliculus by rearing ferrets with binocular eyelid suture. Exp. Brain Res. 94, 444–455 (1993).
Gordon, J. A. & Stryker, M. P. Experience-dependent plasticity of binocular responses in the primary visual cortex of the mouse. J. Neurosci. 16, 3274–3286 (1996).
Smith, S. L. & Trachtenberg, J. T. Experience-dependent binocular competition in the visual cortex begins at eye opening. Nat. Neurosci. 10, 370–375 (2007).
Stasheff, S. F. Emergence of sustained spontaneous hyperactivity and temporary preservation of OFF responses in ganglion cells of the retinal degeneration (rd1) mouse. J. Neurophysiol. 99, 1408–1421 (2008).
Norwood, J. N. et al. Anatomical basis and physiological role of cerebrospinal fluid transport through the murine cribriform plate. eLife 8, e44278 (2019).
Sievert, T. & Laska, M. Behavioral responses of CD-1 mice to six predator odor components. Chem. Senses 41, 399–406 (2016).
Viejo, G. & Peyrache, A. Precise coupling of the thalamic head-direction system to hippocampal ripples. Nat. Commun. 11, 2524 (2020).
Chen, T. & Guestrin, C. XGBoost: a scalable tree boosting system. In Proc. 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining 785–794 (ACM, 2016).
Tenenbaum, J. B., de Silva, V. & Langford, J. C. A global geometric framework for nonlinear dimensionality reduction. Science 290, 2319–2323 (2000).
He, K. et al. Echolocation in soft-furred tree mice. Science 372, eaay1513 (2021).
Agster, K. L. & Burwell, R. D. Cortical efferents of the perirhinal, postrhinal, and entorhinal cortices of the rat. Hippocampus 19, 1159–1186 (2009).
Chapuis, J. et al. Lateral entorhinal modulation of piriform cortical activity and fine odor discrimination. J. Neurosci. 33, 13449–13459 (2013).
van Groen, T. & Wyss, J. M. The postsubicular cortex in the rat: characterization of the fourth region of the subicular cortex and its connections. Brain Res. 529, 165–177 (1990).
Touj, S. et al. Better olfactory performance and larger olfactory bulbs in a mouse model of congenital blindness. Chem. Senses 45, 523–531 (2020).
Maaswinkel, H. & Whishaw, I. Q. Homing with locale, taxon, and dead reckoning strategies by foraging rats: sensory hierarchy in spatial navigation. Behav. Brain Res. 99, 143–152 (1999).
Save, E., Cressant, A., Thinus-Blanc, C. & Poucet, B. Spatial firing of hippocampal place cells in blind rats. J. Neurosci. 18, 1818–1826 (1998).
Save, E., Nerad, L. & Poucet, B. Contribution of multiple sensory information to place field stability in hippocampal place cells. Hippocampus 10, 64–76 (2000).
Wallace, D. G., Gorny, B. & Whishaw, I. Q. Rats can track odors, other rats, and themselves: implications for the study of spatial behavior. Behav. Brain Res. 131, 185–192 (2002).
Rajan, R., Clement, J. P. & Bhalla, U. S. Rats smell in stereo. Science 311, 666–670 (2006).
Kepecs, A., Uchida, N. & Mainen, Z. F. The sniff as a unit of olfactory processing. Chem. Senses 31, 167–179 (2006).
Uchida, N. & Mainen, Z. F. Speed and accuracy of olfactory discrimination in the rat. Nat. Neurosci. 6, 1224–1229 (2003).
Kepecs, A., Uchida, N. & Mainen, Z. F. Rapid and precise control of sniffing during olfactory discrimination in rats. J. Neurophysiol. 98, 205–213 (2007).
Wiesel, T. N. & Hubel, D. H. Single-cell responses in striate cortex of kittens deprived of vision in one eye. J. Neurophysiol. 26, 1003–1017 (1963).
Berardi, N., Pizzorusso, T. & Maffei, L. Critical periods during sensory development. Curr. Opin. Neurobiol. 10, 138–145 (2000).
Wallace, M. T. & Stein, B. E. Early experience determines how the senses will interact. J. Neurophysiol. 97, 921–926 (2007).
Loomis, J. M. et al. Nonvisual navigation by blind and sighted: assessment of path integration ability. J. Exp. Psychol. Gen. 122, 73–91 (1993).
Tinti, C., Adenzato, M., Tamietto, M. & Cornoldi, C. Visual experience is not necessary for efficient survey spatial cognition: evidence from blindness. Q. J. Exp. Psychol. 59, 1306–1328 (2006).
Lessard, N., Paré, M., Lepore, F. & Lassonde, M. Early-blind human subjects localize sound sources better than sighted subjects. Nature 395, 278–280 (1998).
Kellogg, W. N. Sonar system of the blind. Science 137, 399–404 (1962).
Stoffregen, T. A. & Pittenger, J. B. Human echolocation as a basic form of perception and action. Ecol. Psychol. 7, 181–216 (1995).
Milne, J. L., Goodale, M. A. & Thaler, L. The role of head movements in the discrimination of 2-D shape by blind echolocation experts. Atten. Percept. Psychophys. 76, 1828–1837 (2014).
Porter, J. et al. Mechanisms of scent-tracking in humans. Nat. Neurosci. 10, 27–29 (2007).
Jacobs, L. F., Arter, J., Cook, A. & Sulloway, F. J. Olfactory orientation and navigation in humans. PLoS ONE 10, e0129387 (2015).
Hamburger, K. & Knauff, M. Odors can serve as landmarks in human wayfinding. Cogn. Sci. 43, e12798 (2019).
Wu, Y., Chen, K., Ye, Y., Zhang, T. & Zhou, W. Humans navigate with stereo olfaction. Proc. Natl Acad. Sci. USA 117, 16065–16071 (2020).
Koutsoklenis, A. & Papadopoulos, K. Olfactory cues used for wayfinding in urban environments by individuals with visual impairments. J. Vis. Impair. Blind. 105, 692–702 (2011).
Cuevas, I., Plaza, P., Rombaux, P., De Volder, A. G. & Renier, L. Odour discrimination and identification are improved in early blindness. Neuropsychologia 47, 3079–3083 (2009).
Beaulieu-Lefebvre, M., Schneider, F. C., Kupers, R. & Ptito, M. Odor perception and odor awareness in congenital blindness. Brain Res. Bull. 84, 206–209 (2011).
Renier, L. et al. Right occipital cortex activation correlates with superior odor processing performance in the early blind. PLoS ONE 8, e71907 (2013).
Gagnon, L., Ismaili, A. R. A., Ptito, M. & Kupers, R. Superior orthonasal but not retronasal olfactory skills in congenital blindness. PLoS ONE 10, e0122567 (2015).
Rieser, J. J., Lockman, J. J. & Pick, H. L. The role of visual experience in knowledge of spatial layout. Percept. Psychophys. 28, 185–190 (1980).
Herman, J. F., Chatman, S. P. & Roth, S. F. Cognitive mapping in blind people: acquisition of spatial relationships in a large-scale environment. J. Vis. Impair. Blind. 77, 161–166 (1983).
Rieser, J. J., Guth, D. A. & Hill, E. W. Sensitivity to perspective structure while walking without vision. Perception 15, 173–188 (1986).
Pasqualotto, A. & Newell, F. N. The role of visual experience on the representation and updating of novel haptic scenes. Brain Cogn. 65, 184–194 (2007).
Hillier, D. et al. Causal evidence for retina dependent and independent visual motion computations in mouse cortex. Nat. Neurosci. 20, 960–968 (2017).
Pachitariu, M., Steinmetz, N., Kadir, S., Carandini, M. & Harris, K. D. Kilosort: realtime spike-sorting for extracellular electrophysiology with hundreds of channels. Preprint at bioRxiv https://doi.org/10.1101/061481 (2016).
Hazan, L., Zugaro, M. & Buzsáki, G. Klusters, NeuroScope, NDManager: a free software suite for neurophysiological data processing and visualization. J. Neurosci. Methods 155, 207–216 (2006).
Nikbakht, N. & Diamond, M. E. Conserved visual capacity of rats under red light. eLife 10, e66429 (2021).
McBride, K. Does intranasal application of zinc sulfate produce anosmia in the mouse? An olfactometric and anatomical study. Chem. Senses 28, 659–670 (2003).
Pedregosa, F. et al. Scikit-learn: machine learning in Python. J. Mach. Learn. Res. 12, 2825–2830 (2011).
Jankowski, M. M. et al. The anterior thalamus provides a subcortical circuit supporting memory and spatial navigation. Front. Syst. Neurosci. 7, 45 (2013).
Tsanov, M. et al. Theta-modulated head direction cells in the rat anterior thalamus. J. Neurosci. 31, 9489–9502 (2011).
Skaggs, W. E., McNaughton, B. L., Wilson, M. A. & Barnes, C. A. Theta phase precession in hippocampal neuronal populations and the compression of temporal sequences. Hippocampus 6, 149–172 (1996).
Lozano, Y. R. et al. Retrosplenial and postsubicular head direction cells compared during visual landmark discrimination. Brain Neurosci. Adv. 1, 2398212817721859 (2017).
Yu, Z. et al. Beyond t test and ANOVA: applications of mixed-effects models for more rigorous statistical analysis in neuroscience research. Neuron 110, 21–35 (2022).
Koo, T. K. & Li, M. Y. A guideline of selecting and reporting intraclass correlation coefficients for reliability research. J. Chiropr. Med. 15, 155–163 (2016).
Wolak, M. E., Fairbairn, D. J. & Paulsen, Y. R. Guidelines for estimating repeatability. Methods Ecol. Evol. 3, 129–137 (2012).
Acknowledgements
We thank R. Tong and G. Viejo for critical discussions on this project. We thank A. Villemain for maintaining mouse colonies. We acknowledge the following funding sources: Jean Timmins Costello fellowship and Healthy Brains for Healthy Lives fellowship to K.A.; Canada Research Chairs to A.P. and S.T.; Alfred P. Sloan Foundation Research Fellowship, Vision Health Research Network Pilot Project for Early-Career Investigators Grant, and a Canadian Institutes of Health Research Project Grant to S.T.
Author information
Contributions
Experiments were designed by K.A., A.P., and S.T. Experiments and analyses were performed by K.A. Figures were generated by K.A. The paper was written by K.A. and S.T.
Ethics declarations
Competing interests
The authors declare no competing interests.
Peer review
Peer review information
Nature Communications thanks Etienne Save and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.
Cite this article
Asumbisa, K., Peyrache, A. & Trenholm, S. Flexible cue anchoring strategies enable stable head direction coding in both sighted and blind animals. Nat. Commun. 13, 5483 (2022). https://doi.org/10.1038/s41467-022-33204-0