
Sign language ‘heard’ in the auditory cortex


The upper regions of the brain's temporal lobe are important both for hearing and for comprehending spoken language. We have discovered that these regions can be activated by sign language in congenitally deaf subjects, even though the temporal lobe normally functions as an auditory area. This finding indicates that, in deaf people, the brain region usually reserved for hearing may be activated by other sensory modalities, providing striking evidence of neural plasticity.


The auditory areas consist of the primary auditory cortex and the auditory association area (the supratemporal gyrus). The neural network that projects from the inner ear to the primary auditory cortex is formed without any auditory input, whereas the post-processing neurons develop through learning with appropriate neural input. The learning period for the mother tongue is thought to end at five to six years of age1. Reduced auditory input during this critical language-learning period can severely limit a child's potential for developing an effective communication system2. ‘Pre-lingual deaf’ patients, who were deafened before acquiring language, communicate using sign language.

In an attempt to understand how these auditory areas function in the congenitally deaf, we used positron emission tomography (PET) to measure cortical activation during a sign-language task. In the main experiment we sought to localize the ‘sign language’ areas, but a secondary experiment was set up to localize both the auditory areas that had been dormant and the visual areas.

In the main experiment, the subject viewed a video of sign-language words being signed by a native signer; in the control task, he viewed a still frame from the same video. PET images were analysed using statistical parametric mapping software3, and the resulting activation maps were superimposed onto magnetic resonance images of the subject's brain for spatial localization. We found that sign language activated the supratemporal gyri bilaterally (left, z=4.52; P=0.005, corrected; Fig. 1).
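As a note on the statistics: the z score reported by statistical parametric mapping corresponds to a one-tailed tail probability of the standard normal distribution; the corrected P value quoted above additionally accounts for the many voxels tested, which the sketch below does not reproduce. A minimal illustration of the z-to-p conversion (the function name is ours, not part of the SPM software):

```python
import math

def z_to_p(z: float) -> float:
    """One-tailed (uncorrected) p-value for a standard-normal z statistic.

    Uses the survival function 1 - Phi(z) = 0.5 * erfc(z / sqrt(2)).
    """
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Uncorrected tail probability for the reported peak activation (z = 4.52);
# the corrected P = 0.005 in the text is larger because it adjusts for
# multiple comparisons across the brain volume.
print(f"{z_to_p(4.52):.2e}")  # on the order of 1e-6
```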

Figure 1: Activation of areas of the brain.

The activated areas were superimposed onto the three horizontal sections (10 mm below, and 4 mm and 8 mm above, the intercommissural plane) of the subject's magnetic resonance image. Yellow areas were activated by sign language in the main experiment; green areas were activated by audition, and blue areas by vision, in the secondary experiment.

The subject was scheduled to have a cochlear implant in his left ear. The implant is an artificial prosthesis, inserted into the inner ear, that electrically stimulates the cochlear nerve and enables the profoundly deaf to hear sounds. To distinguish the supratemporal gyri from the visual and dormant auditory areas, a secondary experiment was performed after the operation, consisting of an auditory task, a visual task and rest. In the visual task, the subject watched a video showing someone moving both hands up and down in a meaningless manner. In the auditory task, spoken words were delivered from a tape through the cochlear implant. The visual stimulation activated the visual cortex in the occipital lobe (P=0.001, corrected), and the auditory stimulation activated the right primary auditory cortex, contralateral to the auditory input (P=0.002, uncorrected) (Fig. 1).

Pre-lingual deaf people can hear when a cochlear implant is switched on, but this does not allow them to understand words. Language stimulation through the implant activates only the primary auditory cortex in the pre-lingual deaf, whereas in the post-lingual deaf it activates both the primary and the secondary auditory areas4. The result of our secondary experiment was compatible with these findings. Our study of native signers and those who learnt sign language later showed that the nature and timing of sensory and linguistic experience significantly affect the development of the language systems of the brain5.

In bilingual subjects (those with both signed and spoken language), sign language activates the visual areas6, whereas our study showed activation of the auditory area in the sign-language task. Because our subject had never received auditory input while the neural network was being formed, it seems that the supratemporal lobe was engaged in processing sign language. Using sign language elicits considerable activation of the left hemisphere in Broca's area and Wernicke's area, as well as of the right hemisphere7, whereas our results indicated limited activation of Wernicke's area by sign-language words.

This cross-modal plasticity is also seen in visual areas. Braille-reading blind subjects have activation of the primary and secondary visual cortical areas when they perform tactile tasks8, although congenitally blind Braille readers have activation of visual reading areas but not primary visual cortex9. Our results indicate that the primary auditory cortex of deaf people is reserved for hearing sounds, whereas the secondary areas are used for processing sign language. This cross-modal non-plasticity of the primary auditory cortex is supported by functional magnetic resonance imaging of a congenitally deaf subject10, which suggests that the primary projection areas might be rigidly organized.

We observed that sign language activates the ‘language’ areas but not primary auditory cortex. The finding that, after a cochlear implant is in place, spoken words activate primary auditory cortex but not adjacent language areas indicates that primary auditory cortex still functions as an auditory area in this patient. We also identified the ‘sign-language’ area as the supratemporal gyri, which are usually auditory areas.


  1. Osberger, M. et al. Ann. Otol. Rhinol. Laryngol. 100, 883–888 (1991).

  2. Fitch, J. L., Williams, T. F. & Etienne, J. E. J. Speech Hear. Disord. 47, 373–375 (1982).

  3. SPM96 (Wellcome Department of Cognitive Neurology, London, 1996).

  4. Naito et al. Acta Otolaryngol. 117, 490–496 (1997).

  5. Neville, H. et al. Brain Lang. 57, 285–308 (1997).

  6. Soderfeldt et al. Neurology 49, 82–87 (1997).

  7. Neville, H. et al. Proc. Natl Acad. Sci. USA 95, 922–929 (1998).

  8. Sadato et al. Nature 380, 526–528 (1996).

  9. Büchel, C., Price, C., Frackowiak, R. S. J. & Friston, K. Brain 121, 409–419 (1998).

  10. Hickok et al. Hum. Brain Mapping 5, 437–444 (1997).


Nishimura, H., Hashikawa, K., Doi, K. et al. Sign language ‘heard’ in the auditory cortex. Nature 397, 116 (1999).
