We have known for more than a century that the left hemisphere is the primary site for language processing in the brain, because lesions of the left hemisphere have often resulted in disorders of speech and language. It is less clear, however, why the left hemisphere is dominant for language. Is it because the left side of the brain is dedicated to processing the motor aspects of speaking and the sensory aspects of hearing, or is it because the left side of the brain is specifically involved in processing the linguistic patterns of language? Clearly, it will be extremely difficult to answer this question by studying speech-based languages alone. One alternative is to study languages that are not speech-based but still possess linguistic structure, thereby dissociating the processing of speech from the processing of the structure of language. Languages that are not speech-based include the naturally evolved sign languages of deaf people, such as American Sign Language. These languages possess phonological, morphological and syntactic levels of organization homologous to those of spoken languages, convey the full semantic and grammatical expressive range, and follow conversational rules similar to those of spoken languages.

A recent study by Laura Petitto, Robert Zatorre and colleagues at McGill University built on previous work with naturally evolved sign languages and their use by profoundly deaf subjects, using positron emission tomography (PET) to probe the neural basis of language organization. When profoundly deaf subjects processed specific aspects of sign language, cerebral blood flow activity was observed in areas of the brain that are widely assumed to be unimodal for speech or sound. Specifically, activity was observed in the left inferior frontal cortex when deaf signers produced meaningful signed verbs in response to a signed noun. These results suggest that specific sites of the left frontal cortex are recruited for higher-order linguistic processes related to lexical operations, and that this recruitment does not depend on the presence of sound. In addition, activity was observed bilaterally in an area of the superior temporal gyrus (STG), the planum temporale (PT), when deaf subjects viewed signs or meaningless parts of signs (equivalent to phonetic or syllabic units). The latter result indicates that, contrary to the prevailing view, the PT might not be exclusively dedicated to processing speech sounds; it might instead have a more general role in processing the abstract properties of language across modalities. Alternatively, auditory cortex within the STG might undergo functional reorganization in the absence of auditory input and come to respond to complex visual inputs more generally. These findings therefore raise many interesting questions concerning the functional role of STG regions in deaf people.

These new data indicate that the specialization for language is not pre-specified exclusively by the mechanisms for producing and perceiving sound, but might also involve multi-modal areas that are specialized for processing the patterning of natural languages. Somewhat counter-intuitively, sign languages may hold some of the keys to understanding the neural basis of human language.