Language articles within Nature Communications

Featured

  • Article
    | Open Access

    How speech sounds come to be understood as language remains unclear. Here, the authors find that brain responses to speech in part reflect abstraction of phonological units specific to the language being spoken, mediated through relationships between acoustic features.

    • Anna Mai
    • , Stephanie Riès
    •  & Timothy Q. Gentner
  • Article
    | Open Access

    Reconstructing language dispersal patterns is important for understanding cultural spread and demic diffusion. Here, the authors use a computational approach based on velocity field estimation to infer the dispersal patterns of Indo-European, Sino-Tibetan, Bantu, and Arawak language families.

    • Sizhe Yang
    • , Xiaoru Sun
    •  & Menghan Zhang
  • Article
    | Open Access

To understand speech, our brains have to learn the different types of sounds that constitute words, including syllables, stress patterns and smaller sound elements, such as phonetic categories. Here, the authors provide evidence that by 7 months, the infant brain reliably learns to detect invariant phonetic categories.

    • Giovanni M. Di Liberto
    • , Adam Attaheri
    •  & Usha Goswami
  • Article
    | Open Access

    In tonal languages, modulation of pitch distinguishes words with different meaning. Here the authors investigate neural mechanisms of pitch control during lexical tone production in Mandarin-speaking participants.

    • Junfeng Lu
    • , Yuanning Li
    •  & Edward F. Chang
  • Article
    | Open Access

Limitations in spatiotemporal resolution have made it difficult to isolate the neural processes underlying language. Here, intracranial recordings were used to map semantic processes involved in sentence integration, revealing complementary roles for frontotemporal brain regions.

    • Elliot Murphy
    • , Kiefer J. Forseth
    •  & Nitin Tandon
  • Article
    | Open Access

    The neural dynamics underlying speech comprehension are not well understood. Here, the authors show that phonemic-to-lexical processing is localized to a large region of the temporal cortex, and that segmentation of the speech stream occurs mostly at the level of diphones.

    • Xue L. Gong
    • , Alexander G. Huth
    •  & Frédéric E. Theunissen
  • Article
    | Open Access

    Grammar learning requires memory for temporally organised, rule-based patterns in speech. Here, the authors use event-related potentials to show that 6 to 8 month-old infants can form memory of dependencies between nonadjacent elements in sentences of an unknown language, regardless of whether they nap or stay awake after encoding.

    • Manuela Friedrich
    • , Matthias Mölle
    •  & Angela D. Friederici
  • Article
    | Open Access

    Speech unfolds faster than the brain completes processing of speech sounds. Here, the authors show that brain activity moves systematically within neural populations of auditory cortex, allowing accurate representation of a speech sound’s identity and its position in the sound sequence.

    • Laura Gwilliams
    • , Jean-Remi King
    •  & David Poeppel
  • Article
    | Open Access

Reconstructing imagined speech from neural activity holds great promise for people with severe speech production deficits. Here, the authors use human intracranial recordings to demonstrate that both low- and higher-frequency power, as well as local cross-frequency coupling, contribute to imagined speech decoding.

    • Timothée Proix
    • , Jaime Delgado Saa
    •  & Anne-Lise Giraud
  • Article
    | Open Access

    When asked to imagine an event such as a party, individuals will vary in their mental imagery based on their specific experience of parties. Here, the authors show that such signatures of personal experience can be read from brain activity elicited as events are imagined.

    • Andrew James Anderson
    • , Kelsey McDermott
    •  & Feng V. Lin
  • Article
    | Open Access

    The human brain fluently parses continuous speech during perception and production. Using direct brain recordings coupled with stimulation, the authors identify separable substrates underlying two distinct predictive mechanisms of “when” in Heschl’s gyrus and “what” in planum temporale.

    • K. J. Forseth
    • , G. Hickok
    •  & N. Tandon
  • Article
    | Open Access

Semantic dementia patients present with a core semantic impairment and variations in language, behavioural and face recognition abilities. Here, the authors build a unified multidimensional model to capture all these graded symptoms and map them to the variations in the patients’ frontotemporal atrophy.

    • Junhua Ding
    • , Keliang Chen
    •  & Matthew A. Lambon Ralph
  • Article
    | Open Access

    The visual word form area (VWFA) is a brain region associated with written language, but it has also been linked to visuospatial attention. Here, the authors reveal distinct structural and functional circuits linking VWFA with language and attention networks, and demonstrate that these circuits separately predict language and attention abilities.

    • Lang Chen
    • , Demian Wassermann
    •  & Vinod Menon
  • Article
    | Open Access

    We can recognize an object from one of its features, e.g. hearing a bark leads us to think of a dog. Here, the authors show using fMRI that the brain combines bits of information into object representations, and that presenting a few features of an object activates representations of its other attributes.

    • Sasa L. Kivisaari
    • , Marijn van Vliet
    •  & Riitta Salmelin
  • Article
    | Open Access

    How are abstract, imperceptible concepts such as ‘freedom’ represented in the brain? Here, the authors use fMRI in people born blind to compare the neural responses for abstract concepts, concrete concepts like ‘rainbow’ which in blind people lack sensory qualities, and concrete concepts sensorily accessible to the blind.

    • Ella Striem-Amit
    • , Xiaoying Wang
    •  & Alfonso Caramazza
  • Article
    | Open Access

Functional morphemes allow us to express details about objects, events, and their relationships. Here, the authors show that inhibiting a small cortical area within the left posterior superior temporal lobe selectively impairs the ability to produce functional morphemes but does not impair other linguistic abilities.

    • Daniel K. Lee
    • , Evelina Fedorenko
    •  & Ziv M. Williams
  • Article
    | Open Access

    The role of frontal lobes in speech perception is controversial. Here, the authors show that neurodegeneration of frontal speech regions delays prediction reconciliation in temporal cortex and results in inflexible prior expectations, indicating that fronto-temporal interactions determine predictive processes in speech.

    • Thomas E. Cope
    • , E. Sohoglu
    •  & James B. Rowe
  • Article
    | Open Access

We can often ‘fill in’ missing or occluded sounds from a speech signal—an effect known as phoneme restoration. Here, Leonard et al. find real-time restoration of the missing sounds in the superior temporal auditory cortex in humans. Interestingly, neural activity in frontal regions prior to the stimulus predicts the word that the participant will later hear.

    • Matthew K. Leonard
    • , Maxime O. Baud
    •  & Edward F. Chang
  • Article
    | Open Access

Whether brief early exposure to a language affects future language processing is unclear. Here, Pierce et al. show that brain activity evoked by French pseudowords in monolingual French-speaking Chinese adoptees differs from that of French children with no exposure to Chinese and resembles that of bilingual Chinese children.

    • Lara J. Pierce
    • , Jen-Kai Chen
    •  & Denise Klein
  • Article
    | Open Access

    This study uses functional magnetic resonance imaging in humans and monkeys to show similar ventral frontal and opercular cortical responses when processing sequences of auditory nonsense words. The study indicates that this frontal region is involved in evaluating the order of incoming sounds in a sequence, a process that may be conserved in primates.

    • Benjamin Wilson
    • , Yukiko Kikuchi
    •  & Christopher I. Petkov
  • Article

    Contemporary neuroimaging techniques are enabling precise analysis of structure–function relations in the brain. This study combines large-scale structural neuroimaging and behavioural analyses in patients with acquired aphasia to elucidate the neural organization of spoken language processing.

    • Daniel Mirman
    • , Qi Chen
    •  & Myrna F. Schwartz
  • Article

    Communicative persistence is a key indicator of intentionality in humans. Here Roberts et al. show that two language-trained chimpanzees can dynamically and flexibly use persistent intentional communication to guide a naive experimenter to a food item hidden in a large outdoor enclosure.

    • Anna Ilona Roberts
    • , Sarah-Jane Vick
    •  & Charles R. Menzel
  • Article
    | Open Access

    Bilingual infants possess a unique ability to rapidly acquire the grammar of both of their native languages. Gervain and Werker find that bilingual infants achieve this by using characteristic prosodic cues associated with different word orders.

    • Judit Gervain
    •  & Janet F. Werker
  • Article

The exact speed of spoken word processing by our brain is still unknown. Using MEG to compare brain responses to words and pseudowords, MacGregor et al. show that lexical processing occurs 50 ms after acoustic information is presented, suggesting that our brain's access to word information is near-instantaneous.

    • Lucy J. MacGregor
    • , Friedemann Pulvermüller
    •  & Yury Shtyrov