
  • Research Briefing

Brain–machine-interface device translates internal speech into text

For people with speech disorders, brain–machine-interface (BMI) devices could restore the ability to communicate verbally. In this work, we captured neural activity associated with internal speech — words spoken silently in the mind, with no accompanying movement or audio output — and translated these cortical signals into text in real time.
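The decoding step described above — mapping a trial's neural activity to a word from a fixed vocabulary — can be illustrated with a deliberately simplified sketch. This is not the authors' model; it is a minimal nearest-centroid classifier on synthetic firing-rate vectors, with the neuron count, vocabulary size, and noise level all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 4-word vocabulary, 32 recorded neurons, 40 trials per word.
n_classes, n_neurons, trials_per_class = 4, 32, 40

# Each word is assumed to evoke a characteristic mean firing-rate pattern.
class_patterns = rng.normal(0.0, 1.0, (n_classes, n_neurons))
y = np.repeat(np.arange(n_classes), trials_per_class)
X = class_patterns[y] + rng.normal(0.0, 0.3, (y.size, n_neurons))  # noisy trials

def train_centroids(X, y, n_classes):
    """Average the firing-rate vectors of each word's trials."""
    return np.stack([X[y == c].mean(axis=0) for c in range(n_classes)])

def decode(centroids, x):
    """Assign a trial to the word whose centroid it is closest to."""
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))

centroids = train_centroids(X, y, n_classes)
accuracy = np.mean([decode(centroids, x) == c for x, c in zip(X, y)])
```

With well-separated patterns and modest noise, such a decoder classifies trials far above the 25% chance level; real internal-speech decoding faces much noisier, higher-dimensional signals and is evaluated on held-out data.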


Fig. 1: Internal and vocalized speech are significantly decodable both offline and online.

References

  1. Makin, J. G., Moses, D. A. & Chang, E. F. Machine translation of cortical activity to text with an encoder–decoder framework. Nat. Neurosci. 23, 575–582 (2020). An article that presents high-accuracy vocalized speech decoding into text using electrocorticogram data.


  2. Moses, D. A. et al. Neuroprosthesis for decoding speech in a paralyzed person with anarthria. N. Engl. J. Med. 385, 217–227 (2021). An article that presents high-accuracy attempted speech decoding into text using electrocorticogram data.


  3. Willett, F. R. et al. A high-performance speech neuroprosthesis. Nature 620, 1031–1036 (2023). An article that presents high-accuracy attempted and mimed speech decoding into text using multielectrode arrays.


  4. Metzger, S. L. et al. A high-performance neuroprosthesis for speech decoding and avatar control. Nature 620, 1037–1046 (2023). An article that presents high-accuracy attempted and mimed speech decoding into text, audio and avatar movement using electrocorticogram data.


  5. Wandelt, S. K. et al. Decoding grasp and speech signals from the cortical grasp circuit in a tetraplegic human. Neuron 110, 1777–1787 (2022). An article that presents grasp and vocalized speech decoding from the supramarginal gyrus (SMG).



This is a summary of: Wandelt, S. K. et al. Representation of internal speech by single neurons in human supramarginal gyrus. Nat. Hum. Behav. https://doi.org/10.1038/s41562-024-01867-y (2024).


Cite this article

Brain–machine-interface device translates internal speech into text. Nat Hum Behav 8, 1014–1015 (2024). https://doi.org/10.1038/s41562-024-01869-w

