A brain–machine interface (BMI) is a device that translates neuronal information into commands capable of controlling external software or hardware, such as a computer or robotic arm. BMIs are often used as assistive devices for individuals with motor or sensory impairments.
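To make that translation step concrete, here is a minimal Python sketch that maps one time bin of firing rates to a two-dimensional cursor-velocity command. The channel count and the decoding weights are placeholders for illustration, not any particular device's parameters; a real decoder would be fit to calibration data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_channels = 96                       # hypothetical electrode count
W = rng.normal(size=(2, n_channels))  # placeholder decoding weights

def decode_velocity(firing_rates: np.ndarray) -> np.ndarray:
    """Translate one bin of firing rates (spikes/s) into a (vx, vy)
    command for external hardware such as a cursor or robotic arm."""
    return W @ firing_rates

# One decode step on simulated spike rates.
command = decode_velocity(rng.poisson(10, size=n_channels).astype(float))
print(command)
```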
Multilingual articulatory representations in the speech-motor cortex of a participant with vocal-tract and limb paralysis enabled the development of a bilingual speech neuroprosthesis.
Wandelt et al. describe a brain–machine interface that captures intracortical neural activity during internal speech (words said within the mind with no associated movement or audio output) and translates those cortical signals into real-time text.
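In its simplest form, such a text decoder can be sketched as a word classifier over neural features. The snippet below is an illustration only, not Wandelt et al.'s pipeline: it labels trials from binned spike counts with a linear classifier, and the vocabulary, shapes and data are all synthetic assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
vocab = ["yes", "no", "hello", "stop"]          # assumed toy vocabulary
n_trials, n_channels = 200, 96
X = rng.poisson(5, size=(n_trials, n_channels)).astype(float)  # spike counts
y = rng.integers(len(vocab), size=n_trials)                    # word labels

clf = LogisticRegression(max_iter=1000).fit(X, y)
# Real-time step: one trial of neural features in, one word out.
print(vocab[clf.predict(X[:1])[0]])
```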
A study using intracranial recordings in humans suggests that upcoming hand movements can be predicted from oscillatory features of the local field potential (LFP).
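As an illustration of the kind of oscillatory feature such a prediction could rest on, the sketch below computes beta-band (13–30 Hz) LFP power with Welch's method from a synthetic one-second trace. The sampling rate, frequency band and signal are assumptions for the example, not the study's parameters.

```python
import numpy as np
from scipy.signal import welch

fs = 1000                                 # assumed sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(2)
# Synthetic LFP: a 20 Hz oscillation buried in noise.
lfp = np.sin(2 * np.pi * 20 * t) + 0.5 * rng.normal(size=t.size)

f, psd = welch(lfp, fs=fs, nperseg=256)
beta_power = psd[(f >= 13) & (f <= 30)].mean()  # one predictive feature
print(f"beta-band power: {beta_power:.3f}")
```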
Use of a robotic balance simulator demonstrates that humans can learn to balance with long sensorimotor delays in different contexts (movement direction, muscle effectors) and generalize learned control to untrained contexts.
Challenging long-held assumptions, this research reveals that people can learn to control bionic hands with arbitrary control strategies just as effectively as, and in some respects better than, with strategies that mimic the human body.
We show that nonlinear latent factors and structures in neural population activity can be modelled in a manner that allows for flexible dynamical inference: causally, non-causally, and in the presence of missing neural observations. The resulting neural network model also improves the prediction of neural activity, behaviour and latent neural structure.
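To make the idea concrete, here is a minimal PyTorch sketch, under simplifying assumptions and not the authors' architecture: a nonlinear encoder/decoder pair around linear latent dynamics, where the forward pass yields causal (filtered) latent estimates and a mask lets inference coast on the dynamics through missing observations.

```python
import torch
import torch.nn as nn

class LatentDynamicsModel(nn.Module):
    """Illustrative nonlinear latent-factor model (a sketch, not the
    paper's method): a nonlinear encoder/decoder around a linear latent
    transition, so inference can run causally and missing timesteps can
    be bridged by the dynamics alone."""
    def __init__(self, n_neurons=50, n_latent=8):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_neurons, 64), nn.Tanh(),
                                     nn.Linear(64, n_latent))
        self.decoder = nn.Sequential(nn.Linear(n_latent, 64), nn.Tanh(),
                                     nn.Linear(64, n_neurons))
        self.A = nn.Linear(n_latent, n_latent, bias=False)  # latent dynamics

    def filter(self, y, mask):
        """Causal inference: predict the latent forward with A; when a
        timestep is observed, blend in the encoded observation with a
        fixed gain (standing in for a learned/Kalman gain)."""
        x = torch.zeros(self.A.in_features)
        latents = []
        for t in range(y.shape[0]):
            x = self.A(x)                       # predict next latent state
            if mask[t]:                         # update only if observed
                x = 0.5 * x + 0.5 * self.encoder(y[t])
            latents.append(x)
        return torch.stack(latents)

    def forward(self, y, mask):
        x = self.filter(y, mask)
        return self.decoder(x), x

# Toy usage: fit to synthetic population activity with dropped timesteps.
T, n = 200, 50
y = torch.sin(torch.linspace(0, 20, T)).unsqueeze(1) * torch.randn(1, n) \
    + 0.1 * torch.randn(T, n)
mask = torch.rand(T) > 0.2                      # ~20% missing observations
model = LatentDynamicsModel(n_neurons=n)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(200):
    y_hat, _ = model(y, mask)
    loss = ((y_hat[mask] - y[mask]) ** 2).mean()  # score observed steps only
    opt.zero_grad()
    loss.backward()
    opt.step()
```

A smoothing (non-causal) variant would add a backward pass over the same latent trajectory; the masked loss here is what lets the model train despite missing neural observations.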