Brain control of a helping hand

Paralysed patients would benefit if their thoughts could become everyday actions. The demonstration that monkeys can use brain activity for precise control of an arm-like robot is a step towards that end.

Strokes, spinal-cord injuries and degenerative neuromuscular disease all cause damage that can severely compromise the ability of patients to use their muscles. The loss of mobility and independence that results from such motor deficits takes a devastating toll on their quality of life. Medical research is striving on many fronts to reverse the disease or injury state of such patients. Meanwhile, other approaches are needed to enhance their quality of life. Research described by Velliste et al.1, published on page 1098 of this issue, provides a heartening example of what, in due course, may be possible*.

Often, the patient's condition leaves intact parts of the cerebral cortex involved in voluntary motor control, including the primary motor cortex, premotor cortex and posterior parietal cortex. These patients are still able to produce the brain activity that would normally result in voluntary movements, but their condition prevents those signals from either getting to the muscles or activating them adequately. In such cases, one possible solution is to let the subjects think about what they would like to do as if they were mentally rehearsing the desired actions, record the resulting brain activity, and use those signals to control a robotic device. The development of such brain–machine interfaces (BMIs), or neuroprosthetic controllers, is being pursued in several laboratories.

Velliste et al.1 report one of the latest advances in this field. Using grids of fine electrodes implanted in the primary motor cortex of monkeys, they trained the animals to generate patterns of brain activity to control an anthropomorphic robot arm that had a shoulder joint, an elbow joint and a claw-like gripper 'hand'. Each animal sat with its arms gently restrained at its sides, with the robot arm positioned next to its shoulder (see Fig. 1a of the paper1). Remarkably, within a few days, the monkeys were able to make the robot reach out to a tasty treat such as a piece of fruit, stop, close the gripper on the treat, remove it from a small peg, bring the treat-laden gripper back to their mouth and open the gripper to eat the treat, all in one natural-looking motion.
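This article does not spell out how the recorded spiking activity is translated into robot motion, but one widely used approach in this line of research is population-vector decoding: each motor-cortex neuron fires most strongly for movements in its own 'preferred direction', and a rate-weighted sum of those preferred directions points along the intended movement. The following is a minimal illustrative sketch, not the authors' actual decoder; the neuron count, baseline rate and gain are hypothetical values chosen only to make the idea concrete.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: each neuron has a random unit "preferred
# direction" in 3-D space and fires most when the intended movement
# aligns with it (cosine tuning).
n_neurons = 100
preferred = rng.normal(size=(n_neurons, 3))
preferred /= np.linalg.norm(preferred, axis=1, keepdims=True)

baseline = 20.0   # resting firing rate, spikes/s (illustrative)
gain = 10.0       # depth of directional modulation (illustrative)

def simulate_rates(intended_velocity):
    """Cosine-tuned firing rates for a given intended movement direction."""
    return baseline + gain * preferred @ intended_velocity

def decode_velocity(rates):
    """Population-vector estimate: rate-weighted sum of preferred directions."""
    weights = rates - baseline          # modulation above/below baseline
    v = weights @ preferred             # vector sum across the population
    return v / np.linalg.norm(v)        # unit direction command for the robot

intended = np.array([1.0, 0.0, 0.0])    # e.g. a reach straight ahead
decoded = decode_velocity(simulate_rates(intended))
print(np.round(decoded, 2))
```

With even a hundred randomly tuned neurons, the decoded direction closely tracks the intended one; in practice such a velocity command is issued many times per second to steer the gripper continuously through space.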

This is the first reported demonstration of the use of BMI technology by subjects to perform a practical behavioural act — feeding themselves — via brain control of the motion of a robotic arm in three-dimensional space. It represents the current state of the art in the development of neuroprosthetic controllers for complex arm-like robots that could one day, in principle, help patients perform many everyday tasks such as eating, drinking from a glass or using a tool.

One encouraging finding was how readily the monkeys learned to control the robot. Velliste et al.1 used standard operant conditioning methods, in which each successful reach, grasp and retrieval was reinforced by consumption of the food reward. The initial training period was assisted by corrective signals generated by the BMI control program, but the monkeys quickly learned how to generate the brain activity that would produce the desired robot motions without any assistance. Learning could be even quicker in human subjects, facilitated by verbal instructions from a trainer. This also suggests that neuroprosthetic devices could minimize the frustration often felt by patients in current rehabilitation programmes when their diminished motor capacity results in only small performance gains despite prolonged, intense effort.

Equally encouraging was how naturally the monkeys controlled and interacted with the robot. They made curved trajectories of the gripper through space to avoid obstacles, made rapid corrections in the trajectory when the experimenter unexpectedly changed the location of the food morsel, and even used the gripper as a prop to push a loose treat from their lips into their mouth. The monkeys evidently adapted to the robot as a natural surrogate for their own immobile arm. Previous findings indicated that when monkeys learn to use a tool, it becomes incorporated into their own internal body image2. Patients who use neuroprosthetic devices for any period of time may come to regard them as natural extensions of their own body, because they can control them efficiently and relatively effortlessly through their own thought processes. This bodes well for the long-term psychological well-being of patients who must depend on such technology.

Velliste et al.1 have provided a promising further proof-of-concept of the potential of BMI technology to help neurological patients. However, we should not get carried away and leap to the conclusion that neuroprosthetic robots will soon be available at the local rehabilitation clinic. All of the main technology employed by Velliste et al. to use brain activity to control an arm-like robot has already been demonstrated with simpler remote devices, first in experimental animals3,4,5,6,7, and recently in human clinical subjects8. Their study offers no fundamental conceptual or technical advances to surmount several hurdles that must still be overcome to permit the widespread clinical application of neuroprosthetic control technology.

For example, the long-term reliability of the implantable electrodes must be improved. Patients will need to use this technology for many years, but the quality of the recorded neural activity often deteriorates within weeks or months. Furthermore, the success of neuroprosthetic control has been confined so far to the laboratory environment, because the current technology involves a sizeable array of relatively immobile recording, computer and robotic control hardware whose operation also requires the constant attention of a skilled technician. Much work remains to be done if neuroprosthetic controllers are to become portable and largely autonomous.

Moreover, to date, subjects have used only visual feedback to control remote devices. For physical interactions with the environment, the subjects must also be able to sense and control the forces exerted by the robot on any object or surface — so that, for instance, they can pick up an object with a strong enough grip to prevent it slipping from the robotic hand but not so strong as to crush it. This vital information is normally provided by sensory receptors in the skin, muscles and joints. The robots must be equipped with equivalent sensors, and some efficient means must be developed to deliver this sensory feedback9 to the patients. These and other technical issues are challenging, but not insurmountable.

Like many previous BMI studies3,4,5,6,7, Velliste et al.1 recorded from the primary motor cortex. Other studies have extracted potential control signals from the premotor cortex and the posterior parietal cortex7,10,11,12. Signals from each of these brain regions have unique properties that may make them particularly useful for different aspects of voluntary behaviour. This may lead to the eventual development of 'intelligent' neuroprosthetic controllers. Such controllers would allow patients with severe motor deficits to interact and communicate with the world not only by the moment-to-moment control of the motion of robotic devices, but also in a more natural and intuitive manner that reflects their overall goals, needs and preferences10,11.


*This News & Views article and the paper concerned1 were published online on 28 May 2008.


References

1. Velliste, M., Perel, S., Spalding, M. C., Whitford, A. S. & Schwartz, A. B. Nature 453, 1098–1101 (2008).
2. Maravita, A. & Iriki, A. Trends Cogn. Sci. 8, 79–86 (2004).
3. Chapin, J. K., Moxon, K. A., Markowitz, R. S. & Nicolelis, M. A. L. Nature Neurosci. 2, 664–670 (1999).
4. Wessberg, J. et al. Nature 408, 361–365 (2000).
5. Serruya, M. D., Hatsopoulos, N. G., Paninski, L., Fellows, M. R. & Donoghue, J. P. Nature 416, 141–142 (2002).
6. Taylor, D. M., Tillery, S. I. & Schwartz, A. B. Science 296, 1829–1832 (2002).
7. Carmena, J. M. et al. PLoS Biol. 1, e42 (2003).
8. Hochberg, L. R. et al. Nature 442, 164–171 (2006).
9. London, B. M., Jordan, L. R., Jackson, C. R. & Miller, L. E. IEEE Trans. Neural Syst. Rehabil. Eng. 16, 32–36 (2008).
10. Musallam, S., Corneil, B. D., Greger, B., Scherberger, H. & Andersen, R. A. Science 305, 258–262 (2004).
11. Hatsopoulos, N., Joshi, J. & O'Leary, J. G. J. Neurophysiol. 92, 1165–1174 (2004).
12. Santucci, D. M., Kralik, J. D., Lebedev, M. A. & Nicolelis, M. A. Eur. J. Neurosci. 22, 1529–1540 (2005).


Kalaska, J. Brain control of a helping hand. Nature 453, 994–995 (2008) doi:10.1038/nature06366
