Letter

Cortical control of a prosthetic arm for self-feeding

Nature volume 453, pages 1098–1101 (19 June 2008)


Abstract

Arm movement is well represented in populations of neurons recorded from the motor cortex [1–7]. Cortical activity patterns have been used in the new field of brain–machine interfaces [8–11] to show how cursors on computer displays can be moved in two- and three-dimensional space [12–22]. Although the ability to move a cursor can be useful in its own right, this technology could be applied to restore arm and hand function for amputees and paralysed persons. However, the use of cortical signals to control a multi-jointed prosthetic device for direct real-time interaction with the physical environment (‘embodiment’) has not been demonstrated. Here we describe a system that permits embodied prosthetic control; we show how monkeys (Macaca mulatta) use their motor cortical activity to control a mechanized arm replica in a self-feeding task. In addition to the three dimensions of movement, the subjects’ cortical signals also proportionally controlled a gripper on the end of the arm. Owing to the physical interaction between the monkey, the robotic arm and objects in the workspace, this new task presented a higher level of difficulty than previous virtual (cursor-control) experiments. Apart from an example of simple one-dimensional control [23], previous experiments have lacked physical interaction even in cases where a robotic arm [16,19,24] or hand [20] was included in the control loop, because the subjects did not use it to interact with physical objects—an interaction that cannot be fully simulated. This demonstration of multi-degree-of-freedom embodied prosthetic control paves the way towards the development of dexterous prosthetic devices that could ultimately achieve arm and hand function at a near-natural level.
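The proportional control described here builds on population coding of movement direction (refs 1–7, 15). The paper's actual extraction algorithm is detailed in its Supplementary Methods; purely as an illustrative sketch, the Python snippet below shows how a population-vector-style decoder could map the firing rates of cosine-tuned units to a four-dimensional control signal (3D endpoint velocity plus a gripper command). All names, unit counts, tuning parameters and the normalization scheme here are assumptions, not the authors' implementation.

    import numpy as np

    # Illustrative population-vector decoder (hypothetical parameters throughout;
    # the paper's actual algorithm is described in its Supplementary Methods).
    # Assumption: each recorded unit's firing rate is cosine-tuned to a preferred
    # direction in a 4D control space (x, y, z endpoint velocity + gripper), so a
    # rate-weighted sum of preferred directions yields a usable control vector.

    rng = np.random.default_rng(seed=0)

    N_UNITS = 100  # number of recorded cortical units (assumed)
    N_DIMS = 4     # x, y, z velocity plus gripper aperture command

    # Hypothetical preferred directions: one unit vector per cell.
    pref_dirs = rng.normal(size=(N_UNITS, N_DIMS))
    pref_dirs /= np.linalg.norm(pref_dirs, axis=1, keepdims=True)

    # Hypothetical calibration statistics for each unit (spikes per second).
    baseline = rng.uniform(5.0, 20.0, size=N_UNITS)    # mean firing rate
    modulation = rng.uniform(2.0, 10.0, size=N_UNITS)  # tuning depth

    def decode(rates):
        """Map one bin of firing rates to a 4D control vector.

        Each rate is normalized by its unit's baseline and modulation depth,
        then used to weight that unit's preferred direction.
        """
        weights = (rates - baseline) / modulation
        return pref_dirs.T @ weights / N_UNITS

    # Usage: decode a single 30-ms bin of spike counts.
    counts = rng.poisson(lam=0.5, size=N_UNITS)
    command = decode(counts / 0.030)
    print("endpoint velocity:", command[:3], "gripper:", command[3])

Under these assumptions, the decoded vector drives the arm's endpoint velocity and gripper aperture on each time bin; in a closed-loop setting the subject sees the arm move and can modulate its firing rates on subsequent bins to correct the trajectory.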


References

  1. Georgopoulos, A. P., Kalaska, J. F., Crutcher, M. D., Caminiti, R. & Massey, J. T. in Dynamic Aspects of Neocortical Function (eds Edelman, G. M., Gall, W. E. & Cowan, W. M.) 501–524 (Wiley & Sons, New York, 1984)

  2. Georgopoulos, A. P., Kettner, R. E. & Schwartz, A. B. Primate motor cortex and free arm movements to visual targets in three-dimensional space. II. Coding of the direction of movement by a neuronal population. J. Neurosci. 8, 2928–2937 (1988)

  3. Schwartz, A. B. Direct cortical representation of drawing. Science 265, 540–542 (1994)

  4. Schwartz, A. B. & Moran, D. W. Motor cortical activity during drawing movements: Population representation during lemniscate tracing. J. Neurophysiol. 82, 2705–2718 (1999)

  5. Moran, D. W. & Schwartz, A. B. Motor cortical activity during drawing movements: Population representation during spiral tracing. J. Neurophysiol. 82, 2693–2704 (1999)

  6. Moran, D. W. & Schwartz, A. B. Motor cortical representation of speed and direction during reaching. J. Neurophysiol. 82, 2676–2692 (1999)

  7. Wessberg, J. et al. Real-time prediction of hand trajectory by ensembles of cortical neurons in primates. Nature 408, 361–365 (2000)

  8. Wolpaw, J. R., Birbaumer, N., McFarland, D. J., Pfurtscheller, G. & Vaughan, T. M. Brain-computer interfaces for communication and control. Clin. Neurophysiol. 113, 767–791 (2002)

  9. Schwartz, A. B. Cortical neural prosthetics. Annu. Rev. Neurosci. 27, 487–507 (2004)

  10. Leuthardt, E. C., Schalk, G., Moran, D. & Ojemann, J. G. The emerging world of motor neuroprosthetics: A neurosurgical perspective. Neurosurgery 59, 1–14 (2006)

  11. Schwartz, A. B., Cui, X. T., Weber, D. J. & Moran, D. W. Brain-controlled interfaces: Movement restoration with neural prosthetics. Neuron 52, 205–220 (2006)

  12. Kennedy, P. R. & Bakay, R. A. E. Restoration of neural output from a paralyzed patient by a direct brain connection. Neuroreport 9, 1707–1711 (1998)

  13. Kennedy, P. R., Bakay, R. A. E., Moore, M. M., Adams, K. & Goldwaithe, J. Direct control of a computer from the human central nervous system. IEEE Trans. Rehabil. Eng. 8, 198–202 (2000)

  14. Serruya, M. D., Hatsopoulos, N. G., Paninski, L., Fellows, M. R. & Donoghue, J. P. Instant neural control of a movement signal. Nature 416, 141–142 (2002)

  15. Taylor, D. M., Tillery, S. I. H. & Schwartz, A. B. Direct cortical control of 3D neuroprosthetic devices. Science 296, 1829–1832 (2002)

  16. Carmena, J. M. et al. Learning to control a brain-machine interface for reaching and grasping by primates. PLoS Biol. 1, 193–208 (2003)

  17. Musallam, S., Corneil, B. D., Greger, B., Scherberger, H. & Andersen, R. A. Cognitive control signals for neural prosthetics. Science 305, 258–262 (2004)

  18. Wolpaw, J. R. & McFarland, D. J. Control of a two-dimensional movement signal by a noninvasive brain-computer interface in humans. Proc. Natl Acad. Sci. USA 101, 17849–17854 (2004)

  19. Lebedev, M. A. et al. Cortical ensemble adaptation to represent velocity of an artificial actuator controlled by a brain-machine interface. J. Neurosci. 25, 4681–4693 (2005)

  20. Hochberg, L. R. et al. Neuronal ensemble control of prosthetic devices by a human with tetraplegia. Nature 442, 164–171 (2006)

  21. Wahnoun, R., He, J. & Helms Tillery, S. I. Selection and parameterization of cortical neurons for neuroprosthetic control. J. Neural Eng. 3, 162–171 (2006)

  22. Santhanam, G., Ryu, S. I., Yu, B. M., Afshar, A. & Shenoy, K. V. A high-performance brain-computer interface. Nature 442, 195–198 (2006)

  23. Chapin, J. K., Moxon, K. A., Markowitz, R. S. & Nicolelis, M. A. L. Real-time control of a robot arm using simultaneously recorded neurons in the motor cortex. Nature Neurosci. 2, 664–670 (1999)

  24. Taylor, D. M., Tillery, S. I. H. & Schwartz, A. B. The general utility of a neuroprosthetic device under direct cortical control. Proc. 25th Annu. Int. Conf. IEEE EMBS 3, 2043–2046 (2003)

  25. Brockwell, A. E., Rojas, A. L. & Kass, R. E. Recursive Bayesian decoding of motor cortical signals by particle filtering. J. Neurophysiol. 91, 1899–1907 (2004)

  26. Sanchez, J. C., Erdogmus, D., Nicolelis, M. A. L., Wessberg, J. & Principe, J. C. Interpreting spatial and temporal neural activity through a recurrent neural network brain-machine interface. IEEE Trans. Neural Syst. Rehabil. Eng. 13, 213–219 (2005)

  27. Yu, B. M. et al. Mixture of trajectory models for neural decoding of goal-directed movements. J. Neurophysiol. 97, 3763–3780 (2007)

  28. Schwartz, A. B., Taylor, D. M. & Tillery, S. I. H. Extraction algorithms for cortical control of arm prosthetics. Curr. Opin. Neurobiol. 11, 701–707 (2001)

  29. Morasso, P. Spatial control of arm movements. Exp. Brain Res. 42, 223–227 (1981)

  30. Soechting, J. F. Effect of target size on spatial and temporal characteristics of a pointing movement in man. Exp. Brain Res. 54, 121–132 (1984)


Acknowledgements

We thank S. Clanton and M. Wu for help with software and hardware development, S. Chase for discussions and I. Albrecht for the illustration in Fig. 1a. Support contributed by NIH-NINDS-N01-2-2346 and NIH NS050256.

Author information

Affiliations

  1. Department of Neurobiology, School of Medicine, E1440 BST, Lothrop Street, University of Pittsburgh, Pittsburgh, Pennsylvania 15213, USA
     Meel Velliste & Andrew B. Schwartz
  2. Department of Bioengineering, 749 Benedum Hall, University of Pittsburgh, Pittsburgh, Pennsylvania 15261, USA
     Sagi Perel, M. Chance Spalding, Andrew S. Whitford & Andrew B. Schwartz
  3. Center for the Neural Basis of Cognition, University of Pittsburgh and Carnegie Mellon University, Pittsburgh, Pennsylvania 15213, USA
     Sagi Perel, M. Chance Spalding, Andrew S. Whitford & Andrew B. Schwartz
  4. Department of Physical Medicine and Rehabilitation, University of Pittsburgh, Pittsburgh, Pennsylvania 15213, USA
     Andrew B. Schwartz
  5. McGowan Institute for Regenerative Medicine, University of Pittsburgh, Pittsburgh, Pennsylvania 15219, USA
     Andrew B. Schwartz
  6. Robotics Institute, Carnegie Mellon University, Pittsburgh, Pennsylvania 15213, USA
     Andrew B. Schwartz

Authors

Meel Velliste, Sagi Perel, M. Chance Spalding, Andrew S. Whitford & Andrew B. Schwartz

Competing interests

The authors have applied for a patent on parts of the work described in this paper.

Corresponding author

Correspondence to Andrew B. Schwartz.

Supplementary information

PDF files

  1. Supplementary Information

     The file contains Supplementary Methods, Supplementary Data, Supplementary Tables 1–4, Supplementary Figures 1–13 with Legends, and Legends to Supplementary Movies 1–5.

Videos

  1. Supplementary Movie 1

     Continuous self-feeding by monkey A, showing 7 consecutive successful trials. The monkey's cortical control is 4-dimensional: 3 dimensions of endpoint control plus gripper control.

  2. Supplementary Movie 2

     Continuous self-feeding by monkey P, showing 6 consecutive trials (5 successful). The monkey's cortical control is 3-dimensional (endpoint control only). The gripper is controlled as a dimension dependent on endpoint movement: it opens when the arm moves forward and closes when the arm is held stable or moved backward.

  3. Supplementary Movie 3

     Target tracking. As the monkey reaches toward an initial target with the prosthetic arm, the target is displaced so that a direct move to the target would knock the food off the presentation device. The monkey then moves the arm endpoint in a curved path to avoid the collision and successfully obtains the food.

  4. Supplementary Movie 4

     An emergent behaviour: finger licking. When a target is presented, the monkey ignores it and instead moves the arm so that it can lick the gripper fingers. This emergent behaviour is outside the task requirements and is a result of embodied control.

  5. Supplementary Movie 5

     An emergent behaviour: using the arm to push food into the mouth. When a marshmallow ends up barely between the monkey's lips at the end of a successful reach and retrieval, the animal is unable to get the food into its mouth without a helping "hand", so it uses the robotic arm to push the food into its mouth.

About this article


DOI

https://doi.org/10.1038/nature06996

