Arm movement is well represented in populations of neurons recorded from the motor cortex [1–7]. Cortical activity patterns have been used in the new field of brain–machine interfaces [8–11] to show how cursors on computer displays can be moved in two- and three-dimensional space [12–22]. Although the ability to move a cursor can be useful in its own right, this technology could be applied to restore arm and hand function for amputees and paralysed persons. However, the use of cortical signals to control a multi-jointed prosthetic device for direct real-time interaction with the physical environment (‘embodiment’) has not been demonstrated. Here we describe a system that permits embodied prosthetic control; we show how monkeys (Macaca mulatta) use their motor cortical activity to control a mechanized arm replica in a self-feeding task. In addition to the three dimensions of movement, the subjects’ cortical signals also proportionally controlled a gripper on the end of the arm. Owing to the physical interaction between the monkey, the robotic arm and objects in the workspace, this new task presented a higher level of difficulty than previous virtual (cursor-control) experiments. Apart from an example of simple one-dimensional control [23], previous experiments have lacked physical interaction even in cases where a robotic arm [16, 19, 24] or hand [20] was included in the control loop, because the subjects did not use it to interact with physical objects—an interaction that cannot be fully simulated. This demonstration of multi-degree-of-freedom embodied prosthetic control paves the way towards the development of dexterous prosthetic devices that could ultimately achieve arm and hand function at a near-natural level.
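The proportional mapping from cortical activity to a four-dimensional command (three endpoint velocities plus gripper aperture) can be illustrated with a population-vector-style decoder, a standard approach in this line of work. The sketch below is illustrative, not the authors' exact algorithm: the preferred directions, baseline rate and modulation depth are all assumed values, and each unit's normalized firing rate simply "votes" along its preferred direction.

```python
import numpy as np

# Illustrative population-vector decoder (assumed parameters, not the
# authors' exact method). Each recorded unit i has a preferred direction
# p_i in 4-D command space (x, y, z, gripper); its normalized rate votes
# along p_i, and the votes are averaged into a velocity command.

rng = np.random.default_rng(0)
n_units, n_dims = 100, 4

# Hypothetical preferred directions, one unit-length row per neuron.
P = rng.normal(size=(n_units, n_dims))
P /= np.linalg.norm(P, axis=1, keepdims=True)

def decode(rates, baseline, modulation):
    """Map a vector of firing rates to a 4-D velocity command."""
    norm = (rates - baseline) / modulation  # normalized deviation per unit
    return norm @ P / n_units               # population vector (x, y, z, grip)

# Simulate rates for an intended command via cosine tuning, then decode.
intended = np.array([1.0, 0.0, 0.5, -0.5])
baseline, modulation = 20.0, 10.0        # assumed spikes/s values
rates = baseline + modulation * (P @ intended)
v = decode(rates, baseline, modulation)
```

With enough roughly uniformly distributed units, the decoded vector `v` points in approximately the intended direction (its magnitude is scaled, so real-time systems apply a gain). This is the intuition behind proportional multi-degree-of-freedom control: no discrete classification step, just a continuous readout of the population.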
We thank S. Clanton and M. Wu for help with software and hardware development, S. Chase for discussions and I. Albrecht for the illustration in Fig. 1a. This work was supported by NIH-NINDS-N01-2-2346 and NIH NS050256.
The file contains Supplementary Movie 1: continuous self-feeding by monkey A, showing 7 consecutive successful trials. The monkey's cortical control is four-dimensional, comprising 3 dimensions of endpoint control plus gripper control.
The file contains Supplementary Movie 2: continuous self-feeding by monkey P, showing 6 consecutive trials (5 successful). The monkey's cortical control is three-dimensional (endpoint control only). The gripper is controlled as a dimension dependent on endpoint movement: it opens when the arm moves forward and closes when the arm is held stable or moved backward.
The file contains Supplementary Movie 3: target tracking. As the monkey makes a reach toward an initial target with the prosthetic arm, the target is displaced so that a direct move to the target would knock the food off the presentation device. The monkey then moves the arm endpoint in a curved path to avoid the collision, and successfully obtains the food.
The file contains Supplementary Movie 4: an emergent behaviour, finger licking. When a target is presented, the monkey ignores the target and instead moves the arm so as to be able to lick the gripper fingers. This emergent behaviour is outside the task requirements and is a result of embodied control.
The file contains Supplementary Movie 5: an emergent behaviour, using the arm to push food into the mouth. When a marshmallow ends up barely between the monkey's lips at the end of a successful reach and retrieval, the animal is unable to get the food into its mouth without a helping "hand", so it uses the robotic arm to push the food into its mouth.