
Monkey brains 'feel' virtual objects

Macaques use a brain-controlled virtual hand to identify the artificial texture of objects.

Monkeys have been able to use virtual limbs to 'feel' virtual objects. Credit: Katie Zhuang

An international team of researchers has developed a brain implant that enables monkeys to examine virtual objects by means of a virtual arm controlled by their brain. The device represents the next step towards the development of prosthetic limbs or robotic suits that would allow users to interact with their world without depending entirely on visual feedback.

The researchers, led by Miguel Nicolelis of Duke University in Durham, North Carolina, inserted electrodes into the motor cortex and somatosensory cortex of two monkeys. The motor cortex is the brain region involved in performing voluntary movement, whereas the somatosensory cortex processes input received from cells in the body that are sensitive to, among other sensory experiences, touch.

The monkeys were trained to use only their brain to explore virtual objects on a computer screen by moving a virtual image of an arm. Electrodes in the motor cortex recorded the monkeys' intentions to move and relayed that information to the virtual world. As the virtual hand passed over objects on the screen, electrical signals were fed into the animal's somatosensory cortex, providing 'tactile' feedback.

In a task involving a choice between two visually identical objects, the monkeys were able to distinguish between a reward-producing object, which was associated with an electrical stimulation when 'touched', and an object that produced neither electrical stimulation nor a treat. This shows that the brain can decode information about the sense of touch without any stimulation of the animal's skin, says Nicolelis. His team reports the results today in Nature¹.

"We don't know what the animals perceived, but it was a sensation that was created artificially by linking the virtual fingers to the brain directly," he says.

Grasping reality

A major challenge, the authors say, was to keep the sensory input and the motor output from interfering with each other, because the recording and stimulating electrodes were placed in connected brain tissue. The researchers solved the problem by alternating between a situation in which the brain–machine–brain interface was stimulating the brain and one in which motor cortex activity was recorded; half of every 100 milliseconds was devoted to each process.
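The alternation described above is a form of time-division multiplexing. The following is a minimal illustrative sketch of that scheduling idea, not the authors' implementation: all function names, and the exact split of each 100-millisecond cycle into a 50 ms recording window and a 50 ms stimulation window, are assumptions made for the example.

```python
# Hypothetical sketch of a time-multiplexed brain-machine-brain loop:
# the first half of each 100 ms cycle records motor-cortex activity and
# moves the virtual arm; the second half delivers 'tactile' stimulation
# when the virtual hand is over an object. Names and timings are
# illustrative assumptions, not the published protocol.

CYCLE_MS = 100   # full cycle length (assumed from the article)
RECORD_MS = 50   # first half devoted to recording

def run_tick(t_ms, decode_intent, hand_over_object, move_arm, stimulate):
    """Dispatch one millisecond tick to the recording or stimulation phase."""
    phase = t_ms % CYCLE_MS
    if phase < RECORD_MS:
        # Recording half-cycle: read motor intent, update the virtual arm.
        move_arm(decode_intent())
        return "record"
    # Stimulation half-cycle: feed back 'touch' only if over an object.
    if hand_over_object():
        stimulate()
    return "stimulate"
```

Keeping the two phases in disjoint time windows is what prevents the stimulation artefacts from corrupting the motor recordings, at the cost of halving the time available to each process.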

"This enforces some constraints on the exchange of information between the sensory and motor areas," says neuroscientist Stefano Panzeri of the Italian Institute of Technology in Genoa, who was not involved in the study. But because the animals learn to use the information, this experiment shows that the brain can exchange information even under these constraints, he explains.

This bidirectional communication is a critical step in the development of brain–machine interfaces, says Rodrigo Quian Quiroga, a neuroscientist at the University of Leicester, UK who was also not involved in the study. Previous brain–machine interfaces have relied on visual feedback, a less-than-ideal situation for someone trying to use a robotic prosthetic, he says. "If you want to reach and grasp a glass, visual feedback won't help you," says Quian Quiroga. "It's the sensory feedback that tells you if you have a good grip or if you are about to drop it."

Sensory information will be a necessary component of Nicolelis's grand goal and that of the Walk Again Project, an international effort in which he is involved: to build an exoskeleton suit that can restore mobility to patients who are severely paralysed.

"This is going to be essential for the clinical application that we want to create and test within the next three years," says Nicolelis. He hopes that, with the help of collaborators in the Walk Again Project, a suit driven by a brain–machine interface will be demonstrated at the 2014 World Cup in Brazil, Nicolelis' homeland, with the opening kick of the ball being delivered by a young Brazilian with quadriplegia.

1. O'Doherty, J. E. et al. Nature advance online publication (2011).




Cite this article

Young, S. Monkey brains 'feel' virtual objects. Nature (2011).
