Letter

Active tactile exploration using a brain–machine–brain interface

Nature volume 479, pages 228–231 (10 November 2011)


Brain–machine interfaces [1, 2] use neuronal activity recorded from the brain to establish direct communication with external actuators, such as prosthetic arms. It is hoped that brain–machine interfaces can be used to restore the normal sensorimotor functions of the limbs, but so far they have lacked tactile sensation. Here we report the operation of a brain–machine–brain interface (BMBI) that both controls the exploratory reaching movements of an actuator and allows signalling of artificial tactile feedback through intracortical microstimulation (ICMS) of the primary somatosensory cortex. Monkeys performed an active exploration task in which an actuator (a computer cursor or a virtual-reality arm) was moved using a BMBI that derived motor commands from neuronal ensemble activity recorded in the primary motor cortex. ICMS feedback occurred whenever the actuator touched virtual objects. Temporal patterns of ICMS encoded the artificial tactile properties of each object. Neuronal recordings and ICMS epochs were temporally multiplexed to avoid interference. Two monkeys operated this BMBI to search for and distinguish one of three visually identical objects, using the virtual-reality arm to identify the unique artificial texture associated with each. These results suggest that clinical motor neuroprostheses might benefit from the addition of ICMS feedback to generate artificial somatic perceptions associated with mechanical, robotic or even virtual prostheses.
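The temporal multiplexing mentioned in the abstract can be illustrated with a minimal scheduling sketch. All specifics here (the cycle length, the record/stimulate split, and the texture patterns) are hypothetical placeholders for illustration, not the parameters used in the study:

```python
# Hypothetical sketch of temporally multiplexing neuronal recording and
# ICMS delivery: each cycle is split so that stimulation never overlaps
# the decoding window, avoiding artefacts in the recorded signal.
# CYCLE_MS, RECORD_MS and the texture patterns are assumed values.

CYCLE_MS = 100   # one multiplexing cycle (assumed)
RECORD_MS = 50   # first half of each cycle: record M1 spikes for decoding

# Illustrative "artificial textures": ICMS pulse trains with distinct
# temporal patterns, one per visually identical virtual object.
TEXTURES = {
    "object_A": {"pulses_per_window": 10},  # high-frequency pattern
    "object_B": {"pulses_per_window": 5},   # low-frequency pattern
    "object_C": {"pulses_per_window": 0},   # no stimulation
}

def schedule_cycle(t_ms, touching):
    """Return the (phase, start_ms, end_ms, n_pulses) events for the
    multiplexing cycle beginning at t_ms.

    The 'record' phase covers [t, t + RECORD_MS); the 'stimulate' phase
    covers the remainder of the cycle. ICMS pulses are emitted only in
    the stimulate phase, and only while the actuator touches an object.
    """
    events = [("record", t_ms, t_ms + RECORD_MS, 0)]
    n = TEXTURES[touching]["pulses_per_window"] if touching else 0
    events.append(("stimulate", t_ms + RECORD_MS, t_ms + CYCLE_MS, n))
    return events

# One cycle while touching object_A: decode first, then a 10-pulse train.
cycle = schedule_cycle(0, "object_A")
```

Because the record and stimulate windows never overlap, the decoder sees artefact-free spiking data while the monkey still receives a texture-specific pulse train on every cycle in which the actuator is in contact with an object.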



References

  1. Brain-machine interfaces: past, present and future. Trends Neurosci. 29, 536–546 (2006)
  2. Principles of neural ensemble physiology underlying the operation of brain-machine interfaces. Nature Rev. Neurosci. 10, 530–540 (2009)
  3. Real-time control of a robot arm using simultaneously recorded neurons in the motor cortex. Nature Neurosci. 2, 664–670 (1999)
  4. Cortical control of a prosthetic arm for self-feeding. Nature 453, 1098–1101 (2008)
  5. Direct control of paralysed muscles by cortical neurons. Nature 456, 639–642 (2008)
  6. Real-time prediction of hand trajectory by ensembles of cortical neurons in primates. Nature 408, 361–365 (2000)
  7. Direct cortical control of 3D neuroprosthetic devices. Science 296, 1829–1832 (2002)
  8. Instant neural control of a movement signal. Nature 416, 141–142 (2002)
  9. Learning to control a brain-machine interface for reaching and grasping by primates. PLoS Biol. 1, e42 (2003)
  10. Roles of glabrous skin receptors and sensorimotor memory in automatic control of precision grip when lifting rougher or more slippery objects. Exp. Brain Res. 56, 550–564 (1984)
  11. Modulation of grip force with load force during point-to-point arm movements. Exp. Brain Res. 95, 131–143 (1993)
  12. The neural basis of haptic object processing. Can. J. Exp. Psychol. 61, 219–229 (2007)
  13. Aggarwal, V., Ramos, A., Acharya, S. & Thakor, N. V. A brain-computer interface with vibrotactile biofeedback for haptic information. J. Neuroeng. Rehabil. 4, 40 (2007)
  14. Electrotactile and vibrotactile displays for sensory substitution systems. IEEE Trans. Biomed. Eng. 38, 1–16 (1991)
  15. Sensory capacity of reinnervated skin after redirection of amputated upper limb nerves to the chest. Brain 132, 1441–1448 (2009)
  16. A brain-machine interface instructed by direct intracortical microstimulation. Front. Integr. Neurosci. 3, 20 (2009)
  17. Stimulation of human somatosensory cortex: tactile and body displacement perceptions in medial regions. Exp. Brain Res. 93, 173–176 (1993)
  18. Electrical stimulation of the proprioceptive cortex (area 3a) used to instruct a behaving monkey. IEEE Trans. Neural Syst. Rehabil. Eng. 16, 32–36 (2008)
  19. Somatosensory discrimination based on cortical microstimulation. Nature 392, 387–390 (1998)
  20. Primate reaching cued by multichannel spatiotemporal cortical microstimulation. J. Neurosci. 27, 5593–5602 (2007)
  21. Hand movements: a window into haptic object recognition. Cognit. Psychol. 19, 342–368 (1987)
  22. Cortical ensemble adaptation to represent velocity of an artificial actuator controlled by a brain-machine interface. J. Neurosci. 25, 4681–4693 (2005)
  23. Unscented Kalman filter for brain-machine interfaces. PLoS ONE 4, e6243 (2009)
  24. Vibration-entrained and premovement activity in monkey primary somatosensory cortex. J. Neurophysiol. 72, 1654–1673 (1994)
  25. Neuronal activity in primary motor cortex differs when monkeys perform somatosensory and visually guided wrist movements. Exp. Brain Res. 167, 571–586 (2005)
  26. Neural correlates of mental rehearsal in dorsal premotor cortex. Nature 431, 993–996 (2004)
  27. Coding the location of the arm by sight. Science 290, 1782–1786 (2000)
  28. Tools for the body (schema). Trends Cogn. Sci. 8, 79–86 (2004)
  29. Observation-based learning for brain-machine interfaces. Curr. Opin. Neurobiol. 18, 589–594 (2008)
  30. Neurons in primary motor cortex engaged during action observation. Eur. J. Neurosci. 31, 386–398 (2010)


Acknowledgements

We thank D. Dimitrov for conducting the animal neurosurgeries; G. Lehew and J. Meloy for building brain implants; J. Fruh for rendering the virtual-reality monkey arm; T. Phillips, L. Oliveira and S. Halkiotis for technical support; and E. Thomson and Z. Li for comments. This research was supported by DARPA (award N66001-06-C-2019), TATRC (award W81XWH-08-2-0119), the NIH (award NS073125), NICHD/OD (award RC1HD063390) and NIH Director’s Pioneer Award DP1OD006798, to M.A.L.N. The content is solely the responsibility of the authors and does not necessarily represent the official views of the Office of the NIH Director or the NIH.

Author information


  1. Department of Biomedical Engineering, Duke University, Durham, North Carolina 27708, USA

    • Joseph E. O’Doherty
    • , Peter J. Ifft
    • , Katie Z. Zhuang
    •  & Miguel A. L. Nicolelis
  2. Center for Neuroengineering, Duke University, Durham, North Carolina 27710, USA

    • Joseph E. O’Doherty
    • , Mikhail A. Lebedev
    • , Peter J. Ifft
    • , Katie Z. Zhuang
    •  & Miguel A. L. Nicolelis
  3. Department of Neurobiology, Duke University, Durham, North Carolina 27710, USA

    • Mikhail A. Lebedev
    •  & Miguel A. L. Nicolelis
  4. STI IMT, École Polytechnique Fédérale de Lausanne, Lausanne CH1015, Switzerland

    • Solaiman Shokur
    •  & Hannes Bleuler
  5. Department of Psychology and Neuroscience, Duke University, Durham, North Carolina 27708, USA

    • Miguel A. L. Nicolelis
  6. Edmond and Lily Safra International Institute of Neuroscience of Natal, Natal 59066-060, Brazil

    • Miguel A. L. Nicolelis




J.E.O’D., M.A.L. and M.A.L.N. designed experiments, analysed data and wrote the paper; J.E.O’D., M.A.L., P.J.I. and K.Z.Z. conducted experiments; and S.S. and H.B. developed the virtual-reality monkey arm.

Competing interests

The authors declare no competing financial interests.

Corresponding author

Correspondence to Miguel A. L. Nicolelis.

Supplementary information

PDF files

  1.

    Supplementary Figures

    This file contains Supplementary Figures 1-7 with legends.


  1.

    Supplementary Movie 1

This movie sequence depicts monkey M performing task IV in HC mode. Annotations are provided during the freeze frames. The audio track is the spiking activity of a single M1 neuron; ICMS pulses can be heard, as can the completion of a trial and the delivery of a juice reward. Please note that the monkeys could not hear the ICMS pulses or neural activity; these sounds are included for the benefit of the viewer.

  2.

    Supplementary Movie 2

    This movie depicts the same sequence as Supplementary Movie 1 but for BC. Please note that the monkeys could not hear the ICMS pulses or neural activity; these sounds are included for the benefit of the viewer.

  3.

    Supplementary Movie 3

    This movie sequence depicts monkey M performing task V in HC. Please note that the monkeys could not hear the ICMS pulses or neural activity; these sounds are included for the benefit of the viewer.

  4.

    Supplementary Movie 4

    This movie depicts the same sequence as Supplementary Movie 3 but for BC. Please note that the monkeys could not hear the ICMS pulses or neural activity; these sounds are included for the benefit of the viewer.
