Proprioception—the sense of the body's position in space—is important to natural movement planning and execution and will likewise be necessary for successful motor prostheses and brain–machine interfaces (BMIs). Here we demonstrate that monkeys were able to learn to use an initially unfamiliar multichannel intracortical microstimulation signal, which provided continuous information about hand position relative to an unseen target, to complete accurate reaches. Furthermore, monkeys combined this artificial signal with vision to form an optimal, minimum-variance estimate of relative hand position. These results demonstrate that a learning-based approach can be used to provide a rich artificial sensory feedback signal, suggesting a new strategy for restoring proprioception to patients using BMIs, as well as a powerful new tool for studying the adaptive mechanisms of sensory integration.
We thank M.R. Fellows for initial behavioral training; R.R. Torres for suggesting the 2AFC task for ICMS detection; A. Leggitt for help with data analysis; K.B. Andrews and K. MacLeod for animal-related support; and J.G. Makin, A. Yazdan-Shahmorad and T.L. Hanson for discussion and comments on the manuscript. This research was supported by the Defense Advanced Research Projects Agency (DARPA) Reorganization and Plasticity to Accelerate Injury Recovery program (REPAIR; N66001-10-C-2010) and the US National Institutes of Health (NEI grants EY015679 and EY007120).
The authors declare no competing financial interests.
Integrated supplementary information
Animals sit in a virtual reality environment without direct view of the arm. A mirror reflects images from a rear-projection screen and is adjusted so that visual cues appear in the horizontal plane of the reaching hand. Hand position is tracked electromagnetically (Polhemus Liberty, Colchester, VT), and feedback about the position of the hand relative to the target is delivered via a random-dot visual flow field (inset) or via patterned ICMS.
Location of electrode arrays within S1 (right) and example neuronal receptive fields (left) for Monkey D (top) and Monkey F (bottom). Colored circles (right) indicate array locations corresponding to the matching colored receptive fields (left). Neurons responded mainly to light touch; circles with dark borders correspond to cells that responded to limb movements (active and passive).
Behavioral performance measures are shown as a function of the cumulative number of VIS+ICMS trials performed (training and testing) for Monkey D. The data, collected during testing sessions, were smoothed for clarity (Gaussian window with a standard deviation of 2.8 training sessions, corresponding to approximately 2,500 training trials for Monkey D). The visual coherence on training trials was decreased across training sessions (indicated by gray bars at the bottom of the figure and vertical gray lines at the transitions). The thin green line (left) marks the onset of ICMS-only trials, in which target sizes were temporarily larger than in the other trial conditions; the thick green line (right) marks the beginning of ICMS-only trials with targets of standard size. (a) Percentage of correct trials. (b) Number of movement segments, a measure of online error corrections. (c) Movement time, normalized by the initial distance to the reach target. (d) Path length, normalized as in c. See Supplementary Table 1 for additional details of the training and testing schedule.
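The Gaussian-window smoothing described above can be sketched as follows. This is an illustrative example, not the authors' analysis code; the per-session data here are synthetic, and the only detail taken from the legend is the window's standard deviation of 2.8 sessions.

```python
import numpy as np

def gaussian_smooth(y, sigma):
    """Smooth a per-session metric with a normalized Gaussian window.

    Each output point is a weighted average of all sessions, with
    weights falling off as a Gaussian of width `sigma` (in sessions).
    """
    x = np.arange(len(y))
    out = np.empty(len(y), dtype=float)
    for i in range(len(y)):
        w = np.exp(-0.5 * ((x - i) / sigma) ** 2)
        out[i] = np.sum(w * y) / np.sum(w)
    return out

# Synthetic percent-correct values for eight testing sessions
pct_correct = np.array([60, 65, 58, 72, 70, 75, 80, 78], dtype=float)
smoothed = gaussian_smooth(pct_correct, sigma=2.8)
```

Because each smoothed value is a convex combination of the raw values, the smoothed curve stays within the range of the original data while suppressing session-to-session noise.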
Sample movement paths from randomly selected successful trials for Monkeys D (a) and F (b) for seven feedback conditions. Each reach begins at the fixed central starting point and ends within the unseen reach target (here depicted in gray for clarity).
(a) Standard deviation of initial angle (relative to target angle) for the different trial types and visual coherences. Plots follow the same conventions as Figure 6a in the main text. This panel demonstrates that the qualitative results of the main text—in particular the good correspondence with the minimum variance model of sensory integration—are not an artifact of subtracting the smoothed estimates of mean initial angle used there. Error bars denote standard error of the mean. (b) Smoothed mean initial angle for Monkey F, relative to the target angle. Monkey F did not exhibit the marked differences in mean initial angle across feedback types that were observed for Monkey D (Figure 6b, main text). (c) Visual cue weighting for Monkey F in the VIS+ICMS trials, as a function of dot-field coherence. The results are consistent with the minimum variance model, although the analysis has poor statistical power owing to the similarity in mean initial angle across feedback types. Blue filled circles: visual cue weighting estimated from the data; black open circles: minimum variance model prediction; error bars: bootstrapped estimates of standard error.
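The minimum variance model referred to in this legend weights each cue inversely to its variance. A minimal sketch of that computation is below; it is not the authors' code, and the estimates and variances used in the usage example are hypothetical numbers chosen for illustration.

```python
def min_variance_weight(var_vis, var_icms):
    """Optimal weight on the visual cue when combining it with ICMS.

    The minimum-variance (maximum-likelihood) combination of two
    unbiased cues weights each cue by the inverse of its variance.
    """
    return var_icms / (var_vis + var_icms)

def combine(est_vis, est_icms, var_vis, var_icms):
    """Return the minimum-variance combined estimate and its variance."""
    w = min_variance_weight(var_vis, var_icms)
    est = w * est_vis + (1 - w) * est_icms
    # The combined variance is smaller than either single-cue variance.
    var = (var_vis * var_icms) / (var_vis + var_icms)
    return est, var

# Hypothetical single-cue estimates of initial angle (degrees) and variances
est, var = combine(est_vis=10.0, est_icms=4.0, var_vis=2.0, var_icms=6.0)
# est -> 8.5 (weight 0.75 on the more reliable visual cue), var -> 1.5
```

The key prediction tested in panel (c) follows directly from `min_variance_weight`: as dot-field coherence drops and visual variance grows, the weight on vision should fall toward zero.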
Supplementary Figures 1–5 and Supplementary Tables 1–4 (PDF 1044 kb)
This video is generated from behavioral data collected on 29 December 2013. The left panel shows a simulated version of a portion of the virtual reality environment that the monkey viewed during the trial: position of the fingertip (filled white circle), dot field (here displayed at 100% coherence for clarity; the coherence presented to the monkey for this trial was 50%), and start target (open green circle, radius 10 mm). The right panel shows the patterns of ICMS delivered during the trial, where each vertical line denotes a pulse of stimulation from an electrode with a preferred direction indicated by the corresponding red arrow at left. Stimulation rasters shown have been subsampled for clarity. In the video, the monkey is shown first acquiring the start target. After an instructed delay interval, during which VIS+ICMS information about the instructed movement vector becomes available, a go cue sounds (noted by text), and the monkey completes the reach to the unseen, 12 mm radius reach target (illustrated here with a dashed white circle). (MP4 872 kb)
This video is generated from behavioral data collected on 29 December 2013. The left panel shows a simulated version of a portion of the virtual reality environment that the monkey viewed during the trial: position of the fingertip (filled white circle), dot field (here displayed at 100% coherence for clarity; the coherence presented to the monkey for this trial was 50%), and start target (open green circle, radius 10 mm). The right panel shows the patterns of ICMS delivered during the trial, where each vertical line denotes a pulse of stimulation from an electrode with a preferred direction indicated by the corresponding red arrow at left. Stimulation rasters shown have been subsampled for clarity. In the video, the monkey is shown first acquiring the start target. After an instructed delay interval, during which ICMS information about the instructed movement vector becomes available, a go cue sounds (noted by text), and the monkey completes the reach to the unseen, 12 mm radius reach target (illustrated here with a dashed white circle). (MP4 121 kb)
Cite this article
Dadarlat, M., O'Doherty, J. & Sabes, P. A learning-based approach to artificial sensory feedback leads to optimal integration. Nat Neurosci 18, 138–144 (2015). https://doi.org/10.1038/nn.3883