
A learning-based approach to artificial sensory feedback leads to optimal integration

Abstract

Proprioception—the sense of the body's position in space—is important to natural movement planning and execution and will likewise be necessary for successful motor prostheses and brain–machine interfaces (BMIs). Here we demonstrate that monkeys were able to learn to use an initially unfamiliar multichannel intracortical microstimulation signal, which provided continuous information about hand position relative to an unseen target, to complete accurate reaches. Furthermore, monkeys combined this artificial signal with vision to form an optimal, minimum-variance estimate of relative hand position. These results demonstrate that a learning-based approach can be used to provide a rich artificial sensory feedback signal, suggesting a new strategy for restoring proprioception to patients using BMIs, as well as a powerful new tool for studying the adaptive mechanisms of sensory integration.
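The minimum-variance estimate described above follows the standard inverse-variance weighting rule for combining two independent, unbiased cues. A minimal sketch of that rule (the function and variable names are illustrative, not taken from the paper):

```python
def min_variance_combine(x_vis, var_vis, x_icms, var_icms):
    """Combine two noisy position estimates by inverse-variance weighting.

    The minimum-variance (maximum-likelihood) combination weights each cue
    by the inverse of its variance; the combined estimate always has lower
    variance than either single cue alone.
    """
    w_vis = (1.0 / var_vis) / (1.0 / var_vis + 1.0 / var_icms)
    x_hat = w_vis * x_vis + (1.0 - w_vis) * x_icms
    var_hat = 1.0 / (1.0 / var_vis + 1.0 / var_icms)
    return x_hat, var_hat

# Example: a reliable visual cue (variance 1) dominates a noisier
# ICMS cue (variance 3); the combined variance falls below both.
x_hat, var_hat = min_variance_combine(x_vis=0.0, var_vis=1.0,
                                      x_icms=4.0, var_icms=3.0)
# x_hat = 1.0, var_hat = 0.75
```

A cue's weight therefore tracks its reliability: degrading one cue (e.g. lowering visual coherence) shifts weight toward the other, which is the signature behavior the study tests for.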


Figure 1: Behavioral task and sensory feedback.
Figure 2: Comparison of task performance across sensory feedback conditions.
Figure 3: Evolution of performance over training (monkey F).
Figure 4: Monkeys estimate both target distance and direction from sensory feedback.
Figure 5: Directed error correction.
Figure 6: Integration of vision and ICMS minimizes reach variance.


Acknowledgements

We thank M.R. Fellows for initial behavioral training; R.R. Torres for suggesting the 2AFC task for ICMS detection; A. Leggitt for help with data analysis; K.B. Andrews and K. MacLeod for animal-related support; and J.G. Makin, A. Yazdan-Shahmorad and T.L. Hanson for discussion and comments on the manuscript. This research was supported by the Defense Advanced Research Projects Agency (DARPA) Reorganization and Plasticity to Accelerate Injury Recovery (REPAIR; N66001-10-C-2010) and the US National Institutes of Health NEI (EY015679, EY007120).

Author information


Contributions

M.C.D. and P.N.S. designed the experiments; M.C.D. and J.E.O. developed and tested multielectrode stimulation capabilities, including behavioral validation; M.C.D. performed the experiments; M.C.D. and P.N.S. analyzed the data; M.C.D., P.N.S. and J.E.O. wrote the manuscript.

Corresponding author

Correspondence to Philip N Sabes.

Ethics declarations

Competing interests

The authors declare no competing financial interests.

Integrated supplementary information

Supplementary Figure 1 Virtual reality environment.

Animals sit in a virtual reality environment without direct view of the arm. A mirror reflects images from a rear-projection screen and is adjusted so that visual cues appear in the horizontal plane of the reaching hand. Hand position was tracked electromagnetically (Polhemus Liberty, Colchester, VT), and feedback about the position of the hand relative to the target was delivered via a random-dot visual flow field (inset) or via patterned ICMS.

Supplementary Figure 2 Physiological properties of stimulated somatosensory cortex.

Location of electrode arrays within S1 (right) and example neuronal receptive fields (left) for Monkey D (top) and Monkey F (bottom). Colored circles (right) indicate array locations corresponding to the matching colored receptive fields (left). Neurons responded mainly to light touch; circles with dark borders correspond to cells that responded to limb movements (active and passive).

Supplementary Figure 3 Evolution of performance over training.

Behavioral performance measures are shown as a function of the cumulative number of VIS+ICMS trials performed (training and testing) for Monkey D. The data, collected during testing sessions, were smoothed for clarity (Gaussian window with a standard deviation of 2.8 training sessions, corresponding to approximately 2,500 training trials for Monkey D). The visual coherence on training trials was decreased across training sessions (indicated by gray bars at the bottom of the figure and vertical gray lines at the transitions). The left, thin green line denotes the onset of ICMS-only trials, during which target sizes were temporarily larger than in the other trial conditions; the right, thick green line denotes the beginning of ICMS-only trials with targets of standard size. (a) Percent correct trials. (b) Number of movement segments, a measure of online error corrections. (c) Movement time, normalized by the initial distance to the reach target. (d) Path length, normalized as in c. See Supplementary Table 1 for additional details on the training and testing schedule.
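The normalized metrics in panels (c) and (d) can be computed directly from the recorded hand path. A minimal sketch, assuming the path is sampled as a sequence of 2D points (the function name and example values are illustrative, not from the paper):

```python
import numpy as np

def normalized_path_length(path, target):
    """Total distance traveled along the reach, divided by the initial
    straight-line distance to the target (1.0 = perfectly straight)."""
    path = np.asarray(path, dtype=float)
    steps = np.diff(path, axis=0)          # displacement between samples
    traveled = np.linalg.norm(steps, axis=1).sum()
    initial_distance = np.linalg.norm(np.asarray(target, dtype=float) - path[0])
    return traveled / initial_distance

# A straight reach scores exactly 1.0; detours and corrections raise it.
straight = [[0.0, 0.0], [0.0, 1.0], [0.0, 2.0]]
detour = [[0.0, 0.0], [1.0, 1.0], [0.0, 2.0]]
print(normalized_path_length(straight, target=[0.0, 2.0]))  # 1.0
print(normalized_path_length(detour, target=[0.0, 2.0]))    # > 1.0
```

Movement time would be normalized the same way: trial duration divided by the initial target distance, so that longer reaches are not penalized.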

Supplementary Figure 4 Sample movement paths.

Sample movement paths from randomly selected successful trials for Monkeys D (a) and F (b) for seven feedback conditions. Each reach begins at the fixed central starting point and ends within the unseen reach target (here depicted in gray for clarity).

Supplementary Figure 5 Additional analyses of initial angle.

(a) Standard deviation of initial angle (relative to target angle) for the different trial types and visual coherences. Plots follow the same conventions as Figure 6a in the main text. This panel demonstrates that the qualitative results of the main text—in particular the good correspondence with the minimum-variance model of sensory integration—are not an artifact of subtracting the smoothed estimates of mean initial angle used there. Error bars denote standard error of the mean. (b) Smoothed mean initial angle for Monkey F, relative to the target angle. Monkey F did not exhibit the marked differences in mean initial angle across feedback types that were observed for Monkey D (Figure 6b, main text). (c) Visual cue weighting for Monkey F in the VIS+ICMS trials, as a function of dot-field coherence. The results are consistent with the minimum-variance model; however, the analysis has poor statistical power owing to the similarity in mean initial angle across feedback types. Blue filled circles: visual cue weighting estimated from data; black unfilled circles: minimum-variance model prediction; error bars: bootstrapped estimates of standard error.
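The minimum-variance model prediction in panel (c) can be computed from the single-cue conditions alone: the predicted visual weight at each coherence depends only on the variances of the vision-only and ICMS-only initial angles. A sketch with hypothetical standard deviations (all numbers here are made up for illustration):

```python
import numpy as np

# Hypothetical single-cue standard deviations of initial reach angle (deg).
# Vision becomes less reliable as dot-field coherence drops; the ICMS cue
# does not depend on coherence.
coherence = np.array([1.0, 0.5, 0.25])
sd_vis = np.array([2.0, 4.0, 8.0])   # vision-only condition, per coherence
sd_icms = 5.0                        # ICMS-only condition

# Minimum-variance prediction for the visual cue weight at each coherence:
# w_vis = var_icms / (var_vis + var_icms)
w_vis = sd_icms**2 / (sd_vis**2 + sd_icms**2)
# Vision dominates at high coherence; its weight falls as coherence drops.
```

Comparing these predicted weights against weights fit from the combined-cue (VIS+ICMS) trials is the test of optimal integration reported in the figure.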

Supplementary information

Supplementary Text and Figures

Supplementary Figures 1–5 and Supplementary Tables 1–4 (PDF 1044 kb)

Supplementary Methods Checklist (PDF 347 kb)

Sample trial of monkey F reaching with VIS+ICMS feedback.

This video was generated from behavioral data collected on 29 December 2013. The left panel shows a simulated version of a portion of the virtual reality environment that the monkey viewed during the trial: position of the fingertip (filled white circle), dot field (here displayed at 100% coherence for clarity; the coherence presented to the monkey for this trial was 50%), and start target (open green circle, radius 10 mm). The right panel shows the patterns of ICMS delivered during the trial, where each vertical line denotes a pulse of stimulation from an electrode whose preferred direction is indicated by the corresponding red arrow at left. Stimulation rasters have been subsampled for clarity. In the video, the monkey first acquires the start target. After an instructed delay interval, during which VIS+ICMS information about the instructed movement vector becomes available, a go cue sounds (noted by text), and the monkey completes the reach to the unseen, 12-mm-radius reach target (illustrated here with a dashed white circle). (MP4 872 kb)

Sample trial of monkey F reaching with only ICMS feedback.

This video was generated from behavioral data collected on 29 December 2013. The left panel shows a simulated version of a portion of the virtual reality environment that the monkey viewed during the trial: position of the fingertip (filled white circle), dot field (here displayed at 100% coherence for clarity; the coherence presented to the monkey for this trial was 50%), and start target (open green circle, radius 10 mm). The right panel shows the patterns of ICMS delivered during the trial, where each vertical line denotes a pulse of stimulation from an electrode whose preferred direction is indicated by the corresponding red arrow at left. Stimulation rasters have been subsampled for clarity. In the video, the monkey first acquires the start target. After an instructed delay interval, during which ICMS information about the instructed movement vector becomes available, a go cue sounds (noted by text), and the monkey completes the reach to the unseen, 12-mm-radius reach target (illustrated here with a dashed white circle). (MP4 121 kb)


About this article


Cite this article

Dadarlat, M., O'Doherty, J. & Sabes, P. A learning-based approach to artificial sensory feedback leads to optimal integration. Nat Neurosci 18, 138–144 (2015). https://doi.org/10.1038/nn.3883
