
  • Resource

Ouvrai opens access to remote virtual reality studies of human behavioural neuroscience


Modern virtual reality (VR) devices record six-degree-of-freedom kinematic data with high spatial and temporal resolution and display high-resolution stereoscopic three-dimensional graphics. These capabilities make VR a powerful tool for many types of behavioural research, including studies of sensorimotor, perceptual and cognitive functions. Here we introduce Ouvrai, an open-source solution that facilitates the design and execution of remote VR studies, capitalizing on the surge in VR headset ownership. This tool allows researchers to develop sophisticated experiments using cutting-edge web technologies such as WebXR to enable browser-based VR, without compromising on experimental design. Ouvrai’s features include easy installation, intuitive JavaScript templates, a component library managing front- and backend processes and a streamlined workflow. It integrates with Firebase, Prolific and Amazon Mechanical Turk and provides data processing utilities for analysis. Unlike other tools, Ouvrai remains free, with researchers managing their web hosting and cloud database via personal Firebase accounts. Ouvrai is not limited to VR studies; researchers can also develop and run desktop or touchscreen studies using the same streamlined workflow. Through three distinct motor learning experiments, we confirm Ouvrai’s efficiency and viability for conducting remote VR studies.


Fig. 1: A schematic of Ouvrai infrastructure.
Fig. 2: A portion of the FSM for experiment 1, represented as a flow chart.
Fig. 3: Experiment 1: spontaneous recovery.
Fig. 4: Experiment 2: dual adaptation with control points.
Fig. 5: Experiment 3: 3D generalization of visuomotor learning.
Fig. 6: Additional features of Ouvrai.
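Fig. 2 represents the experiment logic as a finite-state machine (FSM). As a minimal illustration of the idea only (the class, state names and logging scheme below are hypothetical, not Ouvrai's actual API), a trial-loop FSM in JavaScript might look like:

```javascript
// Minimal finite-state machine sketch for a reaching trial.
// State names (START, REACH, RETURN, FINISH) are illustrative only,
// not the states used by Ouvrai's templates.
class TrialStateMachine {
  constructor() {
    this.state = 'START';
    // Allowed transitions out of each state.
    this.transitions = {
      START: ['REACH'],
      REACH: ['RETURN'],
      RETURN: ['FINISH'],
      FINISH: [],
    };
    this.log = []; // record of visited states, saved with the trial data
  }

  next(state) {
    if (!this.transitions[this.state].includes(state)) {
      throw new Error(`Illegal transition ${this.state} -> ${state}`);
    }
    this.log.push(state);
    this.state = state;
  }
}

// One pass through a trial:
const fsm = new TrialStateMachine();
fsm.next('REACH');
fsm.next('RETURN');
fsm.next('FINISH');
console.log(fsm.log.join(' > ')); // REACH > RETURN > FINISH
```

Making illegal transitions throw, rather than silently proceed, is one way such a design can catch experiment-logic bugs during development.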


Data availability

Raw data in JSON format, preprocessed data in XLSX format and Python analysis code are available on GitHub for experiment 1, experiment 2 and experiment 3. Analysis code relies on the Python data analysis utilities that come with the Ouvrai source code on GitHub.
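The raw data are stored as JSON, with kinematic samples recorded per trial. As a hedged sketch of working with such files (the field names here are invented for illustration and are not the actual Ouvrai schema), extracting a simple kinematic summary from one trial record might look like:

```javascript
// Parse a hypothetical trial record and compute peak hand speed
// from 3D position samples. Field names (posn, t) are illustrative only.
const record = JSON.parse(`{
  "trialNumber": 1,
  "posn": [[0, 0, 0], [0.01, 0, 0], [0.03, 0, 0], [0.06, 0, 0]],
  "t": [0, 10, 20, 30]
}`);

function peakSpeed(posn, t) {
  let peak = 0;
  for (let i = 1; i < posn.length; i++) {
    const dt = (t[i] - t[i - 1]) / 1000; // timestamps in ms -> s
    const [dx, dy, dz] = posn[i].map((v, k) => v - posn[i - 1][k]);
    const speed = Math.hypot(dx, dy, dz) / dt; // m/s
    peak = Math.max(peak, speed);
  }
  return peak;
}

console.log(peakSpeed(record.posn, record.t)); // peak speed in m/s (≈3 here)
```

In practice, the Python utilities distributed with Ouvrai handle this kind of preprocessing over whole datasets; the sketch above only shows the shape of the computation on a single trial.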

Code availability

The production version of the study code, as well as Python analysis code in Jupyter notebooks, is available at the following links: experiment 1, experiment 2 and experiment 3. The complete source code of Ouvrai is available on GitHub; the repository will be updated as new features are developed. Previous releases can be accessed via the commit history of the main branch.




Acknowledgements

This work was supported by the National Institutes of Health (R01NS117699 to D.M.W.) and the Air Force Office of Scientific Research (award FA9550-22-1-0337 to D.M.W.). We thank Z. Zhang and A. Löffler for helpful discussions and for testing Ouvrai. The funders had no role in study design, data collection and analysis, decision to publish or preparation of the manuscript.

Author information

Authors and Affiliations



Contributions

E.C., J.N.I. and D.M.W. conceived Ouvrai as a software toolbox for developing crowdsourced experiments on motor control and learning. E.C. created and developed Ouvrai, with assistance from S.S. and J.N.I. All authors conceived the design of experiments 1–3. E.C. implemented, conducted and analysed experiments 1–3. All authors provided feedback on the presentation of the results, and wrote and revised the manuscript.

Corresponding author

Correspondence to Evan Cesanek.

Ethics declarations

Competing interests

D.M.W. is a consultant to CTRL-Labs Inc., in the Reality Labs Division of Meta. This entity did not support or influence this work. The authors declare no other competing interests.

Peer review

Peer review information

Nature Human Behaviour thanks Gavin Buckingham, Marta Russo, and the other, anonymous, reviewer(s) for their contribution to the peer review of this work. Peer reviewer reports are available.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary Information

Supplementary Figs. 1–2, Firebase Realtime Database and FSM.

Reporting Summary

Peer Review File

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article


Cite this article

Cesanek, E., Shivkumar, S., Ingram, J.N. et al. Ouvrai opens access to remote virtual reality studies of human behavioural neuroscience. Nat Hum Behav (2024).

