

Active mechanical haptics with high-fidelity perceptions for immersive virtual reality


Human-centred mechanical sensory perceptions enable us to immerse ourselves in the physical environment by actively touching or holding objects so that we may feel their existence (that is, ownership) and their fundamental properties (for example, stiffness or hardness). In a virtual environment, the replication of these active perceptions can create authentic haptic experiences, serving as an essential supplement to visual and auditory experiences. We present here a first-person, human-triggered haptic device enabled by curved origami that allows humans to actively experience touching objects with various stiffness perceptions, from soft to hard and from positive to negative ranges. This device represents a substantial shift away from the third-person, machine-triggered and passive haptics currently in use. The device is synchronized with the virtual environment by changing its configuration to adapt to various interactions, emulating body-centred physical perceptions including hardness, softness and sensations of crushing and weightlessness. Quantitative evaluations demonstrate that the active haptic device creates a highly immersive virtual environment, outperforming existing vibration-based passive devices. These concepts and the resulting technologies create new opportunities and application potential for a more authentic virtual world.


Fig. 1: Active stiffness perceptions in life scenarios and their reconstruction for immersive VR.
Fig. 2: Working principle using curved origami to tune stiffness.
Fig. 3: Closed-loop haptics based on curved origami.
Fig. 4: Haptic in-hand device and evaluation of its sensory perceptions.
Fig. 5: Haptic stepping device and evaluation of its sensory perceptions.


Data availability

All data needed to evaluate the conclusions in the paper are present in the Article and/or the Supplementary Information. The data collected during the experiments with the volunteers are available via Zenodo (ref. 51).

Code availability

The code that supports the active mechanical haptic system within this paper and the other findings of this study is available via Zenodo (ref. 51).


References

  1. Sparkes, M. What is a metaverse? New Sci. 251, 18 (2021).

  2. Ackerman, J. M., Nocera, C. C. & Bargh, J. A. Incidental haptic sensations influence social judgments and decisions. Science 328, 1712–1715 (2010).

  3. Yu, X. et al. Skin-integrated wireless haptic interfaces for virtual and augmented reality. Nature 575, 473–479 (2019).

  4. Biswas, S. & Visell, Y. Haptic perception, mechanics, and material technologies for virtual reality. Adv. Funct. Mater. 31, 2008186 (2021).

  5. Pyun, K. R., Rogers, J. A. & Ko, S. H. Materials and devices for immersive virtual reality. Nat. Rev. Mater. 7, 841–843 (2022).

  6. Li, S., Bai, H., Shepherd, R. F. & Zhao, H. Bio-inspired design and additive manufacturing of soft materials, machines, robots, and haptic interfaces. Angew. Chem. Int. Ed. 58, 11182–11204 (2019).

  7. Jung, Y. H., Kim, J.-H. & Rogers, J. A. Skin-integrated vibrohaptic interfaces for virtual and augmented reality. Adv. Funct. Mater. 31, 2008805 (2021).

  8. Zhu, M. L. et al. Haptic-feedback smart glove as a creative human–machine interface (HMI) for virtual/augmented reality applications. Sci. Adv. 6, eaaz8693 (2020).

  9. Liu, Y. et al. Electronic skin as wireless human–machine interfaces for robotic VR. Sci. Adv. 8, eabl6700 (2022).

  10. Jung, Y. H. et al. A wireless haptic interface for programmable patterns of touch across large areas of the skin. Nat. Electron. 5, 374–385 (2022).

  11. Sun, Z., Zhu, M., Shan, X. & Lee, C. Augmented tactile-perception and haptic-feedback rings as human–machine interfaces aiming for immersive interactions. Nat. Commun. 13, 5224 (2022).

  12. Kim, D. et al. Actuating compact wearable augmented reality devices by multifunctional artificial muscle. Nat. Commun. 13, 4155 (2022).

  13. Choi, I., Ofek, E., Benko, H., Sinclair, M. & Holz, C. CLAW: a multifunctional handheld haptic controller for grasping, touching, and triggering in virtual reality. In CHI '18: Proc. 2018 CHI Conference on Human Factors in Computing Systems 654 (Association for Computing Machinery, 2018).

  14. Scheggi, S., Meli, L., Pacchierotti, C. & Prattichizzo, D. Touch the virtual reality: using the leap motion controller for hand tracking and wearable tactile devices for immersive haptic rendering. In ACM SIGGRAPH 2015 Posters 31 (Association for Computing Machinery, 2015).

  15. Giraud, F. H., Joshi, S. & Paik, J. Haptigami: a fingertip haptic interface with vibrotactile and 3-DoF cutaneous force feedback. IEEE Trans. Haptics 15, 131–141 (2022).

  16. Pezent, E., Agarwal, P., Hartcher-O’Brien, J., Colonnese, N. & O’Malley, M. K. Design, control, and psychophysics of Tasbi: a force-controlled multimodal haptic bracelet. IEEE Trans. Robot. 38, 2962–2978 (2022).

  17. Yao, K. M. et al. Encoding of tactile information in hand via skin-integrated wireless haptic interface. Nat. Mach. Intell. 4, 893–903 (2022).

  18. Leroy, E., Hinchet, R. & Shea, H. Multimode hydraulically amplified electrostatic actuators for wearable haptics. Adv. Mater. 32, 2002564 (2020).

  19. Chinello, F., Pacchierotti, C., Malvezzi, M. & Prattichizzo, D. A three revolute–revolute–spherical wearable fingertip cutaneous device for stiffness rendering. IEEE Trans. Haptics 11, 39–50 (2018).

  20. Xu, H., Peshkin, M. A. & Colgate, J. E. UltraShiver: lateral force feedback on a bare fingertip via ultrasonic oscillation and electroadhesion. IEEE Trans. Haptics 12, 497–507 (2019).

  21. Steed, A., Ofek, E., Sinclair, M. & Gonzalez-Franco, M. A mechatronic shape display based on auxetic materials. Nat. Commun. 12, 4758 (2021).

  22. Peng, Y. et al. Elastohydrodynamic friction of robotic and human fingers on soft micropatterned substrates. Nat. Mater. 20, 1707–1711 (2021).

  23. Choi, C. et al. Surface haptic rendering of virtual shapes through change in surface temperature. Sci. Robot. 7, eabl4543 (2022).

  24. Li, X. et al. Nanotexture shape and surface energy impact on electroadhesive human–machine interface performance. Adv. Mater. 33, 2008337 (2021).

  25. Mintchev, S., Salerno, M., Cherpillod, A., Scaduto, S. & Paik, J. A portable three-degrees-of-freedom force feedback origami robot for human–robot interactions. Nat. Mach. Intell. 1, 584–593 (2019).

  26. Pacchierotti, C. et al. Wearable haptic systems for the fingertip and the hand: taxonomy, review, and perspectives. IEEE Trans. Haptics 10, 580–600 (2017).

  27. Xiong, Q. et al. So-EAGlove: VR haptic glove rendering softness sensation with force-tunable electrostatic adhesive brakes. IEEE Trans. Robot. 38, 3450–3462 (2022).

  28. Giri, G. S., Maddahi, Y. & Zareinia, K. An application-based review of haptics technology. Robotics 10, 29 (2021).

  29. Choi, I., Hawkes, E. W., Christensen, D. L., Ploch, C. J. & Follmer, S. Wolverine: a wearable haptic interface for grasping in virtual reality. In 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2016) 986–993 (IEEE, 2016).

  30. Martinez, M. O. et al. Open source, modular, customizable, 3-D printed kinesthetic haptic devices. In 2017 IEEE World Haptics Conference (WHC) 142–147 (IEEE, 2017).

  31. Jadhav, S., Majit, M. R. A., Shih, B., Schulze, J. P. & Tolley, M. T. Variable stiffness devices using fiber jamming for application in soft robotics and wearable haptics. Soft Robot. 9, 173–186 (2022).

  32. Ernst, M. O. & Banks, M. S. Humans integrate visual and haptic information in a statistically optimal fashion. Nature 415, 429–433 (2002).

  33. Moscatelli, A. et al. Touch as an auxiliary proprioceptive cue for movement control. Sci. Adv. 5, eaaw3121 (2019).

  34. Kieliba, P., Clode, D., Maimon-Mor, R. O. & Makin, T. R. Robotic hand augmentation drives changes in neural body representation. Sci. Robot. 6, eabd7935 (2021).

  35. Peck, J. & Shu, S. B. The effect of mere touch on perceived ownership. J. Consum. Res. 36, 434–447 (2009).

  36. Brasel, S. A. & Gips, J. Tablets, touchscreens, and touchpads: how varying touch interfaces trigger psychological ownership and endowment. J. Consum. Psychol. 24, 226–233 (2014).

  37. Felton, S. Origami for the everyday. Nat. Mach. Intell. 1, 555–556 (2019).

  38. Rus, D. & Tolley, M. T. Design, fabrication and control of origami robots. Nat. Rev. Mater. 3, 101–112 (2018).

  39. Hawkes, E. et al. Programmable matter by folding. Proc. Natl Acad. Sci. USA 107, 12441–12445 (2010).

  40. Zhai, Z., Wang, Y. & Jiang, H. Origami-inspired, on-demand deployable and collapsible mechanical metamaterials with tunable stiffness. Proc. Natl Acad. Sci. USA 115, 2032–2037 (2018).

  41. Melancon, D., Gorissen, B., Garcia-Mora, C. J., Hoberman, C. & Bertoldi, K. Multistable inflatable origami structures at the metre scale. Nature 592, 545–550 (2021).

  42. Felton, S., Tolley, M., Demaine, E., Rus, D. & Wood, R. A method for building self-folding machines. Science 345, 644–646 (2014).

  43. Faber, J. A., Arrieta, A. F. & Studart, A. R. Bioinspired spring origami. Science 359, 1386–1391 (2018).

  44. Wu, S. et al. Stretchable origami robotic arm with omnidirectional bending and twisting. Proc. Natl Acad. Sci. USA 118, e2110023118 (2021).

  45. Zhai, Z., Wang, Y., Lin, K., Wu, L. & Jiang, H. In situ stiffness manipulation using elegant curved origami. Sci. Adv. 6, eabe2000 (2020).

  46. Su, H. et al. Physical human–robot interaction for clinical care in infectious environments. Nat. Mach. Intell. 3, 184–186 (2021).

  47. Li, G. et al. Self-powered soft robot in the Mariana Trench. Nature 591, 66–71 (2021).

  48. Panzirsch, M. et al. Exploring planet geology through force-feedback telemanipulation from orbit. Sci. Robot. 7, eabl6307 (2022).

  49. Liu, K., Pratapa, P. P., Misseroni, D., Tachi, T. & Paulino, G. H. Triclinic metamaterials by tristable origami with reprogrammable frustration. Adv. Mater. 34, 2107998 (2022).

  50. Song, Z. et al. Origami lithium-ion batteries. Nat. Commun. 5, 3140 (2014).

  51. Zhang, Z. et al. Active mechanical haptics with high-fidelity perceptions for immersive virtual reality. Zenodo (2023).



Acknowledgements

We thank the Research Center for Industries of the Future (RCIF) at Westlake University and the Westlake Education Foundation for supporting this work. Z. Zhang acknowledges support from the National Natural Science Foundation of China (grant 52205031). Y.W. acknowledges support from the National Natural Science Foundation of China (grants 11872328 and 12132013). L.K. acknowledges support from the Key Project of Zhejiang Lab (G2021NB0AL03) and the National Natural Science Foundation of China (grant 52205034). We also thank Y. Jiang for helping with the human experiments and data analysis, and Y. Huang for helping with the design of the electronic and control systems.

Author information

Authors and Affiliations



Contributions

Z. Zhang, Z.X. and H.J. developed the concept. Z. Zhang, Z.X., L.E. and P.W. designed and prototyped the devices. Z. Zhang and L.E. developed the electronics, the control system and the software. Z. Zhang, Z.X., L.E., P.W., S.C., Z. Zhai, Y.W. and H.J. carried out the experiments and analysis. Z. Zhang, Z.X., L.E. and L.K. collected the user data. Z. Zhang, Z.X. and H.J. wrote the manuscript.

Corresponding author

Correspondence to Hanqing Jiang.

Ethics declarations

Competing interests

The authors declare no competing interests.

Peer review

Peer review information

Nature Machine Intelligence thanks Xinge Yu and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Extended data

Extended Data Fig. 1 Force–displacement relationships of curved origami modules with different angles α.

a–c, Normalized force–displacement relationships of curved origami modules with different initial folding angles β, for angles α = 50°, 60° and 70°. d–f, Normalized force–displacement relationships of curved origami modules with a fixed initial folding angle β = 120° and different control angles, for angles α = 50°, 60° and 70°.

Extended Data Fig. 2 Fabrication of the curved origami.

a, Fabrication of curved origami from plastic or metal materials: step a1, cut sheets according to the 2D curved origami pattern using an engraving machine; step a2, peel the origami patterns off the substrate; step a3, manually fold the 2D pattern along the curved crease to form the 3D configuration with bending panels; step a4, thread a cable through two holes in the panels; step a5, knot the cable behind one side of the origami panel to form a motion constraint; step a6, pull and release the cable to control the folding of the curved origami. b, Fabrication of curved origami with a silver nanowire (AgNW)-coated layer for sensing: step b1, prepare the substrate and the AgNW suspension; step b2, pour the AgNW suspension onto the surface of the substrate; step b3, dry the suspension in an oven for 6 hours to obtain the AgNW-coated substrate; steps b4–b9, the same fabrication process as steps a1–a6.

Extended Data Fig. 3 SEM images of the AgNWs-coated origami.

SEM images of AgNWs coated on the surface of a PET film in the initial state (a) and under outward (b) and inward (c) bending, at low (top) and high (bottom) magnification. The scale bars are 2 μm (yellow) and 1 μm (red), respectively. Compared with the initial state, outward bending shows a loosely packed structure whereas inward bending shows a densely packed structure, indicating that the resistance increases for outward bending and decreases for inward bending.
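The reported resistance trend implies a simple readout rule for the sensing layer; a minimal, hypothetical sketch (the function, the 2% deadband and the resistance values are illustrative assumptions, not part of the paper's sensing pipeline):

```python
def bending_direction(r_now: float, r_rest: float, tol: float = 0.02) -> str:
    """Classify panel bending from the relative resistance change of the AgNW layer.

    tol is an assumed noise deadband (2% of the rest resistance).
    """
    delta = (r_now - r_rest) / r_rest
    if delta > tol:
        return "outward"   # loosely packed nanowires -> higher resistance
    if delta < -tol:
        return "inward"    # densely packed nanowires -> lower resistance
    return "flat"

print(bending_direction(105.0, 100.0))  # prints "outward"
```

In a closed-loop haptic system, such a rule would let the controller infer the folding state of each origami panel from a single resistance measurement per panel.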

Extended Data Fig. 4 Mechanical structure of the in-hand haptic device.

a, Exploded-view schematic illustration of the spherical in-hand device with five synchronously controlled origami buttons. The slide guide confines the compression range (10 mm) of the origami button. The universal joints transmit the rotations about the vertical axis between the top and bottom plates of the curved origami modules. b, Schematic of the actuation system. The cable is actuated by an on-board micromotor through worm gears. A tension roller pre-tensions the cable to avoid slipping between the cable and the rollers. c, Synchronous actuation of the origami buttons. A SI-MO (single-input–multiple-output) actuation strategy is constructed through cable routing. The red arrow denotes the driving roller, and the blue arrows denote the follow-up rollers.

Extended Data Fig. 5 Stiffness manipulation of the in-hand device.

a, Synchronous control of the folding angles of the curved origami buttons. b–e, Optical images and the corresponding force–displacement relationships of a curved origami button with folding angle β = 120° under different control angles β = 0°, 30°, 60° and 90°.

Extended Data Fig. 6 Recorded EMG of the upper limb.

a–d, Raw EMG data with a sampling frequency of 2,000 Hz and the corresponding RMS values with a sampling interval of 0.25 s. The data were recorded while users grasped four objects of different stiffness under four scenarios: grasping real objects (a), grasping the present haptic device (b), grasping virtual objects through hand gestures (c) and grasping a conventional joystick (d).
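The RMS processing described in this caption amounts to windowed averaging of the raw signal; a minimal sketch, assuming non-overlapping 0.25 s windows over the 2,000 Hz trace (the function name and windowing scheme are assumptions, not the authors' released code — see ref. 51 for that):

```python
import numpy as np

FS = 2000          # EMG sampling frequency (Hz), as stated in the caption
WINDOW_S = 0.25    # RMS window length (s), as stated in the caption

def emg_rms(raw: np.ndarray, fs: int = FS, window_s: float = WINDOW_S) -> np.ndarray:
    """Windowed RMS envelope of a raw EMG trace.

    Splits the signal into non-overlapping windows of window_s seconds
    and returns one RMS value per window.
    """
    win = int(fs * window_s)        # samples per window (500 here)
    n = (len(raw) // win) * win     # drop the trailing partial window
    frames = raw[:n].reshape(-1, win)
    return np.sqrt(np.mean(frames ** 2, axis=1))

# Example: 2 s of synthetic EMG-like noise -> 8 RMS values
rms = emg_rms(np.random.randn(2 * FS))
print(rms.shape)  # prints (8,)
```

The RMS envelope suppresses the zero-mean oscillation of raw EMG and leaves a slowly varying amplitude that tracks muscle activation, which is what the comparison across the four grasping scenarios relies on.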

Extended Data Fig. 7 Mechanical structure of the stepping haptic device.

a, Exploded-view schematic illustration of the stepping device with a synchronously controlled curved origami tessellation. b, Schematic of the multihead worm-gear transmission. The cable is actuated by a motor through worm gears. Four gears are synchronously driven by one four-head worm, so that four cables are synchronously pulled or released by the motor. c, Schematic of the multiknotted cable-driven transmission. Five knots are evenly spaced on each cable, so that the folding of five origami modules is synchronously controlled by one cable. The red arrow denotes the input, and the blue arrows denote the follow-ups. d, Synchronous actuation of the origami tessellation.

Extended Data Fig. 8 Snapshots of the immersive whole-body haptic experiences on the stepping device with various virtual scenarios.

a, The user experienced low-value positive stiffness in the grassland scenario, with their feet readily sinking into the virtual ground. b, The user experienced negative stiffness in the icy-surface scenario, with the whole body remaining still under a light active step but undergoing a real fall under a hard active step. c, The user experienced high-value positive stiffness in the rigid-avenue scenario, with the whole body remaining still upon active stepping.

Extended Data Fig. 9 Stiffness manipulation of the stepping device.

a, Synchronous control of the folding angles of the curved origami tessellation. b–d, Optical images of the stepping processes and the corresponding force–displacement relationships of a curved origami module inside the stepping device with folding angle β = 120° under different control angles β = 0°, 35° and 60°. Solid lines are measured results and dashed lines denote theoretical ones.

Extended Data Fig. 10 Recorded EMG of the lower limb.

a–c, Raw EMG data with a sampling frequency of 2,000 Hz and the corresponding RMS values with a sampling interval of 0.25 s. The data were recorded while users stepped on the stepping device with three different stiffness settings: high-value positive stiffness simulating a rigid avenue (a), negative stiffness simulating an icy surface (b) and low-value positive stiffness simulating soft grassland (c).

Supplementary information

Supplementary Information

Supplementary Notes 1–3 and Figs. 1–10.

Reporting Summary

Supplementary Video 1

Active mechanical haptics with the in-hand device.

Supplementary Video 2

Stiffness manipulation of the in-hand device.

Supplementary Video 3

Active mechanical haptics with the stepping device.

Supplementary Video 4

Stiffness manipulation of the stepping device.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Zhang, Z., Xu, Z., Emu, L. et al. Active mechanical haptics with high-fidelity perceptions for immersive virtual reality. Nat. Mach. Intell. 5, 643–655 (2023).


