A portable three-degrees-of-freedom force feedback origami robot for human–robot interactions


Haptic interfaces can recreate the experience of touch and are necessary to improve human–robot interactions. However, at present, haptic interfaces are either electromechanical devices that elicit very limited touch sensations or devices that provide more comprehensive kinesthetic cues at the cost of a large volume: there is a clear trade-off between the richness of feedback and device size. The design and manufacturing challenges of creating complex touch sensations from compact platforms remain unsolved. To overcome the physical limitations of miniaturizing force feedback robots, we adapted origami design principles to achieve portability, accuracy and scalable manufacturing. The result is Foldaway, a foldable origami robot that renders three-degrees-of-freedom force feedback in a platform compact enough to fit in a pocket. The platform can track the movement of the user’s fingers, apply forces of up to 2 newtons and render stiffness of up to 1.2 newtons per millimetre. We demonstrate the broad applicability of Foldaway prototypes in three human–machine interactions: a portable interface for the haptic exploration of an anatomy atlas; a handheld joystick for interacting with virtual objects; and a bimanual controller for intuitive and safe teleoperation of drones.
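The stiffness rendering described above can be illustrated with a minimal impedance-type control sketch. This is not the authors' code: it assumes a standard spring-law controller (force proportional to penetration into a virtual surface, saturated at the actuator limit), with the stiffness and force values taken from the specifications reported in the abstract.

```python
# Illustrative sketch (assumed spring-law controller, not the authors' implementation).
# A kinesthetic interface such as Foldaway tracks the finger position and
# commands a feedback force F = k * penetration, clipped to the actuator limit.
# The constants below come from the article's reported specifications.

K_STIFFNESS = 1.2   # N/mm, maximum stiffness the platform can render
F_MAX = 2.0         # N, maximum force the platform can apply

def render_stiffness(finger_pos_mm: float, surface_pos_mm: float) -> float:
    """Return the commanded feedback force (N) against a virtual surface.

    Positive penetration means the finger has pressed past the surface.
    """
    penetration = finger_pos_mm - surface_pos_mm
    if penetration <= 0.0:
        return 0.0                       # no contact, no force
    force = K_STIFFNESS * penetration    # Hooke's law spring
    return min(force, F_MAX)             # saturate at the actuator limit

# Example: a finger pressing 1 mm into the virtual surface
print(render_stiffness(11.0, 10.0))  # -> 1.2
# Deep penetration is capped at the 2 N actuator limit
print(render_stiffness(15.0, 10.0))  # -> 2.0
```

In a real device this function would run inside a high-rate control loop (typically around 1 kHz for kinesthetic haptics) between position sensing and motor current commands.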


Fig. 1: Foldaway haptic interfaces.
Fig. 2: Design of Foldaway haptic interfaces.
Fig. 3: Manufacture of Foldaway-Pushbutton origami mechanism.
Fig. 4: Result of force rendering and bandwidth tests of Foldaway prototypes.
Fig. 5: Haptic interface for exploration of images with haptic content.
Fig. 6: Pushbutton for interaction with objects in VR.
Fig. 7: A twin Pushbutton controller with force feedback for drone piloting.

Data availability

The data that support the findings of this study are available from the corresponding author upon reasonable request. The data collected during the experiments with users can be downloaded from http://doi.org/10.5281/zenodo.2647727.

Code availability

The code that supports the findings of this study is available from the corresponding author upon reasonable request.




We thank F. Conti for the suggestions and feedback on Foldaway applications. This study was supported by the SNSF through the National Center of Competence in Research Robotics, and through the Swiss National Science Foundation and Innosuisse Bridge grant number 20B1-1_181840.

Author information

S.M. and M.S. developed the concept. S.M., M.S. and S.S. fabricated the experimental samples of the interfaces. M.S., A.C. and S.S. developed the electronics and the low- and high-level control software for the interfaces. S.M. and M.S. designed the experiments. M.S., A.C. and S.S. performed the characterization of the interfaces. A.C. and S.M. collected the user data. S.M., M.S. and J.P. wrote the manuscript.

Correspondence to Stefano Mintchev.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary notes, figures and references.

Reporting Summary

Supplementary Video 1

A portable 3-DoF force feedback origami robot for human–robot interactions.

Supplementary Video 2

Haptic interface for exploration of images with haptic content.

Supplementary Video 3

Pushbutton for interaction with objects in VR.

Supplementary Video 4

A twin Pushbutton controller with force feedback for drone piloting.


About this article


Cite this article

Mintchev, S., Salerno, M., Cherpillod, A. et al. A portable three-degrees-of-freedom force feedback origami robot for human–robot interactions. Nat Mach Intell 1, 584–593 (2019). https://doi.org/10.1038/s42256-019-0125-1
