

Learning human–environment interactions using conformal tactile textiles

A Publisher Correction to this article was published on 1 April 2021; the article has been updated.


Recording, modelling and understanding tactile interactions is important in the study of human behaviour and in the development of applications in healthcare and robotics. However, such studies remain challenging because existing wearable sensory interfaces are limited in terms of performance, flexibility, scalability and cost. Here, we report a textile-based tactile learning platform that can be used to record, monitor and learn human–environment interactions. The tactile textiles are created via digital machine knitting of inexpensive piezoresistive fibres, and can conform to arbitrary three-dimensional geometries. To ensure that our system is robust against variations in individual sensors, we use machine learning techniques for sensing correction and calibration. Using the platform, we capture diverse human–environment interactions (more than a million tactile frames) and show that the artificial-intelligence-powered sensing textiles can classify humans’ sitting poses, motions and other interactions with the environment. We also show that the platform can recover dynamic whole-body poses, reveal environmental spatial information and discover biomechanical signatures.
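The paper's sensing correction and calibration are learned with machine-learning techniques described in the full text. As a purely illustrative sketch of the underlying idea — mapping raw readings from a piezoresistive sensor matrix to normalized pressure values via per-taxel correction parameters — consider the following; the array size, parameter values and normalization are all assumptions, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical raw readout of a 32x32 piezoresistive sensor matrix (ADC counts).
raw = rng.uniform(100.0, 900.0, size=(32, 32))

# Per-taxel correction: each sensor gets its own scale and offset to compensate
# for fibre-to-fibre variation. In the paper these are learned; here they are
# placeholder values for illustration only.
scale = rng.uniform(0.8, 1.2, size=(32, 32))
offset = rng.uniform(-20.0, 20.0, size=(32, 32))

def calibrate(frame: np.ndarray, scale: np.ndarray, offset: np.ndarray) -> np.ndarray:
    """Apply per-taxel correction, then normalize the frame to [0, 1]."""
    corrected = (frame - offset) * scale
    lo, hi = corrected.min(), corrected.max()
    return (corrected - lo) / (hi - lo + 1e-9)

pressure = calibrate(raw, scale, offset)
```

Frames like `pressure` are what a downstream classifier (e.g. for sitting poses or motions) would consume, typically as a time series of such tactile images.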


Fig. 1: Textile-based tactile learning platform.
Fig. 2: Characterization of the functional fibre.
Fig. 3: Self-supervised correction.
Fig. 4: Learning on human–environment interactions.


Data availability

Data that support the findings of this study are available from the corresponding authors upon reasonable request. Source data are provided with this paper.

Code availability

The code used to generate the plots within this paper is available from the corresponding authors upon reasonable request.






Acknowledgements

This work is supported by the Toyota Research Institute. We thank L. Makatura, P. Kellnhofer, A. Kaspar and S. Sundaram for their helpful suggestions for this work. We also thank D. Rus for the use of the mechanical tester and J. L. McCann for providing us with the necessary code to programmatically work with our industrial knitting machine and visualize the knitting structure.

Author information

Authors and Affiliations



Contributions

Y. Luo, W.S. and M.F. developed and implemented the functional fibre fabrication set-up. Y. Luo and W.S. conceived and implemented the sensor design, and performed characterizations. Y. Luo and K.W. designed and fabricated full-body sensing textiles. Y. Luo, W.S., Y. Li, P.S., K.W. and B.L. conducted data collection. Y. Li conducted the self-supervised sensing correction. P.S. conducted the experiments on 3D pose prediction from the tactile footprints. Y. Li and P.S. implemented the classification framework. T.P., A.T. and W.M. supervised the work. All authors contributed to the study concept, conceived of the experimental methods, discussed and generated the results, and prepared the manuscript.

Corresponding authors

Correspondence to Yunzhu Li, Wan Shou, Antonio Torralba or Wojciech Matusik.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Peer review information Nature Electronics thanks Sihong Wang, Jun Chen and Lining Yao for their contribution to the peer review of this work.

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary Information

Supplementary Discussion, Tables 1 and 2 and Figs. 1–13.

Reporting Summary

Supplementary Video 1

Demonstration of sensor flexibility, stretchability and bendability.

Supplementary Video 2

Illustration of manual inlay and automatic inlay methods.

Supplementary Video 3

Data collection on a tactile glove for calibration and object classification.

Supplementary Video 4

Data collection on a tactile sock for calibration, action classification and full-body pose prediction.

Supplementary Video 5

Motion prediction result.

Supplementary Video 6

Data collection on a tactile vest for calibration and demonstration of applications.

Supplementary Video 7

Data collection on a tactile robot-arm (KUKA) sleeve for calibration and demonstration of applications.

Supplementary Video 8

Real-time classification of human pose with a tactile vest.

Source data

Source Data Fig. 2

Statistical source data.

Source Data Fig. 4

Statistical source data.

Rights and permissions

Reprints and permissions

About this article


Cite this article

Luo, Y., Li, Y., Sharma, P. et al. Learning human–environment interactions using conformal tactile textiles. Nat Electron 4, 193–201 (2021).

