Abstract
Recording, modelling and understanding tactile interactions is important in the study of human behaviour and in the development of applications in healthcare and robotics. However, such studies remain challenging because existing wearable sensory interfaces are limited in terms of performance, flexibility, scalability and cost. Here, we report a textile-based tactile learning platform that can be used to record, monitor and learn human–environment interactions. The tactile textiles are created via digital machine knitting of inexpensive piezoresistive fibres, and can conform to arbitrary three-dimensional geometries. To ensure that our system is robust against variations in individual sensors, we use machine learning techniques for sensing correction and calibration. Using the platform, we capture diverse human–environment interactions (more than a million tactile frames) and show that the artificial-intelligence-powered sensing textiles can classify humans’ sitting poses, motions and other interactions with the environment. We also show that the platform can recover dynamic whole-body poses, reveal environmental spatial information and discover biomechanical signatures.
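The abstract notes that machine learning is used for sensing correction and calibration to make the system robust against variation between individual piezoresistive sensors. The paper's actual approach is self-supervised; as a much simpler, hypothetical illustration of the underlying calibration idea, the sketch below fits an independent linear correction (gain and offset) for each sensor in an array against a shared reference pressure sweep. The function name, the linear sensor model and the synthetic data are assumptions for illustration, not the authors' method.

```python
import numpy as np

def fit_per_sensor_calibration(raw, reference):
    """Fit a per-sensor linear map so that gain * raw + offset
    approximates the shared reference signal.

    raw:       (T, N) raw readings from N piezoresistive sensors
    reference: (T,) known applied pressure at each of T time steps
    Returns (gain, offset), each of shape (N,).
    """
    T, N = raw.shape
    gains, offsets = np.empty(N), np.empty(N)
    for i in range(N):
        # Least-squares fit of: reference ~ gain * raw[:, i] + offset
        A = np.stack([raw[:, i], np.ones(T)], axis=1)
        (g, b), *_ = np.linalg.lstsq(A, reference, rcond=None)
        gains[i], offsets[i] = g, b
    return gains, offsets

# Synthetic example: 4 sensors with different (unknown) gains and
# offsets observing the same pressure ramp, plus mild noise.
rng = np.random.default_rng(0)
pressure = np.linspace(0.0, 10.0, 200)
true_gain = rng.uniform(0.5, 2.0, size=4)
true_offset = rng.uniform(-1.0, 1.0, size=4)
raw = (pressure[:, None] - true_offset) / true_gain  # inverse sensor model
raw += rng.normal(scale=0.01, size=raw.shape)

gain, offset = fit_per_sensor_calibration(raw, pressure)
corrected = raw * gain + offset
# After calibration, all sensors should closely track the reference.
print("max calibration error:", np.abs(corrected - pressure[:, None]).max())
```

In the paper's setting no ground-truth reference is available during use, which is why a learned, self-supervised correction is needed; this sketch only shows the per-sensor variation problem that such a correction addresses.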
Data availability
Data that support the findings of this study are available from the corresponding authors upon reasonable request. Source data are provided with this paper.
Code availability
The code used to generate the plots within this paper is available from the corresponding authors upon reasonable request.
Change history
01 April 2021
A Correction to this paper has been published: https://doi.org/10.1038/s41928-021-00572-2
Acknowledgements
This work is supported by the Toyota Research Institute. We thank L. Makatura, P. Kellnhofer, A. Kaspar and S. Sundaram for their helpful suggestions for this work. We also thank D. Rus for the use of the mechanical tester and J. L. McCann for providing us with the necessary code to programmatically work with our industrial knitting machine and visualize the knitting structure.
Author information
Authors and Affiliations
Contributions
Y. Luo, W.S. and M.F. developed and implemented the functional fibre fabrication set-up. Y. Luo and W.S. conceived and implemented the sensor design, and performed characterizations. Y. Luo and K.W. designed and fabricated full-body sensing textiles. Y. Luo, W.S., Y. Li, P.S., K.W. and B.L. conducted data collection. Y. Li conducted the self-supervised sensing correction. P.S. conducted the experiments on 3D pose prediction from the tactile footprints. Y. Li and P.S. implemented the classification framework. T.P., A.T. and W.M. supervised the work. All authors contributed to the study concept, conceived of the experimental methods, discussed and generated the results, and prepared the manuscript.
Corresponding authors
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Peer review information Nature Electronics thanks Sihong Wang, Jun Chen and Lining Yao for their contribution to the peer review of this work.
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary information
Supplementary Information
Supplementary Discussion, Tables 1 and 2 and Figs. 1–13.
Supplementary Video 1
Demonstration of sensor flexibility, stretchability and bendability.
Supplementary Video 2
Illustration of manual inlay and automatic inlay methods.
Supplementary Video 3
Data collection on a tactile glove for calibration and object classification.
Supplementary Video 4
Data collection on a tactile sock for calibration, action classification and full-body pose prediction.
Supplementary Video 5
Motion prediction result.
Supplementary Video 6
Data collection on a tactile vest for calibration and demonstration on applications.
Supplementary Video 7
Data collection on a tactile robot arm (KUKA) sleeve for calibration and demonstration on applications.
Supplementary Video 8
Real-time classification of human pose with a tactile vest.
Source data
Source Data Fig. 2
Statistical source data.
Source Data Fig. 4
Statistical source data.
Cite this article
Luo, Y., Li, Y., Sharma, P. et al. Learning human–environment interactions using conformal tactile textiles. Nat. Electron. 4, 193–201 (2021). https://doi.org/10.1038/s41928-021-00558-0