Sign-to-speech translation using machine-learning-assisted stretchable sensor arrays

Abstract

Signed languages are not as pervasive a conversational medium as spoken languages due to the history of institutional suppression of the former and the linguistic hegemony of the latter. This has led to a communication barrier between signers and non-signers that could be mitigated by technology-mediated approaches. Here, we show that a wearable sign-to-speech translation system, assisted by machine learning, can accurately translate the hand gestures of American Sign Language into speech. The system is composed of yarn-based stretchable sensor arrays and a wireless printed circuit board, and offers high sensitivity and a fast response time, enabling real-time translation of signs into spoken words. By analysing 660 acquired sign language hand gesture recognition patterns, we demonstrate a recognition rate of up to 98.63% and a recognition time of less than 1 s.
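To make the recognition step concrete, the sketch below shows one way per-gesture feature vectors from a stretchable sensor array could be classified with a standard machine-learning pipeline. This is an illustration only, not the authors' code: the array shapes, the 660-sample placeholder dataset, and the choice of a support-vector classifier are assumptions made for the example.

```python
# Illustrative sketch only (not the authors' implementation): classifying
# per-gesture feature vectors from a stretchable sensor array.
# X and y are placeholders standing in for features extracted from the
# yarn-based sensor signals and their gesture labels.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_samples, n_channels, n_features = 660, 5, 20      # e.g. 5 finger sensors (assumed)
X = rng.normal(size=(n_samples, n_channels * n_features))  # placeholder features
y = rng.integers(0, 11, size=n_samples)                     # placeholder gesture labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

# Standardize each feature, then fit a support-vector classifier.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.3f}")
```

In a real deployment the classifier output would be mapped to a word or phrase and passed to a text-to-speech stage to produce the spoken translation.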


Fig. 1: Schematic illustrations of the wearable sign-to-speech translation system.
Fig. 2: Working mechanism and characterization of the stretchable sensing unit.
Fig. 3: Real-time acquisition of sign language components.
Fig. 4: Demonstration of the wearable sign-to-speech translation.

Data availability

The data that support the plots within this paper and other findings of the study are available from the corresponding authors upon reasonable request.

Code availability

The code is available from the corresponding authors upon reasonable request.


Acknowledgements

J.Y. acknowledges support from the National Natural Science Foundation of China (no. 51675069), the Fundamental Research Funds for the Central Universities (nos. 2018CDQYGD0020 and cqu2018CDHB1A05), the Scientific and Technological Research Program of Chongqing Municipal Education Commission (KJ1703047) and the Natural Science Foundation Projects of Chongqing (cstc2017shmsA40018 and cstc2018jcyjAX0076). J.C. also acknowledges the Henry Samueli School of Engineering & Applied Science and the Department of Bioengineering at the University of California, Los Angeles for start-up support. We also thank B. Lewis in the Department of Linguistics at the University of California, Los Angeles for his inspiring discussion on avoiding a deficit-based framework in the description of deaf people and sign language in our Article.

Author information


Contributions

J.C. and J.Y. planned the study and supervised the whole project. J.Y., J.C. and Z.Z. conceived the idea, designed the experiment, analysed the data and composed the manuscript. Z.Z., K.C., X.L., S.Z., Y.W., Y.Z., K.M., C.S., Q.H., W.F., E.F., Z.L., X.T. and W.D. performed all of the experiments and made technical comments on the manuscript. J.C. submitted the manuscript and was the lead contact.

Corresponding authors

Correspondence to Jin Yang or Jun Chen.

Ethics declarations

Competing interests

J.C., J.Y. and Z.Z. have filed a patent based on this work under the US provisional patent application no. 62/967,509.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary Information

Supplementary Figs. 1–21, Note 1 and Tables 1–3.

Supplementary Video 1

Wearable YSSA for sign language gesture acquisition.

Supplementary Video 2

Wearable YSSA for sign language translation.


About this article


Cite this article

Zhou, Z., Chen, K., Li, X. et al. Sign-to-speech translation using machine-learning-assisted stretchable sensor arrays. Nat. Electron. 3, 571–578 (2020). https://doi.org/10.1038/s41928-020-0428-6
