Abstract
Variation across human brains makes it difficult to implement electroencephalography in universal brain–machine interfaces. Conventional electroencephalography systems typically suffer from motion artefacts, extensive preparation time and bulky equipment, while existing electroencephalography classification methods require training on a per-subject or per-session basis. Here, we introduce a fully portable, wireless, flexible scalp electronic system, incorporating a set of dry electrodes and a flexible membrane circuit. Time-domain analysis using convolutional neural networks allows for accurate, real-time classification of steady-state visually evoked potentials in the occipital lobe. Compared to commercial systems, the flexible electronics show improved performance in detecting evoked potentials, owing to a significant reduction of noise and electromagnetic interference. The two-channel scalp electronic system achieves a high information transfer rate (122.1 ± 3.53 bits per minute) with six human subjects, allowing for wireless, real-time, universal electroencephalography classification for an electric wheelchair, a motorized vehicle and a keyboard-less presentation.
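For context, information transfer rate in SSVEP brain–machine interface studies is commonly reported with the Wolpaw formula, which combines the number of selectable targets, the classification accuracy and the time per selection. The sketch below shows that standard calculation; the target count, accuracy and selection time in the example are illustrative placeholders, not values taken from this article.

```python
# Sketch of the standard Wolpaw information-transfer-rate (ITR) formula,
# commonly used to report SSVEP-BCI throughput in bits per minute.
# The arguments in the example call are illustrative placeholders.
import math

def itr_bits_per_minute(n_targets: int, accuracy: float, selection_time_s: float) -> float:
    """Bits per selection (Wolpaw definition) scaled to bits per minute."""
    p = accuracy
    bits = math.log2(n_targets)
    if 0.0 < p < 1.0:
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n_targets - 1))
    return bits * (60.0 / selection_time_s)

# Example with placeholder parameters: four targets, 95% accuracy, 0.8 s per selection.
print(round(itr_bits_per_minute(n_targets=4, accuracy=0.95, selection_time_s=0.8), 1))
```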
Data availability
The EEG data recorded for this work are available on the Open Science Framework (https://osf.io/nyfa7/).
Code availability
The code for the CNN models is available on GitLab (https://gitlab.com/musasmahmood/ssvep-cnn-demo).
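As a rough illustration of the approach, the sketch below builds a small one-dimensional convolutional classifier that operates directly on time-domain EEG windows. It is a minimal example assuming two occipital channels, a 250 Hz sampling rate and four stimulus classes; these shapes, the layer sizes and the Keras implementation are placeholder assumptions for illustration and do not reproduce the released model in the repository above.

```python
# Illustrative sketch only: a minimal 1D-CNN for time-domain SSVEP classification.
# Channel count, window length, class count and all hyperparameters are assumptions.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

N_CHANNELS = 2        # two occipital electrodes (assumed)
WINDOW_SAMPLES = 250  # 1 s window at an assumed 250 Hz sampling rate
N_CLASSES = 4         # e.g. four flicker frequencies (assumed)

def build_ssvep_cnn() -> tf.keras.Model:
    """Small convolutional classifier operating on raw time-domain EEG windows."""
    model = models.Sequential([
        layers.Input(shape=(WINDOW_SAMPLES, N_CHANNELS)),
        layers.Conv1D(16, kernel_size=10, activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        layers.Conv1D(32, kernel_size=10, activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(N_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Random placeholder data standing in for segmented EEG epochs and labels.
    x = np.random.randn(32, WINDOW_SAMPLES, N_CHANNELS).astype("float32")
    y = np.random.randint(0, N_CLASSES, size=32)
    model = build_ssvep_cnn()
    model.fit(x, y, epochs=1, batch_size=8, verbose=0)
    print(model.predict(x[:1]).shape)  # -> (1, N_CLASSES)
```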
Acknowledgements
W.-H.Y. acknowledges a grant from the Fundamental Research Program (project PNK5061) of Korea Institute of Materials Science, funding by the Nano-Material Technology Development Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT and Future Planning (no. 2016M3A7B4900044), and support from the Institute for Electronics and Nanotechnology, a member of the National Nanotechnology Coordinated Infrastructure, which is supported by the National Science Foundation (grant ECCS-1542174).
Author information
Authors and Affiliations
Contributions
M.M. and W.-H.Y. designed the research project; M.M., D.M., Y.-S.K., Y.L., R.H., A.D., S.M., C.S.A. and W.-H.Y. performed research; M.M., D.M., A.D., C.S.A. and W.-H.Y. analysed data; and M.M., D.M., C.S.A. and W.-H.Y. wrote the paper.
Corresponding author
Ethics declarations
Competing interests
W.-H.Y. and M.M. are the inventors on a pending US patent application.
Additional information
Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary information
Supplementary Information: Supplementary Figures, Tables and Notes
Supplementary Video 1
Supplementary Video 2
Supplementary Video 3
Rights and permissions
About this article
Cite this article
Mahmood, M., Mzurikwao, D., Kim, YS. et al. Fully portable and wireless universal brain–machine interfaces enabled by flexible scalp electronics and deep learning algorithm. Nat Mach Intell 1, 412–422 (2019). https://doi.org/10.1038/s42256-019-0091-7
This article is cited by
- Liquid-in-liquid printing of 3D and mechanically tunable conductive hydrogels. Nature Communications (2023)
- Conformal in-ear bioelectronics for visual and auditory brain-computer interfaces. Nature Communications (2023)
- Soft Electronics for Health Monitoring Assisted by Machine Learning. Nano-Micro Letters (2023)
- Portable deep-learning decoder for motor imaginary EEG signals based on a novel compact convolutional neural network incorporating spatial-attention mechanism. Medical & Biological Engineering & Computing (2023)
- Soft wearable devices for deep-tissue sensing. Nature Reviews Materials (2022)