Balancing user and robotic control
Brain–machine interfaces can augment human capabilities and restore lost functions. Over the past decade, advances in materials engineering, robotics and machine learning have opened up new possibilities in this area. Katy Z. Zhuang et al. develop a robotic hand prosthesis that allows not only user-controlled movement but also assisted grasping in a shared control scheme. Myoelectric signals are first decoded with a machine learning method to provide proportional control of individual fingers; this fine movement control is then combined with an algorithmic controller that assists stable grasping by maximizing the contact area between the prosthetic hand and an object. Elsewhere in this issue, Musa Mahmood et al. demonstrate a portable, wireless, flexible scalp electroencephalography system that combines state-of-the-art flexible electronics with convolutional neural networks for real-time neural signal classification. In our Editorial, we look back at the history of brain–machine interfaces, going back to Norbert Wiener’s cybernetics.
See Zhuang et al., Mahmood et al. and Editorial