Abstract
In this paper, we propose a multimodal flexible sensory interface for interactively teaching soft robots to perform skilled locomotion using bare human hands. First, we develop a flexible bimodal smart skin (FBSS) based on triboelectric nanogenerator and liquid metal sensing that can perform simultaneous tactile and touchless sensing and distinguish these two modes in real time. With the FBSS, soft robots can react on their own to tactile and touchless stimuli. We then propose a distance control method that enables humans to teach soft robots movements via bare hand-eye coordination. The results showed that participants can effectively teach a self-reacting soft continuum manipulator complex motions in three-dimensional space through a “shifting sensors and teaching” method within just a few minutes. The soft manipulator can repeat the human-taught motions and replay them at different speeds. Finally, we demonstrate that humans can easily teach the soft manipulator to complete specific tasks such as navigating a pen-and-paper maze, taking a throat swab, and crossing a barrier to grasp an object. We envision that this user-friendly, non-programmable teaching method based on flexible multimodal sensory interfaces could broadly expand the domains in which humans interact with and utilize soft robots.
Introduction
Soft robots have attracted growing attention for their enormous potential in real-world applications1,2,3,4,5,6,7,8. Because they are highly conformable, soft robots have extraordinary advantages over rigid robots for safely interacting with humans in a wide range of environments9,10,11,12,13. However, because soft robots are challenging to model and program, non-specialists often face non-negligible obstacles when working with soft robots to achieve specific movements and perform certain tasks14,15,16,17,18. An interactive teaching method, which could efficiently and flexibly “teach” soft robots movement patterns, would dramatically benefit human users at home, on production lines, and in other unstructured environments (Fig. 1). Unlike rigid robots19,20,21, there are very few studies demonstrating the teaching of soft robots through human interaction. This is because there are two primary challenges to achieving soft robotic teaching through human interaction: the process requires (1) a multimodal, versatile, and robust flexible sensing device for interactions between a soft robot and human demonstrator; and (2) a user-friendly, non-programmable teaching method to transfer a human demonstrator’s instructions to the soft robots.
Regarding the first challenge, most previous studies have focused on tactile sensing for soft robots, which can only respond to physical touch and not to touchless stimuli. The triboelectric nanogenerator (TENG), which harnesses the coupled effects of contact electrification and electrostatic induction, can transduce both tactile and touchless stimuli into electrical signals22,23,24,25,26. Triboelectric sensors based on TENGs have unique advantages for soft robots due to their wide-ranging material makeup (e.g., from low to high Young’s modulus), simple and easily fabricated structure, high sensitivity, and fast response times27,28,29,30,31,32. Previous studies that utilized flexible triboelectric materials and structures have made remarkable progress in pressure and stress sensing33,34,35,36,37,38. Preliminary works exploring touchless sensing have also emerged39,40,41,42. However, because tactile and touchless stimulations result in identical trends in electric variation, it is challenging for triboelectric sensors to accurately distinguish between tactile and touchless signals in real time43,44 (Supplementary Fig. 1A, B and Supplementary Movie 1). Thus, flexible triboelectric sensors capable of real-time tactile and touchless sensing remain to be developed, which could lay the foundation for a new paradigm of soft robotic interactive teaching.
Regarding the second challenge, the interactive teaching of soft robots (e.g., the soft continuum manipulator) is little understood. Traditionally, the primary principle has been based on contact teaching for rigid robotic manipulators with few degrees of freedom45,46. This principle was commonly achieved by manually moving the manipulators under controlled, low-impedance modes while the manipulators’ encoders recorded the kinematics of the teaching process for replaying the motion. However, this type of contact teaching cannot be applied to soft robots for two reasons. First, the infinite degrees of freedom and compliant nature of a soft continuum manipulator make it challenging for a user to control explicitly, unlike the discrete configurations of a rigid manipulator47. Second, the contact-based teaching method for soft continuum robots produces passive deformation, and measuring these deformed configurations requires a large number of soft sensors (either embedded in or on the surface of the robot) to reconstruct the robot’s three-dimensional kinematics48,49. Given these challenges, is it possible to interactively teach soft robots through a flexible sensory interface? Can non-specialist users instruct soft robots to realize operational tasks in unstructured environments without programming?
Here, we develop a flexible bimodal smart skin (FBSS) with both tactile and touchless sensing by integrating a triboelectric sensor with a liquid metal sensor. The triboelectric sensor can respond to touchless stimulation, and the liquid metal sensor can respond to tactile stimulation. On this basis, the implemented FBSS can unambiguously distinguish between tactile and touchless modes in real time. We then characterize the sensing performance of the FBSS for both tactile and touchless sensing. Finally, we build a control framework for interactive teaching with the FBSS. We also propose a “shifting sensors and teaching” method for teaching complex locomotion, which involves moving FBSS to different locations on a manipulator during a teaching session. We show that a non-specialist can efficiently and interactively teach a continuum soft manipulator picking-and-placing, painting, throat-swabbing, and crossing a barrier to grasp an object. In addition, we also test the interactive performance of the FBSS on other soft robots including a soft origami robot and a robotic gripper.
Results
Working principle and sensing performance of FBSS
The flexible bimodal smart skin (FBSS) structure contains five flexible layers (Fig. 2a and “Methods” section). The flexible dielectric layer was fabricated by casting silicone rubber (Smooth-on, Dragon skin 00-20) in a mold with pyramid-shaped microstructures. The flexible electrode layer was fabricated by transferring patterned Ag nanowire (NW) networks onto a film prepared by mixing polydimethylsiloxane (PDMS) base with a curing agent (Dow Corning, Sylgard184) at a typical weight ratio of 10:1. The stimulation layer, underneath the electrode layer, was fabricated using a method similar to that of the flexible dielectric layer. The surfaces of the flexible dielectric layer, flexible electrode layer, and stimulation layer formed chemical bonds after being treated with plasma (OPS plasma, CY-DT01). The liquid metal layer was first printed with a liquid metal printer (DREAM Ink, DP-1), and the package layer (Smooth-on, Dragon skin 00-20) was then used to transfer and contain the liquid metal. The stimulation layer was bonded to the package layer via a silicone rubber adhesive (Smooth-on, Sil-Poxy). An electron microscope image of the fabricated pyramid-shaped microstructures shows a height of 320 μm and a width of 500 μm (Fig. 2b). An optical photograph of the printed liquid metal pattern shows a line width of about 300 μm (Fig. 2c). The FBSS can be folded and stretched (maximum stretch of 58.4%), demonstrating its excellent flexibility and stretchability (Fig. 2d, e).
The complete tactile and touchless perception principle of the FBSS is divided into multiple stages (Fig. 2f). During the initial stage (i), equal negative and positive charges are generated on the flexible dielectric layer and the external object, owing to their different electron affinities, after a few repeated contacts. These surface charges persist long enough (over 1 h) for the interactive teaching process (Supplementary Fig. 2). At stage (ii), as the external object approaches the flexible dielectric layer, the electric potential between the electrode and ground changes, which drives free electrons to flow from the ground to the flexible electrode, thus generating a current in the circuit. Note that the resistance of the liquid metal sensor remains stable because no contact pressure acts on the FBSS during this stage. In stage (iii), the FBSS starts to deform under contact pressure from the external force acting on the silicone rubber. The external object is closer to the flexible electrode during this stage, so more free electrons flow from the ground to the flexible electrode, generating a current in the same direction. The liquid metal layer is compressed and the cross-sectional area of the liquid metal channel decreases, causing its resistance to increase. During stage (iv), when the external object is fully in contact with the FBSS, the distance between the object and the flexible dielectric layer is compressed to its minimum. Charge neutralization occurs, the free electrons stop moving, and the resistance of the liquid metal reaches its maximum. In stage (v), when the external pressure is released, the free electrons flow back from the flexible electrode to the ground, generating a current in the opposite direction. The resistance of the liquid metal decreases as the channel recovers its shape. In stage (vi), when the external object separates from the flexible dielectric layer, more electrons flow back to the ground, generating a current in the same direction as in the previous stage. The resistance of the liquid metal remains stable because physical contact between the external object and the FBSS has ceased. Finally, when the external object is far away from the FBSS, a new electrical equilibrium is established.
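As a minimal illustration of how the two channels separate the modes, the following Python sketch classifies each sample from the triboelectric voltage change ΔU and the liquid-metal resistance change ΔR. The threshold values and function names are illustrative assumptions, not the authors' implementation:

```python
# Minimal sketch of the bimodal decision logic described above (stages i-vi).
# Threshold values are illustrative assumptions, not published parameters.

def classify_stimulus(delta_u, delta_r, u_thresh=0.1, r_thresh=0.05):
    """Classify one sample of FBSS output.

    delta_u : triboelectric voltage change (V), responds to approach AND contact
    delta_r : liquid-metal resistance change (Ohm), responds to contact only
    """
    if delta_r > r_thresh:
        return "tactile"      # contact pressure deforms the liquid-metal channel
    if abs(delta_u) > u_thresh:
        return "touchless"    # electrostatic induction without contact pressure
    return "idle"             # object outside the detection range

# Example: an object hovering above the skin vs. pressing on it
print(classify_stimulus(delta_u=0.6, delta_r=0.0))   # -> "touchless"
print(classify_stimulus(delta_u=3.2, delta_r=8.7))   # -> "tactile"
```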
We implemented a measurement system to investigate the performance of the FBSS (Supplementary Fig. 3). The FBSS was fixed on a flat plate assembled on top of a force gauge (ATI Industrial Automation, mini40). The external object was attached to the end of a linear motor actuator (LinMot, E1100), which could cyclically approach and press the FBSS. A piece of glass, set 20 mm away from the FBSS, was used as the external object for the tactile and touchless sensing tests. The effect of approach distance on the output signals of the FBSS was tested first (Fig. 3a). The touchless output signal ΔU decreased exponentially from 11.35 to 0 V as the distance increased from 0 to 20 mm, while the tactile output signal ΔR of the FBSS remained stable without variation. Electrostatic induction steadily weakened as the distance between the external object and the FBSS increased, and the output voltage decreased accordingly. We also studied the relationship between the output signals and the vertical pressure acting on the FBSS (Fig. 3b). As pressure increased from 0 to 30 kPa, the tactile output signal ΔR of the FBSS increased from 0 to 17.24 Ω and the touchless signal ΔU increased from 0 to 3.2 V. The cross-sectional area of the liquid metal channel decreased with increasing external pressure, which increased its resistance. As pressure increased, the external object also moved closer to the FBSS, which enhanced the electrostatic induction between the external object and the FBSS. Because different materials have different electron affinities, the material type affects the surface charge density of the flexible dielectric layer. Therefore, the FBSS can be used for material identification (Fig. 3c). The tactile signal ΔR remained at 0 Ω, while the touchless signal varied with the material at a test distance of 20 mm. This allows the FBSS to distinguish between materials in real time. The dynamic response time of the tactile sensing of the FBSS is about 120 ms, which is close to that of human skin (Supplementary Fig. 4A, B). The tactile and touchless signal noises are 0.04 Ω and 0.12 V, respectively (Supplementary Fig. 5A, B). The maximum signal-to-noise ratios (SNRs) of the touchless and tactile signals are 94.58 and 431.03, respectively (Supplementary Fig. 5C, D). The maximum resolutions measured in the touchless and tactile experiments are 0.05 mm and 0.35 kPa, respectively (Supplementary Fig. 6A, B).
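Because ΔU falls roughly exponentially with separation, a calibrated inverse model can convert the touchless voltage into a distance estimate. The sketch below fits such a model with SciPy; the functional form, the calibration points, and the helper names are assumptions for illustration, not the reported calibration:

```python
# Sketch: invert the (approximately exponential) voltage-distance relation to
# estimate separation from the touchless signal. Calibration points below are
# synthetic; real use would take them from measurements like Fig. 3a.
import numpy as np
from scipy.optimize import curve_fit

def voltage_model(d_mm, u0, k):
    # assumed form: U = u0 * exp(-k * d)
    return u0 * np.exp(-k * d_mm)

# hypothetical calibration samples (distance in mm, voltage in V)
d_cal = np.array([0.0, 2.0, 5.0, 10.0, 15.0, 20.0])
u_cal = np.array([11.35, 8.1, 4.9, 1.9, 0.6, 0.05])

(u0, k), _ = curve_fit(voltage_model, d_cal, u_cal, p0=(11.0, 0.2))

def estimate_distance(delta_u):
    """Invert the fitted model; valid only inside the calibrated range."""
    delta_u = max(delta_u, 1e-3)          # avoid log of zero
    return float(np.log(u0 / delta_u) / k)

print(f"~{estimate_distance(4.9):.1f} mm")  # roughly 5 mm for the synthetic data
```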
To evaluate how environmental factors affect the sensing performance of the FBSS, we experimentally tested the effects of temperature, humidity, and electromagnetic interference. The output touchless signal increases as the temperature rises from 15 to 30 °C and then remains stable with further temperature increases (Fig. 3d). The output tactile signal remains almost invariant with increasing temperature. We also investigated the effect of humidity on the output signals of the FBSS (Fig. 3e). The touchless signal decreases gradually as humidity increases from 31.4 to 71.4%, while the output tactile signal remains almost invariant. Both the touchless and tactile signals remain unchanged with increasing electromagnetic interference (Fig. 3f). The long-term stability of the FBSS was also validated under an external pressure of 10 kPa and a distance of 20 mm. We measured the outputs of the FBSS over 1200 cycles under the same conditions (Fig. 3g). The results show no obvious waveform changes, which supports the long-term usability of the FBSS.
We performed a series of tests to verify the FBSS’s sensing ability when interacting with humans and the external environment. First, the FBSS was used to detect the falling process of a tennis ball placed above the FBSS at an initial height of 200 mm (Fig. 3h). A high-speed camera (Photron Ltd, FASTCAM Mini UX100) recorded the entire process at a sampling rate of 250 fps, while the FBSS recorded both tactile and touchless signals. The falling process was divided into three stages (Fig. 3i and Supplementary Movie 2). In stage (i), the tennis ball starts to fall from the initial height and approaches the FBSS; however, it has not yet entered the detection range of the FBSS, so the output signals remain stable. In stage (ii), the tennis ball continues to fall and enters the FBSS’s detection range. The touchless signal ΔU decreases from 0 to −4.56 V while the tactile signal ΔR remains 0 Ω because the tennis ball has not yet come into contact with the FBSS. In stage (iii), the tennis ball contacts the FBSS. The touchless signal ΔU decreases further from −4.56 to −5.81 V and the tactile signal ΔR increases drastically from 0 to 0.83 Ω. We also show that the FBSS can touchlessly perceive the distance of a feather falling through the air (Supplementary Movie 3).
We then tested the tactile and touchless sensing ability of the FBSS with a human finger. The FBSS was connected to a sample circuit that controls two LEDs based on tactile (blue LED) and touchless (red LED) sensory feedback (Fig. 3j). We recorded the entire process of a finger approaching and pressing the FBSS (Fig. 3k and Supplementary Movie 4). During stage (i), the finger was 50 mm away from the FBSS’s surface and both LEDs were off. During stage (ii), as the finger approached the FBSS, the red LED lit up while the blue LED remained off. The recorded sensory data showed that the output touchless signal ΔU increased from 0 to 0.56 V while the output tactile signal ΔR remained unchanged. During stage (iii), when the finger pressed on the FBSS, the blue LED lit up and the red LED’s brightness increased. This result intuitively demonstrates that the FBSS can perceive tactile and touchless information from a human finger.
Self-reacting soft robots equipped with FBSS
To equip a soft robot with the FBSS, we integrated the FBSS with a soft manipulator segment that could be bent and shortened. A human hand could touchlessly control the soft manipulator segment’s bending and shortening motions (Fig. 4a, b and Supplementary Movie 5). The soft manipulator segment was programmed to deform when the touchless output signal of the FBSS reached a predetermined threshold value (the control flowchart is provided in Supplementary Table 1). A soft origami robot equipped with an FBSS buried in the sand could perceive the approach of a robotic bug and grasp the bug by inflating its actuator (Fig. 4c and Supplementary Movie 6).
By integrating the FBSS with the tip of a soft robotic gripper, we endowed it with the ability to “search and grasp” objects through tactile and touchless sensing (Supplementary Fig. 7A, B and Supplementary Movie 7). The whole process can be divided into different stages (Fig. 4d, e). Initially, both the tactile signal ΔR and the touchless signal ΔU were negligible. As the rigid robotic arm moved horizontally and the gripper approached the plastic cylinder, the touchless signal ΔU started to rise while the tactile signal ΔR remained low. The threshold for the touchless signal to “identify” a target object was set to 0.1 V. Once the signal surpassed this level, the soft robot began to grip the target. The tactile signal ΔR rose and the touchless signal ΔU increased further until a stable grip was achieved. These experimental scenarios illustrate that the FBSS can effectively enable soft robotic interactions through tactile and touchless perception.
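The “search and grasp” behavior reduces to a small state machine over the two signals. The sketch below reproduces that logic under assumed values: only the 0.1 V touchless threshold comes from the text, while the tactile grip-stability threshold and function names are hypothetical:

```python
# Sketch of the "search and grasp" logic: approach until the touchless signal
# identifies a target, then close the gripper until the tactile signal reports
# a stable grip. U_TARGET is from the experiment; R_STABLE is an assumed value.

U_TARGET = 0.1    # V, touchless threshold to "identify" the target (from text)
R_STABLE = 5.0    # Ohm, assumed tactile threshold for a stable grip

def search_and_grasp_step(state, delta_u, delta_r):
    """Advance the gripper state machine by one sensor sample."""
    if state == "searching" and delta_u > U_TARGET:
        return "gripping"     # target detected touchlessly -> start closing
    if state == "gripping" and delta_r > R_STABLE:
        return "holding"      # contact pressure confirms a stable grip
    return state

state = "searching"
for du, dr in [(0.02, 0.0), (0.15, 0.0), (0.4, 2.1), (0.6, 6.3)]:
    state = search_and_grasp_step(state, du, dr)
print(state)  # -> "holding"
```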
Interactive teaching of the soft manipulator
To further explore intelligent interaction between soft robots and humans, we developed a flexible interface and an interactive teaching method based on the FBSS (Fig. 5 and “Methods” section). Through this interface and method, we demonstrated that humans can interactively teach a soft manipulator to move in two-dimensional (2D) and three-dimensional (3D) space.
Based on the interactive teaching method, a user taught the soft manipulator to grasp an object in 2D space (Fig. 6a and Supplementary Movie 8). For simplicity, we divide the teaching process into four steps. In step (i), the user controlled the initial length of the soft manipulator by altering the distance between their hand and the FBSS; this allowed the user to select an effective length of the soft manipulator in the first 5 s. In step (ii), the user touchlessly “bent” the soft manipulator by approaching the FBSS with a hand. More specifically, the user applied a strategy of multiple approaching-leaving actions to “bend” the soft manipulator in several large, discrete steps and move it towards the target object. During this process, the normalized touchless signal ΔU fluctuated from 0 to almost 1, while the output tactile signal ΔR remained unchanged at nearly 0. In step (iii), when the soft manipulator approached the target position, the user switched from large steps to small steps to move the soft manipulator through the final few centimeters. Here, the normalized touchless sensory output fluctuated between 0.2 and 0.5. In step (iv), when the soft gripper reached the target object, the user pressed the FBSS to trigger the grasping motion; drastic increases in the normalized touchless and tactile signals can be observed. Through our logic algorithm, the soft manipulator terminates the teaching task and closes the gripper when the tactile signal output exceeds a threshold value (Supplementary Table 2). Finally, the soft manipulator successfully gripped the target object and automatically returned to its initial position.
As the system simultaneously records the drive step-size sequence of the soft manipulator during the teaching process, one can actuate the soft manipulator to repeat the movements and “replay” the entire taught motion. We compared the real-time driving air pressure during the teaching and repeating processes (Supplementary Fig. 8A, B); the two air pressure curves show almost identical changes. We demonstrated the interactive teaching results by showing the soft manipulator grasping objects at low, medium, and high positions (Supplementary Movie 9). The teaching processes took 53, 53, and 59 s, respectively. The trajectories of the soft manipulator during the teaching and replaying phases coincided well with each other (Fig. 6b–d). One can also replay the taught trajectories in a sped-up or slowed-down manner (Supplementary Movie 10), adding to the flexibility of the manipulator’s task execution.
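Replaying at different speeds follows directly from storing the commanded actuation sequence with timestamps during teaching and streaming it back at a scaled rate. A minimal sketch follows, in which the pneumatic-driver interface `set_chamber_pressures` is an assumed placeholder, not the authors' API:

```python
# Sketch of teach-and-replay: record the commanded chamber pressures during
# teaching, then stream them back at a scaled rate. `set_chamber_pressures`
# stands in for the multi-channel pneumatic driver and is an assumed interface.
import time

def record_step(log, t, pressures):
    """Append one (timestamp, pressure-vector) sample to the teaching log."""
    log.append((t, list(pressures)))

def replay(log, set_chamber_pressures, speed=1.0):
    """Replay a taught trajectory; speed > 1 plays faster, < 1 slower."""
    t0 = log[0][0]
    start = time.time()
    for t, pressures in log:
        target = (t - t0) / speed
        sleep = target - (time.time() - start)
        if sleep > 0:
            time.sleep(sleep)
        set_chamber_pressures(pressures)

# usage with a dummy driver that just prints the pressure vectors
log = []
for i in range(3):
    record_step(log, t=0.5 * i, pressures=[10 + i, 12, 8])
replay(log, set_chamber_pressures=print, speed=2.0)   # replays twice as fast
```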
We also performed interactive teaching of object-grasping in a constrained environment, where an obstacle was placed on the path of the soft manipulator (Supplementary Fig. 9A–E). The results show that the soft manipulator can successfully grasp an object within 40 s while encountering an obstacle that causes contact deformation (Supplementary Movie 11). In typical situations, enabling soft manipulators to work in a constrained environment requires a great deal of modeling and programming work. In contrast, no additional programming was required with the current interactive method.
We realized interactive teaching in 3D space by integrating two FBSSs on the soft manipulator (the logic algorithm is shown in Supplementary Table 3). The user interactively taught the soft manipulator to grasp an object out of the bending plane with both hands (Fig. 6e and Supplementary Movies 12, 13). The entire teaching process can be divided into four steps. In step (i), as with interactive teaching in planar space, the user used their left hand to “bend” the soft manipulator in several large, discrete steps via multiple approaching-leaving actions (applied to FBSS I). The normalized touchless signal ΔU of FBSS I ranged from 0 to about 1. In step (ii), the user switched from a large to a small step size to move the soft manipulator more slowly and position the soft gripper on the same horizontal level as the target object. The normalized touchless signal ΔU of FBSS I remained around 0.5 during this stage. In step (iii), the user used their right hand on FBSS II to move the soft manipulator out of the original plane. The end effector reached the target object after a few of these repeated right-hand approaching-leaving actions. The normalized touchless signal ΔU of FBSS II also ranged from 0 to about 1. Finally, in step (iv), once the soft gripper reached the target object, the user pressed FBSS I to trigger grasping. We compared the real-time driving air pressure during the teaching and repeating processes (Supplementary Fig. 10A, B); the two air pressure curves show almost identical changes. We also demonstrated that interactive teaching allows the soft manipulator to grasp objects at low, medium, and high positions (Supplementary Movie 14). These teaching processes took 51, 56, and 61 s, respectively. All three manipulator trajectories during teaching and repeating show excellent agreement (Fig. 6f–h). This result indicates that users can effectively teach the manipulator to move and perform actions in 3D space. Since this experiment, more than ten novices have successfully taught the soft manipulator to grasp a target object in 3D space interactively.
To demonstrate an intelligent FBSS placement strategy, we used an FBSS to control multiple motion modes of the soft manipulator. By changing the mounting position of the FBSS on the soft manipulator, it could be taught to move left, right, and backward (Supplementary Fig. 11 and Supplementary Movie 15). As in the previous experiments, the trajectories of the soft manipulator during the teaching and replaying phases coincided well with each other. To evaluate the usability of the interactive teaching method, we performed a teaching experiment with multiple participants, including two experienced experts (researchers on this project) and three novices with no experience with an interactive robotic system. We attached a laser pointer to the end of the soft manipulator to assess positioning accuracy (Supplementary Fig. 12A). The participants controlled the position of the laser pointer on a target by touchlessly teaching the soft manipulator. We measured the positioning error over ten teaching sessions for each participant. The results showed that the positioning error for the experts was <10 mm in all ten trials, while the positioning error for the novices was relatively large in early trials (Supplementary Fig. 12B). Notably, after at most eight attempts, all participants’ positioning errors were less than 10 mm. This result suggests that non-specialists can quickly learn how to position the soft manipulator accurately through interactive teaching.
To enable the interactive teaching of the soft manipulator with even more complex locomotion, we proposed the “shifting sensors and teaching” method (Fig. 5a and Supplementary Fig. 13). Specifically, the FBSS was placed on a flexible, arc-shaped patch with three magnets behind it. Several small magnetic cylinders were placed around the bottom of each segment of the soft manipulator. With the magnetic attachment, the FBSS can be shifted to different positions on the soft manipulator in a rapid, accurate manner. Therefore, the human demonstrator can select a segment for interaction, easily shift the FBSS patch to the corresponding segment and then teach the soft manipulator in a touchless manner. Thus, we name this method “shifting sensors and teaching”.
With the proposed “shifting sensors and teaching” method, we show the interactive teaching of the soft manipulator with complex locomotion in 2D and 3D spaces. The normalized touchless and tactile signals of FBSS I and FBSS II are also plotted against time (Fig. 7).
With the “shifting sensors and teaching” method, a user interactively taught the soft manipulator to achieve a 2D “S” shape (Fig. 7a and Supplementary Movie 16). In step (i), two FBSSs were placed on the bottom of the third segment of the soft manipulator. When the demonstrator’s two hands approached the two FBSSs simultaneously, all three segments of the soft manipulator shortened and entered the teaching mode. In step (ii), the demonstrator shifted FBSS I to the right side of the first segment and then used their right hand to bend the first segment to the left. The demonstrator then pressed FBSS I to lock the first segment (iii). In step (iv), the demonstrator shifted FBSS II to the left side of the second segment, used their left hand to bend the second segment to the right, and then pressed FBSS II to lock the second segment. In step (v), FBSS I was shifted to the right side of the third segment. The demonstrator used their right hand to bend the third segment to the left, then pressed FBSS I to lock the third segment and finish the touchless teaching session. In this way, we realized a planar “S”-shaped configuration of the soft manipulator by shifting the FBSS sensors three times.
We next show an interactive teaching session involving complex locomotion in 3D space using the “shifting sensors and teaching” method (Fig. 7b and Supplementary Movies 17, 18). In step (i), the soft manipulator was triggered to enter the teaching mode. In step (ii), the demonstrator shifted FBSS II to the right side of the first segment and then used their right hand to bend the first segment to the left. The demonstrator then pressed FBSS II to “lock” the first segment in the current direction (iii). In step (iv), FBSS I was shifted to the back of the first segment, and the right hand “bent” the soft manipulator outward; the first segment was then “locked” by pressing FBSS I. In step (v), FBSS II was shifted to the left side of the second segment. The demonstrator used their left hand to bend the second segment to the right and “locked” it in the current direction by pressing FBSS II. In step (vi), FBSS I was shifted to the front of the second segment. The demonstrator then used their right hand to bend the second segment inward and “locked” it by pressing FBSS I. In the final step (vii), FBSS II was shifted to the left side of the third segment. The demonstrator used their left hand to bend the third segment to the right, then pressed FBSS II to lock the third segment and finish the touchless teaching session. Thus, we realized a complex 3D configuration of the soft manipulator (involving all nine bending chambers) by shifting the FBSS sensors five times. These teaching processes took 197 and 350 s, respectively. The experimental results show that the “shifting sensors and teaching” method is simple and effective for achieving complex 3D configurations of soft continuum robots.
We further show that a human can interact closely with the soft manipulator to complete another challenging task. A watercolor pen was installed at the end of the manipulator (Fig. 8a). With this setup, we taught the manipulator to execute movements that “navigate” a maze on paper (Fig. 8b and Supplementary Movie 19). The soft manipulator repeated the trace after teaching (Fig. 8c). The output signals of FBSS I and II were recorded over the ∼240 s teaching period (Fig. 8d).
We also show the manipulator’s ability to perform a critical task in the context of public health. As the coronavirus pandemic continues to rage around the world, throat swabs have become a common practice for medical testing. However, this undoubtedly places a burden on medical workers, who risk infection during the collection process. To address this issue, we used the interactive system to teach the soft manipulator to take a throat swab (Fig. 8e, f and Supplementary Movie 20). First, a cotton swab was installed at the end of the soft manipulator. The user could then touchlessly bend the first two segments of the soft manipulator by hand to control the position of the swab. Once the swab reached the target position, the user elongated the soft manipulator’s third segment by pressing the FBSS to collect the throat swab sample. The interactive system is simple enough for medical workers to use without extensive training, and the soft manipulator can repeat the action autonomously after only a single teaching process. Compared with traditional rigid robots, soft manipulators are inherently safer for human interaction because of their soft materials and compliant structures. The real-time driving air pressure was compared during the teaching and repeating processes (Supplementary Fig. 14); the two air pressure curves show almost identical changes.
Finally, we show that the soft manipulator can be “taught” to cross a barrier and successfully grasp an artificial flower by shifting the FBSSs five times (Fig. 8g and Supplementary Movies 21, 22). To cross the barrier, we touchlessly controlled the third segment to bend outward (i) and the first segment to shorten. The second segment was then bent to the right (ii), and the third segment was bent upward (iii) and inward. To grasp the flower, the third segment was bent downward (iv) and the gripper grasped the flower when FBSS II was pressed (v); the whole process lasted about 318 s. This experiment demonstrates the advantages of the “shifting sensors and teaching” method for practical multi-degree-of-freedom control of soft robots and provides a new scheme for such control.
Discussion
In this paper, we developed a flexible bimodal smart skin (FBSS) prototype that responds to tactile and touchless stimulations and distinguishes between the two modes in real time. With the FBSS as an interface, we proposed a human-soft robot touchless interactive teaching method and systematically tested this method on a continuum soft manipulator. This interactive teaching method for executing complex motions was realized intuitively via bare-handed touchless approaches to the FBSS. Using this method, we successfully taught a “naive” soft manipulator to move in 3D space and perform simple tasks such as painting, taking a throat swab, and crossing a barrier to grasp an object. We envision that this interactive teaching method may expand the practical uses of soft robots, as it allows non-specialists to operate the soft robot for various tasks without expert familiarity.
In terms of other relevant sensing methods, there are a few other sensors that can detect tactile and touchless stimulations. For example, magnetic bimodal skin relies on the giant magnetoresistance (GMR) effect via a magnetic film with a pyramid-shaped extrusion at its top surface50. When the giant magnetoresistance sensor detects a magnetic field around the film, its resistance changes. However, this sensor requires objects contacting the film to be magnetic, so the material properties of the sensed objects are quite limited. In contrast, a triboelectric sensor can detect a wide range of materials.
The high flexibility and sensitivity of triboelectric electronic skin give the soft robot both sensing and interaction capabilities. Flexible triboelectric skin, which mainly consists of flexible electrode and dielectric films51,52, can convert tactile and touchless stimulations into voltage signals through contact electrification and electrostatic induction, respectively. However, because tactile and touchless stimulations result in identical trends of electric variation, it is hard for triboelectric skins alone to distinguish these two modes in real time43,44. By combining triboelectric and liquid metal sensory mechanisms, our FBSS can simultaneously sense both tactile and touchless information and distinguish the two modes in real time. Since the FBSS is flexible and stretchable, it can accommodate large deformations and is suitable for soft robotic sensing. In addition, the FBSS can sense a wide range of materials during interaction (Fig. 3c). We compared the FBSS with other tactile/touchless sensors (Supplementary Table 4). In the future, the sensing accuracy of the FBSS can be further improved by optimizing the microscopic pyramid structure and the liquid metal channels.
There are few previous reports on the human interactive teaching of soft robots. Our proposed methodology has the following unique features for teaching movement: (1) the method is based on a touchless, near-distance control via the natural hand-eye coordination of the human participants, making it intuitive and straightforward; and (2) the teaching result is quite effective in terms of time and accuracy. With the touchless interactive teaching method, most 3D movements of the soft manipulator were finished within a few minutes.
The “contact” teaching approach used for traditional rigid robotic manipulators (i.e., achieving compliant behavior of a robot’s end effector in response to forces exerted by a human operator) is problematic for teaching soft robots46. Capturing the passive deformation of soft continuum manipulators (for example, when manipulated by a human hand) requires many soft sensors, which complicates subsequent shape and configuration reconstruction. In contrast, human subjects without any robotic teaching skills or experience have validated the practicality and effectiveness of our teaching method. In addition, recording and analyzing the touchless/tactile information during the teaching process and replaying the soft robot’s movements can help illustrate how humans prefer to interact touchlessly with robots. Recording the hand movements of different individuals, particularly when approaching the robot, would further help refine the interactive system.
In terms of the limitations of this research, we used only two soft sensors to create the interactive interface in the present study. In future work, we will incorporate more interactive soft sensors into the soft manipulator to enable more complex control of the robot’s shape and to capture more feedback. Furthermore, developing multi-sensor arrays with the FBSS and incorporating emerging machine-learning tools would enable more complex robotic motions with different morphologies established through touchless interaction. For example, collecting large amounts of sensory data for machine-learning training could enable soft robots to recognize human gestures and environmental objects. In addition, the ability to distinguish between a human hand and an object or obstacle with the FBSS in a complex environment would further complement the current teaching method.
In this study, we used a pneumatically actuated soft robot, which is simple, repeatable, and robust. Responsive materials can allow soft robots to actuate through a variety of stimuli, such as light, magnetic fields, electricity, and chemicals53, and soft material structures such as origami and metamaterials can enable complex movement through a touchless interactive teaching method54. We envision that interactive soft robots can work collaboratively with an increasing number of human participants in a wide variety of disciplines.
Methods
Fabrication process of the flexible electrode layer
A patterned Kapton film was first affixed to a clean silicon wafer as a shadow mask (Supplementary Fig. 15A). The Ag NW network solution was then sprayed onto the wafer and the solvent was evaporated at 60 °C for 15 min. A thin silicone rubber layer was spin-cast onto the wafer and cured at 60 °C for 4 h. The cured silicone rubber was then carefully peeled off from the wafer, transferring the Ag NW network onto the silicone rubber.
Fabrication process of the flexible dielectric layer
The SLA mold with micro-pyramid cavities (depth = 320 μm; width = 500 μm) was first printed with a micro-precision 3D printer (Supplementary Fig. 15B). The silicone rubber (Smooth-on, Dragon skin 00-20) was then drop-cast onto the mold and cured at room temperature for 4 h. The cured silicone rubber was carefully peeled off from the mold, transferring the micro-pyramid structures onto it.
Fabrication process of the liquid metal patch
Using a liquid metal printer (DREAM Ink, DP-1), the patterned liquid metal was printed on a plastic substrate (Supplementary Fig. 16). The silicone rubber (Smooth-on, Dragon skin 00-20) was drop-cast onto the patterned liquid metal and then cured at room temperature for 4 h. To transfer the patterned liquid metal from the substrate onto the silicone rubber, the assembly was placed in a refrigerator at −140 °C for 40 min. The cured silicone was carefully peeled off from the substrate with the patterned liquid metal embedded in it. Additional silicone rubber (Smooth-on, Dragon skin 00-20) was drop-cast onto the other side of the patterned liquid metal and cured at room temperature for 4 h.
Implementation and control of the interactive soft manipulator
The soft manipulator was designed and fabricated for grasping objects (Supplementary Fig. 17). It primarily consists of three soft actuator modules and a soft gripper as the end effector (Fig. 5a). The soft manipulator has 10 pneumatic chambers: each of the three bending segments has three chambers, and the end effector, a four-fingered soft gripper, is actuated by a single air inlet.
During the teaching process, the motion of the soft manipulator follows a kinematic model under the hypothesis of piecewise constant curvature (PCC). We consider the soft manipulator’s shape to be composed of a fixed number of segments with constant curvature. The kinematic model transforms from configuration space (arc parameters κi, θi, φi) to actuation space (chamber pressures pij). More specifically, the indices i = 1, 2, 3 and j = 1, 2, 3 refer to the ith segment and the jth chamber, respectively. We define the arc parameter κi as the curvature of the ith segment (with curvature radius ri = κi⁻¹), θi as the bending angle around the y-axis, and φi as the deflection angle around the z-axis (Fig. 5b, c). The constant parameter d can be measured before initiating actuation. First, we solve for the chamber length lij from the given arc parameters κi, θi, φi, as shown in Eq. (1).
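Equation (1) is not reproduced here. For a three-chamber segment with chambers spaced 120° apart at radial offset d from the centerline, a commonly used constant-curvature relation consistent with the definitions above is the following; this is an assumed standard form, not necessarily the authors' exact expression:

```latex
% Common piecewise-constant-curvature chamber-length relation (assumed standard
% form): segment i has curvature \kappa_i (radius r_i = \kappa_i^{-1}), bending
% angle \theta_i, and deflection angle \phi_i; chambers j = 1,2,3 sit at radial
% offset d, spaced 120 degrees apart.
\begin{equation}
  l_{ij} \;=\; \theta_i \left( r_i \;-\; d \cos\!\left( \phi_i + \frac{2\pi (j-1)}{3} \right) \right),
  \qquad r_i = \kappa_i^{-1},\quad i,j \in \{1,2,3\}.
\end{equation}
% The actuation pressures p_{ij} then follow from the calibrated
% pressure-length relation of each chamber.
```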
Then, using calibrated pressure-length relations (Supplementary Fig. 18), we can calculate the actuating pressures pij from the chamber lengths lij to complete the model-based control.
Interactive control of soft manipulator based on the FBSS interface
The teaching process starts with a human hand approaching the FBSS, and the FBSS transforms the distance information into a voltage signal (Fig. 5d and Supplementary Table 2). First, we normalize the voltage signal through Eq. (2), with a sampling frequency of 100 Hz:
where Sout is the normalized voltage signal, V is the measured voltage of the FBSS, Vmax is the maximum voltage output of the FBSS, and Vinit is the initial voltage output of the FBSS. Then, we apply a mean filter over ten consecutive samples (one-tenth of the sampling frequency) to remove noise from the signal. To obtain a variable step length from the FBSS feedback signal, we implement a hyperbolic tangent function as the mapping to the step length (Fig. 5e), as shown in Eq. (3):
where θh is the calculated step length, hinit is the initial step length, and k1, k2 are the parameters of the hyperbolic tangent function. Then, the step length is added to the current bending angle θ0 in Eq. (4):
The bending angle θ is substituted into the kinematic model to solve the chamber pressures pij of the soft manipulator, and the multi-channel pneumatic system executes actuation with pressures pij. Finally, when the FBSS registers physical contact, the soft manipulator triggers the gripper to complete the gripping motion.
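Since Eqs. (2)-(4) are not reproduced here, the sketch below shows one plausible implementation of the described pipeline: normalization, a ten-sample mean filter, a tanh step-length mapping, the angle update, and the tactile gripper trigger. The exact functional forms, parameter values, and variable names are assumptions, not the published ones:

```python
# Sketch of the FBSS-based interactive control loop described by Eqs. (2)-(4).
# Functional forms and constants below are illustrative assumptions.
import math
from collections import deque

V_INIT, V_MAX = 0.0, 12.0          # assumed initial / maximum FBSS voltages (V)
H_INIT, K1, K2 = 2.0, 4.0, -2.0    # assumed step-length parameters
R_GRIP = 5.0                       # assumed tactile threshold to close gripper (Ohm)

window = deque(maxlen=10)          # ten-sample mean filter (100 Hz sampling)

def normalized_signal(v):
    """Eq. (2)-style normalization of the raw FBSS voltage to [0, 1]."""
    return (v - V_INIT) / (V_MAX - V_INIT)

def step_length(s_out):
    """Eq. (3)-style variable step length via a hyperbolic tangent mapping."""
    return H_INIT * (1.0 + math.tanh(K1 * s_out + K2))

def control_step(theta0, v_raw, delta_r):
    """One control iteration: filter, map to a step, update the bending angle."""
    window.append(normalized_signal(v_raw))
    s_out = sum(window) / len(window)
    theta = theta0 + step_length(s_out)        # Eq. (4)-style angle update
    grasp = delta_r > R_GRIP                   # physical press closes the gripper
    return theta, grasp                        # theta feeds the PCC model -> p_ij

theta, grasp = 0.0, False
for v, dr in [(1.0, 0.0), (6.0, 0.0), (9.0, 0.0), (9.5, 7.0)]:
    theta, grasp = control_step(theta, v, dr)
print(round(theta, 2), grasp)
```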
During the contact-free interaction process, the end effector of the soft manipulator can be guided by the motion of the human hand in 3D space. The position of the end effector can be determined by the operator’s vision.
We evaluated the teaching error under different step-length control strategies. We mounted a laser transmitter on the end effector of the soft manipulator to track its position. During the entire interaction process, the time cost and position error were recorded, and each test was repeated five times. The average execution time and position error of the variable step-length strategy were 24.3 s and 4.6 mm, respectively (Fig. 5f). Thus, we adopted the variable step-length strategy for interactive teaching. The workspace of the soft manipulator is 568 mm in length, 591 mm in width, and 334 mm in height (Fig. 5g), which guarantees a wide range of interactive locomotion for a human user.
Electrical characterization
The resistance of the FBSS was measured with a synchronous data acquisition card (National Instruments, USB-6356). The voltage, current, and transfer charges were measured using an electrometer (Tektronix Inc., Keithley 6514).
Data availability
The data generated in this study are provided in the Source data file. Source data are provided with this paper.
References
Hsiao, J.-H., Chang, J. Y. & Cheng, C. M. Soft medical robotics: clinical and biomedical applications, challenges, and future directions. Adv. Robot. 33, 1099–1111 (2019).
Hu, W., Lum, G. Z., Mastrangeli, M. & Sitti, M. Small-scale soft-bodied robot with multimodal locomotion. Nature 554, 81–85 (2018).
Walsh, C. Human-in-the-loop development of soft wearable robots. Nat. Rev. Mater. 3, 78–80 (2018).
Kim, Y., Parada, G. A., Liu, S. & Zhao, X. Ferromagnetic soft continuum robots. Sci. Robot. 4, 1–16 (2019).
Li, G. et al. Self-powered soft robot in the Mariana Trench. Nature 591, 66–71 (2021).
Thandiackal, R. et al. Emergence of robust self-organized undulatory swimming based on local hydrodynamic force sensing. Sci. Robot. 6, eabf6354 (2021).
Naclerio, N. D. et al. Controlling subterranean forces enables a fast, steerable, burrowing soft robot. Sci. Robot. 6, eabe2922 (2021).
Li, L. et al. Aerial-aquatic robots capable of crossing the air-water boundary and hitchhiking on surfaces. Sci. Robot. 7, eabm6695 (2022).
Xie, Z. et al. Octopus arm-inspired tapered soft actuators with suckers for improved grasping. Soft Robot. 7, 639–648 (2020).
Gu, G. et al. A soft neuroprosthetic hand providing simultaneous myoelectric control and tactile feedback. Nat. Biomed. Eng. https://doi.org/10.1038/s41551-021-00767-0 (2021).
Wehner, M. et al. An integrated design and fabrication strategy for entirely soft, autonomous robots. Nature 536, 451–455 (2016).
Gong, Z. et al. A soft manipulator for efficient delicate grasping in shallow water: modeling, control, and real-world experiments. Int. J. Rob. Res. 40, 449–469 (2021).
Kim, W. et al. Bioinspired dual-morphing stretchable origami. Sci. Robot. 4, eaay3493 (2019).
Coevoet, E. et al. Software toolkit for modeling, simulation, and control of soft robots. Adv. Robot. 31, 1208–1224 (2017).
Rucker, D. C. & Webster, R. J. in Springer Tracts in Advanced Robotics Vol. 79, 645–654 (IEEE, 2014).
Rucker, D. C., Jones, B. A. & Webster, R. J. III A geometrically exact model for externally loaded concentric-tube continuum robots. IEEE Trans. Robot. 26, 769–780 (2010).
Trivedi, D., Lotfi, A. & Rahn, C. D. Geometrically exact models for soft robotic manipulators. IEEE Trans. Robot. 24, 773–780 (2008).
Hannan, M. W. & Walker, I. D. Kinematics and the implementation of an elephant’s trunk manipulator and other continuum style robots. J. Robot. Syst. 20, 45–63 (2003).
Pan, Y., Chen, C., Li, D., Zhao, Z. & Hong, J. Augmented reality-based robot teleoperation system using RGB-D imaging and attitude teaching device. Robot. Comput. Integr. Manuf. 71, 102167 (2021).
Weng, C., Yuan, Q., Suarez-Ruiz, F. & Chen, I. A telemanipulation-based human–robot collaboration method to teach aerospace masking skills. IEEE Trans. Ind. Inform. 16, 3076–3084 (2020).
Du, G., Yao, G., Li, C. & Liu, P. An offline-merge-online robot teaching method based on natural human-robot interaction and visual-aid algorithm. IEEE/ASME Trans. Mechatronics 1–12, https://doi.org/10.1109/TMECH.2021.3112722 (2021).
Fan, F. R. et al. Transparent triboelectric nanogenerators and self-powered pressure sensors based on micropatterned plastic films. Nano Lett. 12, 3109–3114 (2012).
Yang, Y. et al. Human skin based triboelectric nanogenerators for harvesting biomechanical energy and as self-powered active tactile sensor system. ACS Nano 7, 9213–9222 (2013).
Wang, Z. L. Triboelectric nanogenerators as new energy technology for self-powered systems and as active mechanical and chemical sensors. ACS Nano 7, 9533–9557 (2013).
Ren, Z. et al. Fully elastic and metal-free tactile sensors for detecting both normal and tangential forces based on triboelectric nanogenerators. Adv. Funct. Mater. 28, 1802989 (2018).
Zhao, G. et al. Transparent and stretchable triboelectric nanogenerator for self-powered tactile sensing. Nano Energy 59, 302–310 (2019).
Wang, Z. L. Triboelectric nanogenerators as new energy technology and self-powered sensors—principles, problems and perspectives. Faraday Discuss. 176, 447–458 (2014).
Wang, Z., Chen, J. & Lin, L. Progress in triboelectric nanogenertors as new energy technology and self-powered sensors. Energy Environ. Sci. 8, 2250–2282 (2015).
Yao, G. et al. Bioinspired triboelectric nanogenerators as self‐powered electronic skin for robotic tactile sensing. Adv. Funct. Mater. 30, 1907312 (2020).
Jin, T. et al. Triboelectric nanogenerator sensors for soft robotics aiming at digital twin applications. Nat. Commun. 11, 1–12 (2020).
Zhang, S. et al. Nondestructive dimension sorting by soft robotic grippers integrated with triboelectric sensor. ACS Nano 16, 3008–3016 (2022).
Chen, J. et al. Soft robots with self-powered configurational sensing. Nano Energy 77, 105171 (2020).
Luo, J. et al. Flexible and durable wood-based triboelectric nanogenerators for self-powered sensing in athletic big data analytics. Nat. Commun. 10, 5147 (2019).
Liu, Z. et al. Transcatheter self-powered ultrasensitive endocardial pressure sensor. Adv. Funct. Mater. 29, 1–10 (2019).
Zou, Y. et al. A bionic stretchable nanogenerator for underwater sensing and energy harvesting. Nat. Commun. 10, 1–10 (2019).
Liu, Y. et al. Thin, skin‐integrated, stretchable triboelectric nanogenerators for tactile sensing. Adv. Electron. Mater. 6, 1901174 (2020).
He, J. et al. Trampoline inspired stretchable triboelectric nanogenerators as tactile sensors for epidermal electronics. Nano Energy 81, 105590 (2021).
Wu, M. et al. Thin, soft, skin-integrated foam-based triboelectric nanogenerators for tactile sensing and energy harvesting. Mater. Today Energy 20, 100657 (2021).
Zhao, J. et al. Flexible organic tribotronic transistor for pressure and magnetic sensing. ACS Nano 11, 11566–11573 (2017).
Bu, T. et al. Stretchable triboelectric-photonic smart skin for tactile and gesture sensing. Adv. Mater. 30, 1800066 (2018).
Wu, H. et al. Self-powered noncontact electronic skin for motion sensing. Adv. Funct. Mater. 28, 1–10 (2018).
Shi, M. et al. Self-powered analogue smart skin. ACS Nano 10, 4083–4091 (2016).
Lai, Y.-C. et al. Actively perceiving and responsive soft robots enabled by self-powered, highly extensible, and highly sensitive triboelectric proximity- and pressure-sensing skins. Adv. Mater. 30, 1801114 (2018).
Chen, S., Pang, Y., Yuan, H., Tan, X. & Cao, C. Smart soft actuators and grippers enabled by self‐powered tribo‐skins. Adv. Mater. Technol. 5, 1901075 (2020).
Du, G., Chen, M., Liu, C., Zhang, B. & Zhang, P. Online robot teaching with natural human–robot interaction. IEEE Trans. Ind. Electron. 65, 9571–9581 (2018).
Ficuciello, F., Villani, L. & Siciliano, B. Variable impedance control of redundant manipulators for intuitive human–robot physical interaction. IEEE Trans. Robot. 31, 850–863 (2015).
Rus, D. & Tolley, M. T. Design, fabrication and control of soft robots. Nature 521, 467–475 (2015).
Shah, D. et al. Shape changing robots: bioinspiration, simulation, and physical realization. Adv. Mater. 33, 2002882 (2021).
Shih, B. et al. Electronic skins and machine learning for intelligent soft robots. Sci. Robot. 5, eaaz9239 (2020).
Ge, J. et al. A bimodal soft electronic skin for tactile and touchless interaction in real time. Nat. Commun. 10, 4405 (2019).
Guo, H. et al. Self-sterilized flexible single-electrode triboelectric nanogenerator for energy harvesting and dynamic force sensing. ACS Nano 11, 856–864 (2017).
Lu, X. et al. Stretchable, transparent triboelectric nanogenerator as a highly sensitive self-powered sensor for driver fatigue and distraction monitoring. Nano Energy https://doi.org/10.1016/j.nanoen.2020.105359 (2020).
Rich, S. I., Wood, R. J. & Majidi, C. Untethered soft robotics. Nat. Electron. 1, 102–112 (2018).
Laschi, C., Mazzolai, B. & Cianchetti, M. Soft robotics: technologies and systems pushing the boundaries of robot abilities. Sci. Robot. 1, 1–12 (2016).
Acknowledgements
This work was supported by the National Science Foundation support projects, China (Grant Nos. 91848206, 92048302, T2121003 received by L.W.), and the National Key R&D Program of China (Grant Nos. 2018YFB1304600, 2019YFB1309600, 2020YFB1313003 received by L.W.). We want to thank Zhexin Xie, Shiqiang Wang, Shanshan Du, and Chuqian Wang for their assistance in this work.
Author information
Authors and Affiliations
Contributions
W.L. and L.W. conceived the idea. W.L., Y.D., J.L., F.Y., C.Z., Y.W., B.F., F.S., X.D., and L.W. analyzed the data and wrote the paper. W.L. designed and fabricated the FBSS. Y.D. and J.L. designed and fabricated the soft manipulator. W.L., Y.D., J.L., and H.Y. implemented the interactive teaching system. Lei L., G.W., B.C., S.W., Luchen L., H.Y., and Y.L. conducted the experiments. Lei L., Y.M., and W.L. drew and optimized the figures, tables, and videos.
Corresponding author
Ethics declarations
Competing interests
The authors declare no competing interests.
Peer review
Peer review information
Nature Communications thanks Huichan Zhao, and the other, anonymous, reviewer(s) for their contribution to the peer review of this work. Peer reviewer reports are available.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary information
Source data
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Liu, W., Duo, Y., Liu, J. et al. Touchless interactive teaching of soft robots through flexible bimodal sensory interfaces. Nat Commun 13, 5030 (2022). https://doi.org/10.1038/s41467-022-32702-5