Touch is the first sense that humans develop before birth. It is an intimate, emotional way to communicate and can convey a lot of information.

I study touch in the context of human–robot interaction as part of my PhD programme at Cornell University in Ithaca, New York. Communication through touch is important for social and companion robots.

We developed a soft robot ‘skin’ that enables touch-based interaction. Our robots can communicate through alterations to the shape, size and motion of textures on their skin. We can create goosebumps, like those that appear on the skin of someone who’s excited. We can also create spikes, inspired by porcupinefish, which puff up into a spiky ball when they feel threatened.
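To give a flavour of how such a skin might be commanded in software, here is a minimal, hypothetical Python sketch; none of these names or values comes from our actual control code. Each module is set to a shape, a protrusion height and a pulsing speed, and an emotion selects a preset.

```python
from dataclasses import dataclass

@dataclass
class TextureState:
    """Target state for one skin module (hypothetical API)."""
    shape: str        # e.g. "goosebump" or "spike"
    height_mm: float  # how far the texture protrudes
    pulse_hz: float   # motion speed; 0 means hold still

# Illustrative presets mapping an emotion to a skin texture.
EMOTION_PRESETS = {
    "excited": TextureState(shape="goosebump", height_mm=2.0, pulse_hz=1.5),
    "angry":   TextureState(shape="spike",     height_mm=8.0, pulse_hz=0.0),
    "calm":    TextureState(shape="goosebump", height_mm=0.5, pulse_hz=0.2),
}

def express(emotion: str) -> TextureState:
    """Look up the texture preset for an emotion, defaulting to calm."""
    return EMOTION_PRESETS.get(emotion, EMOTION_PRESETS["calm"])

if __name__ == "__main__":
    print(express("excited"))
```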

In this photograph, I am holding a robot skin-wrinkle module, which mimics the way a human forehead wrinkles. We have also made tentacled robots, such as the blue one in the picture, which brings together sighted and visually impaired children for a storytelling activity: each child touches the tentacles on a robot arm, and the tentacles’ movements are mapped to the emotions of a character.
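The emotion mapping for the tentacles could, in principle, be as simple as scaling waving speed and amplitude with how aroused the character is. The sketch below is purely illustrative, with made-up arousal values, not the storytelling system’s real behaviour.

```python
import math

# Hypothetical arousal levels for story emotions (not from the real system).
AROUSAL = {"sad": 0.2, "calm": 0.4, "happy": 0.7, "scared": 1.0}

def tentacle_angle(emotion: str, t: float, phase: float = 0.0) -> float:
    """Angle (radians) for one tentacle at time t: calmer emotions
    wave slowly and shallowly, aroused ones fast and widely."""
    a = AROUSAL.get(emotion, 0.4)
    amplitude = 0.3 + 0.5 * a  # radians
    frequency = 0.5 + 2.0 * a  # Hz
    return amplitude * math.sin(2 * math.pi * frequency * t + phase)

# Example: sample the motion of three tentacles while the character is scared.
for t in (0.0, 0.1, 0.2):
    print([round(tentacle_angle("scared", t, phase=p), 2) for p in (0, 1, 2)])
```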

We are exploring a texture-changing steering wheel for autonomous cars, to enable a more emotional connection between driver and car. Driving is usually a rigid human–machine interaction. By contrast, when you ride a horse, you can feel its emotions through the physical connection. We are trying to recreate that feeling: the steering wheel might convey calm on a quiet road and focus in heavy traffic.
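As an illustration only, with invented names and thresholds rather than anything from the project itself, a mapping from driving context to wheel texture might look like this:

```python
# Hypothetical rule for a texture-changing steering wheel: map what the
# car senses to a skin state the driver can feel (illustrative values only).
def wheel_texture(speed_kmh: float, traffic_density: float) -> dict:
    """traffic_density in [0, 1]; returns a target texture for the wheel."""
    if traffic_density > 0.7:
        # Heavy traffic: firm ridges and fast pulses to convey focus.
        return {"shape": "ridges", "height_mm": 3.0, "pulse_hz": 2.0}
    if speed_kmh < 60 and traffic_density < 0.2:
        # Quiet road: a soft, slow breathing motion to convey calm.
        return {"shape": "smooth", "height_mm": 0.5, "pulse_hz": 0.3}
    # Default: neutral texture between the two extremes.
    return {"shape": "smooth", "height_mm": 1.0, "pulse_hz": 1.0}

print(wheel_texture(speed_kmh=50, traffic_density=0.1))  # calm
print(wheel_texture(speed_kmh=90, traffic_density=0.9))  # focus
```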

As companion robots, such as the striped robot in the photograph, move into our homes, their camera vision systems might raise privacy concerns. We have developed a mobile robot that maintains privacy by relying on touch. The robot’s camera is inside its body, beneath a translucent skin that clouds its view. When a hand is placed on the robot’s skin, the camera can recognize gestures from the hand’s shadow, ranging from a ‘go away’ tap to a ‘move closer’ stroke.
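The shadow-based gesture idea can be sketched in a few lines: threshold the internal camera’s frames to find the dark contact patch, then decide from how far the patch’s centroid travels whether the touch was a tap or a stroke. Everything below is a toy reconstruction under those assumptions, not the robot’s real recognition pipeline.

```python
import numpy as np

def shadow_mask(frame: np.ndarray, threshold: float = 0.35) -> np.ndarray:
    """A hand on the translucent skin blocks light, so dark pixels in the
    internal camera's grayscale frame mark the contact shadow."""
    return frame < threshold

def classify(frames: list[np.ndarray]) -> str:
    """Toy classifier: a brief, stationary shadow reads as a 'go away' tap;
    a shadow whose centroid travels across the skin reads as a stroke."""
    centroids = []
    for f in frames:
        ys, xs = np.nonzero(shadow_mask(f))
        if len(xs):
            centroids.append((xs.mean(), ys.mean()))
    if len(centroids) < 2:
        return "no gesture"
    travel = np.abs(np.diff(np.array(centroids), axis=0)).sum()
    return "move closer (stroke)" if travel > 10 else "go away (tap)"

# Example with synthetic frames: a dark blob sliding left to right.
frames = []
for step in range(5):
    f = np.ones((48, 64))
    f[20:28, 10 + step * 8 : 18 + step * 8] = 0.1  # the hand's shadow
    frames.append(f)
print(classify(frames))  # -> move closer (stroke)
```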