Robotics: The bot that plays ball

He looks like a child and plays like a child. But can the iCub robot reveal how a child learns and thinks? Nicola Nosengo reports.

Giulio Sandini cannot help smiling as his child reaches out a hand and tries to grasp the red ball that Sandini keeps waving before his eyes. "He is getting really good at it," he says, with the proud tone of any father. True, most fathers would expect more from their three-year-old than the ability to grasp a ball. But Sandini is indulgent: although the object of his affection has the wide eyes and rounded cheeks of a little boy, he is, in fact, a robot.

His name is iCub or, as the team calls him, iCub Number 1. Together with his brothers now in laboratories around the world, this little robot may help researchers to understand how humans learn and think. Grasping a ball is only a first step, says Sandini, director of the robotics and cognitive-sciences department at the Italian Institute of Technology (IIT) in Genova, and head of the child-robot project since it started in 2004. Sandini is confident that iCub will learn more and more tricks — until, in the end, he is even able to communicate with humans.

"We wanted to create a robot with sufficient movement capabilities to replicate the learning process a real child goes through" as it develops from a dependent, speechless newborn into a walking, talking being, Sandini says. So he and his colleagues have not only given iCub the hands, limbs and height of a toddler, they have also tried to give him the brain of one — a computer that runs algorithms allowing iCub to learn and develop as he interacts with his surroundings.

In a child, says Luciano Fadiga, a neurophysiologist at Italy's University of Ferrara who is part of the team that developed iCub, those interactions are essential for shaping the rapidly growing brain. Before children can grasp a moving ball, for example, they must learn to coordinate head and eye movements to keep the ball in their visual field; use visual clues to predict the ball's trajectory and guide their hand; and close their fingers on the ball with the right angle and strength. None of these abilities is there at birth, and children cannot grasp appropriately until they reach around one year of age. "Many theories try to explain what happens in the brain as it learns all this stuff," says Fadiga, "and the only way to test them is to see what works best in an artificial system."

Such testing is certainly not new. Cognitive scientists have been using computer models to simulate mental processes since the 1950s, including algorithms that mimic learning. But many of these simulations have focused on the high-level, conscious reasoning used to solve logical puzzles, play chess or make medical diagnoses. And many others — notably 'neural network' models — have simulated neurons. But Sandini and Fadiga are among the many researchers who have come to think that both types of simulations leave out something essential: the body.

"There is ever-growing evidence from neuroscience that visuo–motor processing, and manipulation in particular, are crucial for higher cognitive development, including social behaviour and language," Sandini says.

It was this line of thinking that led Sandini and his co-workers to their central hypothesis — that the best way to model the human mind would be to create a humanoid robot that is controlled by realistic learning algorithms, then let it explore the world as a child would. They gathered together scientists from 11 European universities and research institutions to form the RobotCub project, and began work with €8.5 million (US$12 million) in funding from the European Union. The IIT is the project's leading partner, and it is here that iCubs are born.

Form and function

Researchers can already choose from a list of robots that includes Khepera, a simple and affordable wheeled robot built by a Swiss consortium and used to study locomotion, and humanoid robots such as HRP-2, PINO and ASIMO, all built in Japan. But Sandini's ambition was to create a humanoid robot that combined unprecedented mechanical versatility with open-source software, so that researchers could change both the hardware and the algorithms as needed.

"We started from the hand, and built the rest of the robot around it," Sandini says. With seven degrees of freedom in the arm and nine in the hand, and its mechanical shoulders, elbows, wrists and fingers controlled by electric motors, iCub's arm is by itself a robotic marvel that took years to perfect. Next, iCub needed human-like senses. Project engineers gave him stereoscopic vision through two cameras mounted on moving ocular bulbs, complete with eyelids that close every now and then. And they gave him touch through sensors on his arms that can detect pressure applied from outside. They are also developing an artificial skin that will allow the robot to detect an object's shape and surface properties.

Other team members tried to figure out what should happen inside iCub's brain. They decided to give him a few innate abilities similar to those seen in newborns, such as recognizing a face as being human and detecting objects against a background. Everything else would have to be learned. After reviewing evidence from neuroscience, psychology and animal studies, they came up with a three-level software architecture, mostly designed by Giorgio Metta of the IIT and by David Vernon of the Khalifa University of Science, Technology and Research in Sharjah, United Arab Emirates.

Giulio Sandini (left) and Giorgio Metta gradually pieced together a robot with an unprecedented level of dexterity and coordination. Credit: L. NATALE; A. MALDONADO HERRERA

The first level gathers information on what iCub sees and feels. It collects raw signals from the cameras and other sensory systems, and channels them through a set of filters to determine which signals are most salient — a process similar to the human attention system. The second level is a kind of traffic director called the 'modulation' system. Loosely based on the functions of the hippocampus, basal ganglia and amygdala, this system takes in data from the lower level as the robot tries to grasp a ball, say, compares those data to combinations of action and sensory information iCub has encountered before, then decides what the robot should do next. In doing so, the modulation system is driven by some basic motivations, corresponding to a child's curiosity for new stimuli and tendency to engage in social interactions.

The third level uses prior experience to play 'what if?' with the current situation. What will happen if I move towards the ball with this force and this angle? What if the ball moves in the meantime? This information goes back down to the middle level to help determine the robot's next action.
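
For readers who think in code, the loop described above can be caricatured in a few lines of Python. This is purely an illustrative sketch: the function names (salience, forward_model, modulate) and all the numbers are hypothetical stand-ins, not the actual RobotCub software.

import random

def salience(signals):
    # Level 1: filter raw sensory input, keeping the most salient signal.
    return max(signals, key=lambda s: s["contrast"] + s["motion"])

def forward_model(state, action):
    # Level 3: play 'what if?', predicting how well an action would work.
    # Toy rule: reaches that land nearer the ball score higher.
    return 1.0 - abs(state["ball_x"] - action["hand_x"])

def modulate(state, actions, curiosity=0.1):
    # Level 2: weigh predicted success against a curiosity drive that
    # occasionally favours trying something new, then pick the next action.
    scored = [(forward_model(state, a) + curiosity * random.random(), a)
              for a in actions]
    return max(scored, key=lambda pair: pair[0])[1]

# One pass of the loop on fake camera data: spot the waving red ball,
# then choose the reach most likely to grasp it.
signals = [{"contrast": 0.2, "motion": 0.1, "x": 0.8},
           {"contrast": 0.9, "motion": 0.7, "x": 0.3}]  # the red ball
state = {"ball_x": salience(signals)["x"]}
candidate_reaches = [{"hand_x": x / 10.0} for x in range(11)]
print(modulate(state, candidate_reaches))  # typically {'hand_x': 0.3}

The real system, of course, learns these mappings from experience rather than having them hard-coded.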

It was only after almost three years of effort that the team finally activated its first complete robot. Their artificial child could now move, see and touch, which was all the researchers technically needed. But, in a gesture part symbol and part whimsy, they decided that he should also look like a child. The undeniably cute result was a semi-transparent face mask with coloured light-emitting diodes beneath the surface to outline eyebrows and a mouth, allowing the robot to smile and frown. This engaging face may have more uses than just making the robot look good in promotional pictures, says Fadiga. "In the future, some groups plan to try iCub with children who are autistic, testing their reactions to his expressions and movements."

iCub Number 1 was never meant to be an only son. After the first robot became operational, the consortium issued an open call for proposals to conduct experiments. The six winners, chosen by an independent panel appointed by the consortium and the European Union, have received their own iCub for free. And anyone else can order one for the cost of producing it, some €180,000–200,000. "It was part of the deal with the European Union that we should provide a number of robots to interested groups," Sandini says. This way, the team hopes to create a de facto standard in robotics, facilitating data exchange. "There is a desperate need for standardization in our field," says Paul Verschure, a technology professor at the Catalan Institute of Advanced Research in Barcelona, Spain, and one of those selected to receive an iCub. "People use a variety of platforms, they rarely publish every detail of their algorithms, and replication of experiments is minimal."

Small has its place

However, not even the most ardent enthusiasts believe that iCub will send all other robots into retirement. "Simpler and more affordable robots will remain important," says Chad Jenkins, a computer scientist at Brown University in Providence, Rhode Island. Jenkins chaired the robotics workshop at the International Joint Conference on Artificial Intelligence in Pasadena, California, at which iCub made his American debut in July. "They may be more limited in the long run, but they are easier to use in small-scale experiments."

Indeed, most of the researchers who have got their hands on an iCub find that programming him is a hard job, and it is likely to take some time before meaningful results emerge. "Nobody expected anything different," says Verschure. "This is not a car you just buy and start to drive around; we're in totally new ground." Sandini agrees that the robot is still a work in progress, and predicts that it will take two or three years to see an impact in terms of publications in major journals. But, he adds, a better measure of success will be if neuroscientists and psychologists start to see the robots as useful experimental tools. "Engineers creating an intelligent robot and neuroscientists studying the brain are asking the same questions, only with different words," Sandini says. "How can this particular task be performed with limited computational power?"

Convincing neuroscientists may be the hardest part, though, particularly those who question iCub's theoretical background. Although he sees many interesting ideas in iCub, Alfonso Caramazza, director of the Cognitive Neuropsychology Laboratory at Harvard University, says that "the claims being made about shared intentions and language in robots still seem light years away from being realized". In particular, he says, "to account for cooperation and communication you also need symbolic thought, and I do not see how such mechanisms can emerge from mere sensory-motor processes in a robot".

Sandini and Fadiga reply that a complete explanation of higher cognitive functions is a problem for any area of neuroscience, not just robotics. Furthermore, iCub's emphasis on perception and manipulation may one day lead to a better understanding of what 'symbolic thought' really is.

The final word, of course, will come from the robots themselves. Eight iCubs have left Genova since late last year for laboratories in Europe and Turkey, and ten more are being built. Add the two kept by Sandini's team, and the family will comprise 20 brothers by the end of the year.

Great expectations

Most of the researchers will first try iCub in experiments they were already performing with simpler robots. In Barcelona, for example, Verschure's group plans to see how a computer model of a cerebellum it has been working on for years performs in an iCub. "The cerebellum is a crucial organ for motion; it sets the timing and pace of our movements," Verschure says. "But the timing of actions depends on the body's shape. For example, the speed at which I can move my arm is limited by its length and weight. So it makes no sense to study the cerebellum outside the body."

The Japanese robots HRP-2 (left) and ASIMO are able to do a range of tasks, from making music to walking on uneven surfaces. Credit: K. KASAHARA/AP; TERU IWASAKI/AP

At Imperial College London, Murray Shanahan, professor of cognitive robotics, is teaching his iCub to do very basic motor tasks, such as making circular hand movements, using a 'spiking' neural network. Rather than holding a steady level of activity, these artificial neurons 'fire' discrete pulses, and the timing of their firing changes from moment to moment. "It is a more biologically plausible model than typical networks," Shanahan says, "one in which the temporal dynamics of neurons is also modelled." Once validated, the spiking-network concept will be used to simulate more complex tasks.
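
A flavour of what 'spiking' means: the classic building block of such networks is the leaky integrate-and-fire neuron, which accumulates input, leaks charge over time, and emits a spike when it crosses a threshold. The Python sketch below is a generic textbook version, not Shanahan's actual model.

def simulate_lif(input_current, threshold=1.0, leak=0.9):
    # A toy leaky integrate-and-fire neuron: integrate input, leak a
    # little each step, and emit a spike on crossing the threshold.
    v = 0.0          # membrane potential
    spike_times = []
    for t, current in enumerate(input_current):
        v = leak * v + current
        if v >= threshold:
            spike_times.append(t)
            v = 0.0  # reset after firing
    return spike_times

# A steady drive yields regularly timed spikes; unlike a typical
# artificial neuron, the output is the timing of discrete events.
print(simulate_lif([0.3] * 20))  # -> [3, 7, 11, 15, 19]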

The IIT's two robots will be used to study the development of goal-oriented behaviour. Researchers will start by teaching them to recognize a specific object such as a hammer among many different objects, then to grasp the hammer by the handle and not by the head, and finally to swing it appropriately while hitting a nail, something it may learn by imitation. But to prove that it has really developed an internal representation of what a hammer is for, the robot will have to be able to use it in cooperation with humans. Fadiga says that in two or three years he hopes to get to the point at which "I pick a nail, I put it on the wall, there is a hammer on a table nearby. The robot sees all this and, without any further input, grasps the hammer and hands it to me".

As happens in every family, the brothers face different expectations. At Britain's University of Plymouth and at the Institute for Cognitive Sciences and Technologies in Rome, researchers led respectively by Angelo Cangelosi and Stefano Nolfi are undertaking what is probably the most ambitious of the iCub projects: studying how children learn language. "We believe that manipulation and communication co-evolve during the first three years," says Cangelosi. "Children learn relations among objects by touching them, and learn to express the same relations with language."

To test this idea, he and his colleagues will treat the robot pretty much like a real child. While showing it a blue cup, for example, they will say 'blue cup', and so on, so that in time iCub can make associations between the sounds he hears and the data coming in through the visuo–motor system. "By the third year of the project, he should be able to use transitive and intransitive verbs in simple sentences, such as 'put red cup on yellow cup'," says Cangelosi.
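
One simple way to picture the first step of that learning, under the big assumption that counting co-occurrences is enough to get started: the robot tallies how often each heard word accompanies each visual feature, then names what it sees by the strongest association. The Python sketch below is hypothetical, not the project's actual algorithm.

from collections import Counter, defaultdict

associations = defaultdict(Counter)   # word -> counts of co-present percepts

def observe(words, percepts):
    # One 'teaching' episode: the robot hears words while seeing features.
    for w in words:
        associations[w].update(percepts)

def name(percept):
    # Pick the word most strongly associated with a visual feature.
    return max(associations, key=lambda w: associations[w][percept])

# Repeated exposures, as when the researchers show the robot a cup and speak.
observe(["blue", "cup"], ["blue", "cup_shape"])
observe(["blue", "ball"], ["blue", "ball_shape"])
observe(["red", "cup"], ["red", "cup_shape"])

print(name("cup_shape"))   # -> 'cup': it co-occurred with cup_shape twice

Note how varied exposures do the work: 'blue' also accompanied the cup, but only 'cup' accompanied cup-shaped objects every time.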

In the very long term, the project aims to give the robot the ability to communicate with humans using natural language, with basic but appropriate grammar — but with no language rules coded anywhere in its software. Language, the researchers hope, will gradually emerge as an extension of visual and motor abilities, providing a strong proof of principle that the same may have happened in humans during evolution.

But some question whether iCub can live up to such expectations. Oliviero Stock, a senior researcher at the Bruno Kessler Foundation in Trento, Italy, and a leading expert in the application of artificial intelligence to linguistics, says that the 'bottom up' approach that iCub adopts can go only so far when it comes to language. "People in the field are coming to terms with the fact that to explain language, you have to presume some innate abilities for it, for grammar, and syntax in particular," he says. "I doubt that such a system can do more than utter single words without some kind of a priori linguistic skills, call it a language instinct if you want. That's what probably happens in humans too."

The idea of an innate predisposition for language in humans was famously introduced in the 1950s by Noam Chomsky, who wrote of a "universal grammar" that children seem to be primed for. Although hotly debated, this has become the dominant view in linguistics. iCub might now provide a way to put it to a rigorous test.

While grown-ups have been arguing passionately about their robots' talents, the iCub brothers have been having a good time, as would be expected from children during the summertime. In late July the Italian brothers spent two weeks on Italy's Liguria coast, where 37 roboticists from many parts of Europe and the United States had convened for a summer school, arranged by Sandini and his colleagues to give them a chance to try their algorithms on an iCub. In pictures posted on the Internet, the robots can be seen wearing hats, grasping the usual ball, and even assembling LEGO bricks. Their actions are not exactly masterful, Sandini admits. But, ever the encouraging father, he says it is only a matter of time.

Nicola Nosengo is a freelance science writer based in Rome.
