Learning visual appearance for flight control
Flying insects show impressive skills in navigation and piloting, including landing and obstacle avoidance, which roboticists try to mimic in the design of lightweight flying robots. The visual cue of optical flow is known to play a major role in insect navigation and is increasingly studied for use by small flying robots as well. However, optical flow control has fundamental limitations: optical flow cannot disentangle distance from velocity, and it is least informative in the forward flight direction. In this issue, De Croon et al. propose a solution in the form of a learning process in which the robot first uses optical flow and self-induced oscillations to perceive distances to objects in its environment. It then learns a mapping from visual appearance to these distances, complementing optical flow and resolving the above-mentioned problems. The approach, which is biologically plausible in its processing, sensing, and actuation requirements, is demonstrated on a flying robot.
See De Croon et al.
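The two-stage idea described above can be illustrated with a minimal numerical sketch. It is not the authors' implementation; the oscillation profile, noise levels, and the one-dimensional "appearance" feature are all assumptions made for illustration. Stage 1 exploits the fact that optical flow only provides the ratio of velocity to distance, so a known self-induced velocity oscillation lets the robot solve for distance. Stage 2 uses those self-generated distance estimates as labels to fit a simple regressor from an appearance feature to distance.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Stage 1: distance from optical flow plus a self-induced oscillation ---
# Optical flow gives only the ratio w = v / z (velocity over distance).
# Because the oscillatory velocity v(t) is self-induced and therefore known,
# the distance z follows from a least-squares fit of w against v.
t = np.linspace(0.0, 2.0, 200)
v = 0.3 * np.sin(2 * np.pi * 1.5 * t)                # commanded oscillation, m/s (assumed)
z_true = 1.8                                          # true obstacle distance, m (assumed)
w = v / z_true + 0.01 * rng.standard_normal(t.size)   # noisy optical flow, 1/s
z_est = np.dot(v, v) / np.dot(v, w)                   # least-squares distance estimate

# --- Stage 2: learn a mapping from visual appearance to distance ---
# Hypothetical scalar appearance cue that grows as objects get closer
# (e.g. apparent texture density ~ 1/z); the robot regresses distance on
# this cue using the self-supervised distance labels from stage 1.
distances = rng.uniform(0.5, 3.0, 100)                       # stage-1 labels, m
appearance = 1.0 / distances + 0.02 * rng.standard_normal(100)
coef = np.polyfit(1.0 / appearance, distances, 1)            # linear fit in 1/feature
pred = np.polyval(coef, 1.0 / appearance)                    # appearance-based distances
```

Once the appearance-to-distance mapping is learned, the robot can estimate distance from a single view, without oscillating and without relying on optical flow in the uninformative forward direction.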