An elegant combination of electronics and elastic materials has been used to construct a small visual sensor that closely resembles an insect's eye. The device paves the way for autonomous navigation of tiny aerial vehicles. See Letter p.95
Flies are usually treated with disdain. Most commonly associated with spreading disease, they are at best considered simply annoying. Conversely, and far less appreciated, flies have also inspired mankind for centuries. An early report along these lines dates back to the seventeenth century, when the young René Descartes, while lying sick in bed, observed a fly walking along the ceiling of his room. Thinking about how he could describe the path of the fly in quantitative terms, he came up with what has become known as Cartesian coordinates, which allow algebra to be applied to geometry, and the importance of which can hardly be overestimated. The most recent example of such insect-inspired research is described by Rogers and colleagues (Song et al.1) on page 95 of this issue — the authors have transferred the design of an insect's compound eye to a digital camera.
In almost all cameras used today, the light reflected from an object in the environment is collected by a single lens and projected onto a layer of light-sensitive material in such a way that a sharp image is formed. Our eyes, as well as all other vertebrate eyes, also use this principle of image formation. The concept has the clear advantage of optimal usage of photons, guaranteeing maximum light sensitivity. Furthermore, it provides high spatial resolution, which is limited only by the density of photoreceptors in the focal plane of the lens.
Nevertheless, most living organisms use compound, or faceted, eyes instead of lensed eyes to see the world. Faceted eyes have very different optics and are composed of many hundreds or thousands of optical units (facets)2. In the case of the 'apposition' eye of daylight insects, each facet is optically isolated from its neighbour and equipped with its own lens and set of photoreceptors. Because each facet accepts photons from only a small angle in space, the light sensitivity of apposition eyes is rather low and the spatial resolution is limited by the number of facets that can be packed on to the small head of the insect. However, apposition eyes provide their bearer with a panoramic view of the world as well as with an infinite depth of field, without the need to adjust the focal length of the individual lenses.
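The trade-off described above can be made concrete with a back-of-the-envelope calculation. The sketch below (an illustration, not part of the authors' work; the facet count and hemispheric coverage are assumed round numbers) estimates the angular spacing between facets when a given number of them tile a hemispheric field of view — each facet views roughly an equal share of the solid angle, and its angular width is about the square root of that share.

```python
import math

def interommatidial_angle_deg(n_facets: int, coverage_sr: float = 2 * math.pi) -> float:
    """Rough angular spacing between the facets of an apposition eye.

    Assumes n_facets facets tile a field of view of coverage_sr
    steradians (a full hemisphere by default), so each facet views a
    solid angle of coverage_sr / n_facets; its angular width is then
    approximately the square root of that solid angle.
    """
    solid_angle_per_facet = coverage_sr / n_facets
    return math.degrees(math.sqrt(solid_angle_per_facet))

# An eye with ~3,000 facets (a fly-like number) resolves only a few
# degrees per facet -- far coarser than a single-lens megapixel camera.
print(round(interommatidial_angle_deg(3000), 2))  # → 2.62
```

This is why a faceted eye's resolution is set by how many facets fit on the head: halving the angular spacing requires four times as many facets.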
Song and colleagues now report the successful engineering of a digital camera that mimics the insect apposition eye in almost every aspect (Fig. 1). To achieve this end, the authors combined an array of elastic microlenses and a deformable array of photodetectors into a two-layer design, and transformed both layers from a planar geometry into a hemispheric shape (see Fig. 1 of the paper1).
The key to the success of this procedure lies in maintaining correct alignment between the two layers so as not to introduce unwanted optical artefacts. Song et al. attained this by rigidly joining the two layers only at the precise locations where the microlenses overlie the photodetectors, while permitting the layers to deform independently elsewhere. The use of a turret-like, domed structure for each microlens effectively decoupled the microlenses from the mechanical stress caused by bending. Furthermore, the authors used deformable, serpentine conductor wires as a flexible electrical interconnect between photodetectors. The result is a small, artificial faceted eye with a near-hemispheric field of view, without off-axis aberration and with an almost infinite depth of field.
Given their almost complete coverage of visual space, faceted eyes are ideal for calculating the apparent motion of an object generated by its motion relative to the observer (optical flow)3. With regard to potential applications, the camera proposed by Song et al. might constitute an optimal front-end visual sensor for tiny aircraft called micro aerial vehicles (MAVs)4. Although, so far, most cameras on board MAVs simply use fisheye lenses to produce a wide-angle field of view5, Song and colleagues' camera would provide all the advantages of an apposition eye. Using it to compute a MAV's self-motion could, on the one hand, facilitate motion stabilization in space and, on the other, enable spatial navigation6.
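To see how optical flow is recovered from image data, consider the simplest possible case — a one-dimensional pattern shifting uniformly between two frames. The sketch below (a minimal illustration under the brightness-constancy assumption, not the method of Song et al. or of any particular MAV system) estimates the shift by least squares from the spatial and temporal image derivatives:

```python
import numpy as np

def flow_1d(frame0: np.ndarray, frame1: np.ndarray) -> float:
    """Least-squares estimate of a uniform 1-D shift (pixels/frame)
    from the brightness-constancy equation Ix * v + It = 0."""
    ix = np.gradient(frame0)   # spatial derivative of the first frame
    it = frame1 - frame0       # temporal derivative between frames
    return -np.sum(ix * it) / np.sum(ix * ix)

# A smooth pattern shifted by one sample should yield a flow near +1.
x = np.linspace(0, 4 * np.pi, 200)
a = np.sin(x)
b = np.roll(a, 1)              # the pattern moved one pixel "right"
print(round(flow_1d(a, b), 2))  # → 1.0
```

A panoramic sensor extends this idea across the whole visual field: the global pattern of such local flow vectors betrays the observer's own rotation and translation, which is what makes it useful for stabilizing and steering an MAV.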
As with any development, there is always room for improvement. The camera's low light sensitivity, which is inherent in apposition eyes, could be ameliorated by placing more than one photodetector beneath each microlens and combining the output of photodetectors in neighbouring facets looking at the same point in space. In fact, flies use this principle of 'neural superposition' to increase the amount of light detected by the eye by a factor of seven, thereby achieving significantly higher light sensitivities7.
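The benefit of pooling photodetectors can be quantified with a small simulation. The sketch below (an illustrative model assuming Poisson-distributed photon counts and an arbitrary mean of 10 photons per receptor; not taken from the paper) compares a single receptor with the summed output of seven receptors viewing the same point, as in the fly's neural-superposition scheme:

```python
import numpy as np

rng = np.random.default_rng(0)

mean_photons = 10.0   # assumed mean count per receptor per integration time
trials = 100_000

# One facet's receptor alone vs. seven receptors (in neighbouring
# facets) that view the same point in space and whose outputs are summed.
single = rng.poisson(mean_photons, trials)
summed = rng.poisson(mean_photons, (trials, 7)).sum(axis=1)

def snr(x: np.ndarray) -> float:
    """Signal-to-noise ratio: mean count over its standard deviation."""
    return x.mean() / x.std()

# Pooling seven receptors collects seven times the light; because photon
# shot noise grows only as the square root of the signal, the SNR
# improves by about sqrt(7) ≈ 2.65.
print(round(snr(summed) / snr(single), 2))
```

The same trick would carry over directly to the artificial eye: several photodetectors per microlens, wired so that detectors in neighbouring facets with overlapping lines of sight are summed.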
Such resolvable issues aside, the system proposed by the authors could prove a stepping stone towards autonomous navigation of MAVs in their manifold possible uses. One major application is disaster relief. Picture the following: a palm-sized MAV uses an artificial faceted eye to navigate autonomously through a collapsed building while other sensors on board scan the environment for smoke, radioactivity or even people trapped beneath rubble and debris. Although these MAVs do not exist yet, thanks to devices such as that reported by Song et al., they should come within reach in the foreseeable future.
1. Song, Y. M. et al. Nature 497, 95–99 (2013).
2. Land, M. F. & Fernald, R. D. Annu. Rev. Neurosci. 15, 1–29 (1992).
3. Koenderink, J. J. & van Doorn, A. J. Biol. Cybern. 56, 247–254 (1987).
4. Floreano, D., Zufferey, J.-C., Srinivasan, M. V. & Ellington, C. (eds) Flying Insects and Robots (Springer, 2010).
5. Plett, J., Bahl, A., Buss, M., Kühnlenz, K. & Borst, A. Biol. Cybern. 106, 51–63 (2012).
6. Srinivasan, M. V. & Zhang, S. Annu. Rev. Neurosci. 27, 679–696 (2004).
7. Kirschfeld, K. Exp. Brain Res. 3, 248–270 (1967).