ICME CME 2012 Conference
Plenary Talk 6

Biomimetic Visual Sensors and Autopilots: Lessons from Insects

Nicolas Franceschini, Ph.D.
Emeritus CNRS Research Director
Institute of Movement Science
Biorobotics Laboratory
CNRS and Aix-Marseille University
163 Avenue de Luminy (CP 938), F-13288 Marseille, France
E-mail: nicolas.franceschini@univmed.fr

Abstract:
The insect compound eye is a masterpiece of micro-optics, optronics, neuronics and nanomechatronics. It has already given rise to major applications such as anti-reflection coatings (used on TV screens and solar cells), polarization-maintaining optical fibers (used in microscopes), tandem photodetectors (used in target trackers), graded-index (GRIN) lenses and GRIN optical fibers (used in medical endoscopes), optical auto-leveling systems and polarization compasses (used on aircraft), X-ray telescopes, etc. The insect compound eye (facet eye) displays crystalline structures at various scales, from the nano to the meso range: a honeycomb structure in the facet array, in the corneal nipple array, in the photoreceptor mosaic, in the (visual-pigment-bearing) microvilli, and in the neural array driven by the photoreceptor cells’ signals. Insects use their two compound eyes for immediate steering action. Despite their coarse visual system (only 10² to 10⁴ pixels) and their small number of neurons (< 10⁶), insects are able to navigate in 3D, avoiding obstacles, fixating and tracking other insects, and landing gracefully.

From the results of our micro-optical, electrophysiological and behavioral studies on insect vision, we have developed several biomimetic visual sensors and aerial robots. The 100-gram robot OCTAVE, for example, is able to avoid the ground, react to wind and land autonomously. The robot OSCAR is able to track a moving edge with hyperacute resolution (that is, with a resolution far better than the "static resolution" calculated from the interreceptor angle), and suggests how an animal’s visual system may help stabilize the gaze accurately in order to guide navigation in the presence of disturbances such as gusts of wind.
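OCTAVE’s terrain-following and landing behaviour has been described in the authors’ publications as the outcome of a ventral optic-flow regulator: a feedback loop that holds the angular speed of the ground image, as seen by a downward-looking eye, at a fixed setpoint. The sketch below is only a simplified illustration of that idea; the PI structure, the gains and the toy vertical dynamics are assumptions for the example, not the robot’s actual controller.

    # Minimal sketch of a ventral optic-flow regulator (the principle behind OCTAVE).
    # All names, gains and dynamics below are illustrative assumptions, not the
    # robot's actual controller or parameters.

    def ventral_optic_flow(ground_speed, height):
        """Angular speed (rad/s) at which the ground sweeps past a downward-looking eye."""
        return ground_speed / max(height, 0.01)   # guard against division by zero

    def flow_regulator(measured_flow, setpoint, state, dt, kp=0.8, ki=0.2):
        """PI loop holding the ventral optic flow at a setpoint via a lift command.
        Flow above the setpoint (too fast or too low)  -> positive lift, climb.
        Flow below the setpoint (e.g. speed drops)     -> negative lift, descend.
        """
        error = measured_flow - setpoint
        state["integral"] += error * dt
        return kp * error + ki * state["integral"]

    # Toy closed-loop run with crude vertical dynamics (illustration only).
    state = {"integral": 0.0}
    height, speed, dt = 2.0, 1.5, 0.02
    for step in range(500):
        lift = flow_regulator(ventral_optic_flow(speed, height), setpoint=1.0, state=state, dt=dt)
        height = max(0.0, height + lift * dt)
        if step == 250:
            speed = 0.5   # cutting the forward speed makes the regulator descend: automatic landing

Regulating optic flow rather than altitude is what lets ground avoidance, reaction to wind and landing emerge from a single loop, without any explicit range measurement.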

Each of the fly’s two panoramic compound eyes views the world with only 5,000 ommatidia (i.e., 5,000 pixels). The robot OSCAR-2, equipped with a moving eye, is able to fixate and track a moving target while remaining robust to random perturbations applied to its body (Kerhuel, L. et al.: Steering by gazing: an efficient biomimetic control strategy for visually guided micro aerial vehicles. IEEE Transactions on Robotics 26: 307-319, 2010).
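The "steering by gazing" strategy cited above can be summarised as two nested loops: a fast oculomotor reflex keeps the eye locked onto the target, and the body heading is then slaved to the gaze direction, so perturbations applied to the body barely disturb fixation. The following is a minimal sketch of that structure; the gains, time step and perturbation model are illustrative assumptions, not the controller of Kerhuel et al.

    # Minimal sketch of the "steering by gazing" idea: the eye, decoupled from the
    # body, nulls the retinal error quickly; the body then steers toward the gaze.
    # Gains, time step and the perturbation model are illustrative assumptions.
    import random

    def simulate(steps=2000, dt=0.002, k_eye=50.0, k_heading=4.0):
        target = 0.3          # target bearing in the world frame (rad)
        body = 0.0            # body (robot) heading (rad)
        eye = 0.0             # eye orientation relative to the body (rad)
        for _ in range(steps):
            gaze = body + eye                  # gaze direction in the world frame
            retinal_error = target - gaze      # the only visual measurement available
            eye += k_eye * retinal_error * dt  # fast oculomotor reflex locks the gaze
            body += k_heading * eye * dt       # slower loop: heading follows the gaze
            body += random.gauss(0.0, 0.005)   # random torque-like perturbation on the body
        return target - (body + eye)           # residual gaze error

    print(f"residual gaze error: {simulate():+.4f} rad")  # stays small despite body noise

Because the controlled variable is the gaze rather than the body, perturbing the body mostly changes the eye-in-body angle while the target stays fixated, which is the robustness reported for OSCAR-2.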


Nicolas Franceschini is emeritus CNRS Research Director at the Institute of Movement Science, Marseille, France, where he created the Neurocybernetics laboratory and the Biorobotics laboratory. He graduated in Electronics and Control Theory and received his PhD in Physics from the Polytechnic Institute of Grenoble, France. He then studied neurophysiology and behavioral science at the University of Tübingen, Germany, spending 13 years at the Max Planck Institute for Biological Cybernetics. He has published about 160 papers in fields such as physiological optics and micro-optics, phototransduction, photochemistry, neuroanatomy, electrophysiology, microscopy and robotics. He initiated the field of biologically inspired robotics ("biorobotics") in the mid-1980s. He has been a visiting scientist at the University of Sherbrooke, Keio University School of Medicine (Tokyo), ANU (Canberra) and ETL (Tsukuba). His current interests are in animal sensors, eye control, head control and flight control systems, with potential biomimetic applications to air and space vehicles. He has received several national and international prizes and is a member of the Academia Europaea.