Sensing visual motion gives a creature valuable information about its interactions with the environment. Flies in particular use visual motion information to navigate through turbulent air, avoid obstacles, and land safely.
Mobile robots are ideal candidates for using this sensory modality to enhance their performance, but so far have been limited by the computational expense of processing video. Also, the complex structure of natural visual scenes poses an algorithmic challenge for extracting useful information in a robust manner.
We address both issues by creating a small, low-power visual sensor with integrated analog parallel processing that extracts motion in real time. Because our architecture is based on biological motion detectors, we gain the advantages of this highly evolved system: a design that robustly and continuously extracts relevant information from its visual environment. We show that this sensor is suitable for use in the real world and demonstrate its ability to compensate for an imperfect motor system in the control of an autonomous robot. The sensor attenuates open-loop rotation by a factor of 31 while dissipating less than 1 mW.
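The fly's elementary motion detector is commonly modeled as a Hassenstein-Reichardt correlator, and the sensor's architecture is based on detectors of this class. The sketch below is a discrete-time software model for illustration only; the filter time constant, sample rate, and stimulus parameters are assumptions, and the actual chip performs this computation with continuous-time analog circuitry rather than software.

```python
# Minimal, discrete-time sketch of a Hassenstein-Reichardt correlator,
# the class of biological motion detector found in the fly visual system.
# Illustrative software model only, not the paper's analog VLSI circuit;
# tau, dt, and the stimulus parameters below are assumed values.

import numpy as np

def reichardt_emd(left, right, tau=0.05, dt=0.001):
    """Correlate each photoreceptor signal with a low-pass-delayed copy of
    its neighbour; subtracting the two mirror-symmetric arms yields a
    direction-selective output. `left` and `right` are 1-D arrays of
    photoreceptor intensities sampled over time."""
    alpha = dt / (tau + dt)          # first-order low-pass (delay) filter coefficient
    dl = np.zeros_like(left)         # delayed (filtered) left channel
    dr = np.zeros_like(right)        # delayed (filtered) right channel
    out = np.zeros_like(left)
    for t in range(1, len(left)):
        dl[t] = dl[t - 1] + alpha * (left[t] - dl[t - 1])
        dr[t] = dr[t - 1] + alpha * (right[t] - dr[t - 1])
        # Opponent subtraction of the two correlator arms.
        out[t] = dl[t] * right[t] - dr[t] * left[t]
    return out

# Example: a drifting sinusoidal grating seen by two neighbouring photoreceptors.
t = np.arange(0.0, 1.0, 0.001)
phase_shift = 0.5                    # spatial phase offset between receptors (assumed)
stimulus_l = 1.0 + np.sin(2 * np.pi * 4 * t)
stimulus_r = 1.0 + np.sin(2 * np.pi * 4 * t - phase_shift)
response = reichardt_emd(stimulus_l, stimulus_r)
print("mean EMD output (sign indicates direction of motion):", response.mean())
```

In a closed-loop setting like the robot experiment described above, the sign and magnitude of such a motion signal can serve as negative feedback on the rotation command, which is how a visual sensor can compensate for an imperfect motor system.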
Source: The Pennsylvania State University
Authors: Reid R. Harrison, Christof Koch