Researchers attempt to emulate human vision in robots
To interact efficiently with unknown environments, robots need to build representations of those environments. In primates, eye movements with respect to the body are used to extract environmental information. Because such controlled movements and focused attention enable efficient information processing and prediction, researchers are trying to replicate these functions in robots by mimicking the fast movements of the human eye.
To do so, researchers at the Microprocessor Systems Laboratory of the Ecole Polytechnique Fédérale de Lausanne (EPFL; Lausanne, Switzerland) have developed a vision-based robot that is being used to study human vision. Known as the EPFL Vision Sphere, the system emulates the gaze-targeting functions of the human eye. Says Olivier Carmona, a researcher on the project, "The architecture of the system is similar to that of a computer mouse. In a mouse, a trackball is supported by two encoder wheels, one support wheel, and a mouse pad."
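In such a trackball arrangement, each encoder wheel rolls against the sphere, so the sphere's rotation about one axis follows from the wheel rotation scaled by the ratio of the radii. The C sketch below illustrates that conversion; the encoder resolution and the wheel and sphere radii are illustrative assumptions, not published Vision Sphere parameters.

```c
#include <stdio.h>

#define COUNTS_PER_REV 512.0   /* assumed encoder resolution */
#define WHEEL_RADIUS   5.0     /* mm, assumed */
#define SPHERE_RADIUS  25.0    /* mm, assumed */

/* Each encoder wheel rolls against the sphere, so sphere rotation about
 * one axis is the wheel rotation scaled by the radius ratio. */
static double counts_to_sphere_deg(long counts)
{
    double wheel_deg = 360.0 * (double)counts / COUNTS_PER_REV;
    return wheel_deg * (WHEEL_RADIUS / SPHERE_RADIUS);
}

int main(void)
{
    long cx = 1024, cy = -256;  /* example readings, one per wheel axis */
    printf("sphere rotation: x=%.1f deg, y=%.1f deg\n",
           counts_to_sphere_deg(cx), counts_to_sphere_deg(cy));
    return 0;
}
```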
Inside the Vision Sphere, a V-1210 NTSC CCD camera from Marshall Electronics (Culver City, CA) is supported by two motor wheels and two ball bearings. Using a lens with a 110° horizontal × 83° vertical field of view, the camera captures images and delivers them to an Ultra 1 Creator workstation from Sun Microsystems (Palo Alto, CA) equipped with a SunVideo board. Camera positions are controlled via a serial link.
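The serial control path can be pictured along the following lines. This C sketch sends a pan/tilt target over a POSIX serial port; the device path, baud rate, and "AZ/EL" command syntax are invented for illustration and are not the Vision Sphere's documented protocol.

```c
#include <fcntl.h>
#include <stdio.h>
#include <termios.h>
#include <unistd.h>

static int send_gaze_target(const char *dev, double az_deg, double el_deg)
{
    int fd = open(dev, O_WRONLY | O_NOCTTY);
    if (fd < 0) { perror("open"); return -1; }

    struct termios tio;
    if (tcgetattr(fd, &tio) == 0) {
        cfsetospeed(&tio, B9600);        /* assumed baud rate */
        tio.c_cflag &= ~CSIZE;
        tio.c_cflag |= CS8 | CLOCAL;     /* 8 data bits, no modem control */
        tcsetattr(fd, TCSANOW, &tio);
    }

    char cmd[64];
    int n = snprintf(cmd, sizeof cmd, "AZ %.1f EL %.1f\r", az_deg, el_deg);
    ssize_t w = write(fd, cmd, (size_t)n); /* hypothetical command syntax */
    close(fd);
    return (w == (ssize_t)n) ? 0 : -1;
}

int main(void)
{
    /* Example: aim the camera 15 deg right, 5 deg down. */
    return send_gaze_target("/dev/ttyS0", 15.0, -5.0) == 0 ? 0 : 1;
}
```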
This imaging platform is used in stand-alone mode to develop an embedded image processor for the Koala mobile robot from K-Team (Préverenges, Switzerland). To perform image processing, EPFL researchers have developed a board that acquires images using the PX4072 integrated-circuit digitizer from Cirrus Logic (Fremont, CA). Image preprocessing is accomplished using a FLEX 8000 programmable logic device from Altera (San Jose, CA), which is controlled by a CPU board based on a 66-MHz PowerPC 403GCX microcontroller from IBM Corp. (Fishkill, NY).
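The article does not detail what the FLEX 8000 stage computes. As a rough stand-in, the C model below applies the kind of fixed-function, pixel-local preprocessing (here a 3 × 3 smoothing pass) that is typically offloaded to programmable logic before the CPU sees a frame; the image dimensions and choice of kernel are assumptions.

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define W 320
#define H 240

/* 3x3 box filter; border pixels are copied unchanged to keep the sketch short. */
static void smooth3x3(uint8_t in[H][W], uint8_t out[H][W])
{
    memcpy(out, in, (size_t)W * H);
    for (int y = 1; y < H - 1; ++y)
        for (int x = 1; x < W - 1; ++x) {
            int sum = 0;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx)
                    sum += in[y + dy][x + dx];
            out[y][x] = (uint8_t)(sum / 9);
        }
}

int main(void)
{
    static uint8_t frame[H][W], smoothed[H][W];
    frame[120][160] = 255;                /* synthetic impulse in a dark frame */
    smooth3x3(frame, smoothed);
    printf("center after smoothing: %d\n", smoothed[120][160]); /* 255/9 = 28 */
    return 0;
}
```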
To gain human-like capability, a binocular head has been developed using two Vision Spheres embedded in the Koala robot. To control camera elevation and azimuth, the embedded processor obtains azimuth data from the corresponding motor shaft encoder. Elevation is calculated from the movements of two motor shaft encoders. Neck motion of the binocular head is based on a serial pan-tilt mechanism that allows ±180° rotation.
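A minimal sketch of that bookkeeping, assuming a fixed degrees-per-count scale and taking elevation as the mean of the two elevation encoders (on the assumption that both drive the same axis), might look as follows; it is not the actual EPFL firmware.

```c
#include <stdio.h>

#define DEG_PER_COUNT 0.05   /* assumed encoder scale factor */

typedef struct {
    long enc_az;             /* azimuth motor shaft encoder */
    long enc_a, enc_b;       /* the two encoders used for elevation */
} encoders_t;

/* Azimuth is read directly from the corresponding motor shaft encoder. */
static double azimuth_deg(const encoders_t *e)
{
    return e->enc_az * DEG_PER_COUNT;
}

/* Elevation derived from the combined motion of two encoders; averaging
 * assumes both motors drive the same axis. */
static double elevation_deg(const encoders_t *e)
{
    return 0.5 * (double)(e->enc_a + e->enc_b) * DEG_PER_COUNT;
}

int main(void)
{
    encoders_t e = { 600, 200, 220 };    /* example encoder counts */
    printf("az=%.1f deg, el=%.1f deg\n", azimuth_deg(&e), elevation_deg(&e));
    return 0;
}
```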
Carmona and his colleagues are using the Koala robot and Vision Sphere to emulate human gaze behavior. The system searches for features within images at increasingly finer scales. The team intends to work cooperatively with biologists to test theoretical models of human gaze against robots that mimic the gaze-targeting process of the human eye.
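That coarse-to-fine search can be illustrated with a two-level image pyramid: locate a candidate feature at a downsampled scale, then refine only that neighborhood at full resolution. The C sketch below uses a synthetic bright spot as the "feature"; the image size and downsampling factor are assumptions.

```c
#include <stdint.h>
#include <stdio.h>

#define W 256
#define H 256
#define SCALE 4   /* assumed downsampling factor between pyramid levels */

/* Find the brightest pixel inside the window [x0,x1) x [y0,y1). */
static void find_max(const uint8_t *img, int w,
                     int x0, int y0, int x1, int y1, int *bx, int *by)
{
    int best = -1;
    for (int y = y0; y < y1; ++y)
        for (int x = x0; x < x1; ++x)
            if (img[y * w + x] > best) { best = img[y * w + x]; *bx = x; *by = y; }
}

int main(void)
{
    static uint8_t full[H * W], coarse[(H / SCALE) * (W / SCALE)];
    full[100 * W + 180] = 255;           /* synthetic bright feature */

    /* Coarse level: average SCALE x SCALE blocks of the full image. */
    for (int y = 0; y < H / SCALE; ++y)
        for (int x = 0; x < W / SCALE; ++x) {
            int sum = 0;
            for (int dy = 0; dy < SCALE; ++dy)
                for (int dx = 0; dx < SCALE; ++dx)
                    sum += full[(y * SCALE + dy) * W + x * SCALE + dx];
            coarse[y * (W / SCALE) + x] = (uint8_t)(sum / (SCALE * SCALE));
        }

    int cx, cy;
    find_max(coarse, W / SCALE, 0, 0, W / SCALE, H / SCALE, &cx, &cy);

    /* Fine level: re-search only the neighborhood picked at the coarse scale. */
    int fx, fy;
    find_max(full, W, cx * SCALE, cy * SCALE,
             (cx + 1) * SCALE, (cy + 1) * SCALE, &fx, &fy);
    printf("feature at (%d, %d)\n", fx, fy);
    return 0;
}
```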