Robot with artificial bee brain navigates via external stimuli

Feb. 18, 2014

A team of researchers from Freie Universität Berlin, Bernstein Fokus Neuronal Basis of Learning, and the Bernstein Center Berlin has developed a small vision-enabled robot that perceives environmental stimuli and "learns" to react to them. The robot models its working principles on the nervous system of a honeybee, linking certain external stimuli to behavioral rules. In order to "see," the robot is equipped with a CMOS camera and an ATMEGA8 microcontroller that performs image processing. The vision system is connected to a computer, and a program running on the computer replicates the sensorimotor network of the insect brain.
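As a rough illustration of the "stimulus linked to a behavioral rule" idea described above, the following minimal Python sketch maps a two-channel visual stimulus to two wheel speeds through a fixed weight matrix. All names, values, and the structure are assumptions chosen for illustration only; they are not the researchers' actual sensorimotor model or code.

    # Hypothetical sketch: a toy stimulus-to-behavior mapping loosely inspired by
    # the description above. Weights and channel meanings are illustrative guesses.
    import numpy as np

    # Rows = stimulus channels (left/right image brightness),
    # columns = motor outputs (left/right wheel speed). Values are placeholders.
    WEIGHTS = np.array([
        [0.2, 1.0],   # a bright stimulus on the left mostly drives the right wheel
        [1.0, 0.2],   # a bright stimulus on the right mostly drives the left wheel
    ])

    def motor_command(stimulus: np.ndarray) -> np.ndarray:
        """Map a 2-element stimulus vector to 2 wheel speeds via a fixed linear rule."""
        return stimulus @ WEIGHTS

    if __name__ == "__main__":
        # A bright object on the robot's left: the right wheel spins faster,
        # so the robot turns toward the stimulus.
        print(motor_command(np.array([0.9, 0.1])))   # -> [0.28, 0.92]

In the actual system, such a mapping would presumably be learned and updated by the simulated insect-brain network on the computer rather than hard-coded as here.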

About the Author

James Carroll

Former VSD Editor James Carroll joined the team in 2013. Carroll covered machine vision and imaging from numerous angles, including application stories, industry news, market updates, and new products. In addition to writing and editing articles, Carroll managed the Innovators Awards program and webcasts.
