The group, which includes engineers and neuroscientists from the University of Adelaide and Lund University, used recordings from the small target motion detector (STMD) neurons in the brain of a dragonfly to develop a closed-loop target detection and tracking algorithm. To test its performance in real-world conditions, they then implemented the model on a robotic platform that uses active pursuit strategies based on insect behaviour.
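The paper itself gives the full model, but the closed-loop idea can be illustrated with a toy sketch: a crude stand-in for an STMD-style detector picks out the highest-contrast small feature in each frame, and a proportional controller shifts the "gaze" a fraction of the way toward each detection while the target drifts across the scene. Everything here (the grid world, the gain, the drift) is an illustrative assumption, not the researchers' algorithm.

```python
import numpy as np

def detect_target(frame, background=0.5):
    """Toy stand-in for an STMD-style detector: return the pixel whose
    contrast against the uniform background is largest."""
    contrast = np.abs(frame - background)
    row, col = np.unravel_index(np.argmax(contrast), frame.shape)
    return row, col

def make_frame(target_rc, shape=(32, 32), background=0.5):
    """Render a uniform background with a single dark 'insect' pixel."""
    frame = np.full(shape, background)
    frame[target_rc] = 0.0
    return frame

def pursuit_loop(start_rc=(4, 28), steps=12, gain=0.5, shape=(32, 32)):
    """Closed-loop pursuit: each step, detect the target and move the
    gaze a fraction (gain) of the way toward it, proportional-control
    style, while the target drifts diagonally across the scene."""
    gaze = np.array([shape[0] / 2, shape[1] / 2])   # gaze starts at centre
    target = np.array(start_rc, dtype=float)
    errors = []
    for _ in range(steps):
        frame = make_frame(tuple(target.astype(int)), shape)
        detected = np.array(detect_target(frame), dtype=float)
        error = detected - gaze
        gaze += gain * error                # steer toward the detection
        target += np.array([0.3, -0.4])    # target keeps drifting
        errors.append(float(np.linalg.norm(error)))
    return errors

errors = pursuit_loop()
print(f"initial error {errors[0]:.1f}, final error {errors[-1]:.1f}")
```

Because the gaze closes a fixed fraction of the gap each step while the target moves a small fixed amount, the tracking error shrinks rapidly and then hovers near a small steady-state offset, the basic behaviour a closed-loop pursuit controller is meant to produce.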
The pioneering research, which is published in the Journal of Neural Engineering, is one of the latest efforts to tap into and mimic the incredible abilities of insects. The research also represents the first time that a target-tracking model inspired by insect neurophysiology has been implemented and tested on an autonomous robot.
“Insects are capable of remarkably complex behaviour, yet have a miniature brain consuming tiny amounts of power compared with even the most efficient digital processors,” said Dr Steven Wiederman, who is leading the project in the Visual Physiology & Neurobotics lab at the University of Adelaide.
“Detecting and tracking a moving object against a cluttered background is among the most challenging tasks for both natural and artificial vision systems. We are looking at the actual algorithm the insect brain uses for target tracking as inspiration for robots,” added Lund University biologist Professor David O’Carroll.
According to the team, the robot performed well in its pursuit of targets, despite challenging experimental conditions including low-contrast targets, heavily cluttered environments and the presence of distractors.