Eye, robot
Computer models of human visual responses will help robots get their priorities right.

Multi-tasking robots capable of stopping what they are doing and responding to events in the corner of their ‘eye’ are being developed in a project involving researchers at BAE and in academia.
The project, which began this week, is aimed at developing technologies to create robots with the ability to switch between tasks in the same way as animals and humans.
Potential applications include devices for going into dangerous, hostile or inaccessible environments, monitoring the environment and human performance for signs of danger, or assisting the disabled, aged or infirm. Each of these tasks calls for a robot that can adapt to and learn from unpredictable circumstances, said Dr Kevin Gurney, the project leader and a lecturer in computational neuroscience.