Team aims for better vision control on robotic ‘walkers’

New research could speed up the development of robotic vehicles that walk on legs by enabling them to see where they’re going.

Scientists at the University of Bristol have received more than £500,000 from the EPSRC to create a video system that captures images of a vehicle's environment and builds a 3D virtual map, allowing the robot to navigate terrain more effectively.
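The article does not say how the 3D virtual map would be represented. As a purely illustrative sketch, one simple terrain representation used in legged-robot navigation is an elevation grid, where 3D points recovered from vision are binned into 2D cells and each cell stores the highest surface point seen. The function name, cell size and sample points below are all hypothetical:

```python
def build_elevation_map(points, cell_size=0.5, grid_shape=(4, 4)):
    """Bin (x, y, z) points into a 2D grid of terrain heights.

    Each cell keeps the maximum z seen in it; cells with no
    observations stay None. This is one common, simple way to
    turn vision-derived 3D points into a navigable terrain map.
    """
    rows, cols = grid_shape
    grid = [[None] * cols for _ in range(rows)]
    for x, y, z in points:
        i, j = int(x // cell_size), int(y // cell_size)
        if 0 <= i < rows and 0 <= j < cols:
            # Keep the tallest obstacle/surface height per cell
            if grid[i][j] is None or z > grid[i][j]:
                grid[i][j] = z
    return grid

# Hypothetical points (metres) as might be recovered from stereo vision
points = [(0.2, 0.3, 0.05), (0.1, 0.4, 0.10), (1.2, 0.2, 0.30)]
terrain = build_elevation_map(points)
```

A path planner could then treat large height differences between adjacent cells as obstacles or steps for the legs to negotiate.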

To develop this vision-control architecture, the researchers will first model how humans walk and respond to changing terrain by collecting data with a head-mounted camera and feeding it back to subjects using a virtual-reality headset.

‘The phrase “vision-control architecture” is used as shorthand to refer to the design of a control system that uses electromagnetic radiation in the visible spectrum as sensory input,’ project leader Dr Jeremy Burn, from Bristol’s anatomy department, told The Engineer via email.

The project stems from the idea that leg-based vehicles can cope with changing natural terrain more easily than wheeled ones, but that current technology isn't sufficient to allow such vehicles to navigate complex terrain using vision.

‘As we move towards a future of autonomous systems operating beyond the extent of the road network and on other planets, it is likely that development of robust artificial leg-based locomotion will become increasingly important,’ according to the project’s EPSRC grant summary.

The researchers also claim the technology to be developed under the project could improve navigation systems in wheeled vehicles and be used in any autonomous visual-guidance system.

‘The project is relevant to any land vehicle technology given that land vehicles generate external force through contact with the ground and that requires knowledge of mechanical properties of the terrain,’ said Burn.

US robotics firm Boston Dynamics claims that its walking robot, BigDog, can climb slopes up to 35°, walk across rubble and climb a muddy hiking trail, using a variety of position and pressure sensors, as well as a gyroscope, LIDAR and a stereo-vision system to navigate.