A feel for the future

Academic collaboration hopes three-year project will lead to development of more human-like robots

Research at the Bristol Robotics Laboratory (BRL) could lead to robots with more human-like motion that are safe for people to interact with closely.



The BRL team plans to 'soften' the motion of robots used in everyday tasks that bring them into contact with humans, making them more like us and less like their industrial cousins on the production line.



According to BRL director Prof Chris Melhuish, the laboratory is working on a number of key themes. 'There's a certain paradox if we manufacture robots for service robotics — if a robot is powerful enough to be useful, it is also powerful enough to be dangerous. There are physical and behavioural safety factors, and underlying it all is control,' said Melhuish.



The BRL is a research partnership between the University of the West of England and the University of Bristol, supported by the Higher Education Funding Council for England (HEFCE). The robot mechanism used in the research was designed and built by Bath-based Elumotion.





Haptic sensors



Dr Guido Herrmann is leading research into humanoid control approaches for robots, using the BRL's hand-arm assembly: an upper torso with two fully articulated arms that can move at every joint. Feedback will come from haptic sensors.



'There are two levels to the control project,' said Herrmann. 'In the first, the robot operates like a machine in a production line. You can tell a robot to move a cup from a height of 0 m to 0.4 m, for instance. The second, operating on top of the machine motion, is a controller to create movement that is very human-like.



'The best way for a human to move is to minimise the muscle effort, and that's what we want to implement in robots. So we must measure that in humans and model it for our robots. At Bristol, there is a lecturer who uses sensors to measure movement in humans and animals. We want to take ideas from his data and convert them into our movement algorithm.'
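
The article does not name the model BRL will use, but the classic formalisation of 'minimum-effort' reaching is the minimum-jerk profile of Flash and Hogan: a fifth-order polynomial with zero velocity and acceleration at both ends of the movement. A minimal Python sketch, reusing the cup example from the quote above (the names and figures are illustrative, not BRL code):

    import numpy as np

    def minimum_jerk(x0, xf, duration, steps=100):
        """Minimum-jerk position profile from x0 to xf.

        A widely used model of smooth, human-like reaching: the path
        minimises the integral of squared jerk, which yields a
        fifth-order polynomial with zero velocity and acceleration
        at both endpoints.
        """
        t = np.linspace(0.0, duration, steps)
        s = t / duration  # normalised time in [0, 1]
        shape = 10 * s**3 - 15 * s**4 + 6 * s**5
        return x0 + (xf - x0) * shape

    # The cup example from the quote: lift from 0 m to 0.4 m in 1.5 s.
    trajectory = minimum_jerk(0.0, 0.4, duration=1.5)

A profile like this would sit naturally in Herrmann's second layer, re-timing the start and goal positions that the machine-level controller already understands.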



Although the BRL is interested in all aspects of robotics, as far as human-robot interaction (HRI) is concerned, implementing the upper torso and head is the most important goal. 'If you read a robot a sad story, it should identify a representation of you being sad and react accordingly,' said Melhuish.



'If two humans are carrying out a task together, for example making a cup of coffee where one takes the cup and the other pours, it's simple enough to perceive, but the actions and interactions are difficult to implement in a robot — there's a huge amount of effort.'



One application of the humanoid HRI is in service robotics, such as care and companionship, medical therapy and rehabilitation. The torso could even be programmed to signal to deaf people.



According to Herrmann, the distributed controller system presents the researchers with a number of challenges. 'The Controller Area Network (CAN) means there is a limited amount of data for every actuator and sensor. We want to control all the data, but it's only available in small packets.'
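
To make the constraint concrete: a classic CAN frame carries at most eight data bytes, so one joint's state must be squeezed into compact fixed-point fields before it goes on the bus. The field layout below is purely illustrative, not BRL's actual format:

    import struct

    # Hypothetical layout: 16-bit position, velocity and torque,
    # scaled into signed integers so all three fit in 6 of the
    # 8 data bytes a classic CAN frame allows.
    POSITION_SCALE = 10000.0  # rad    -> 0.1 mrad ticks
    VELOCITY_SCALE = 1000.0   # rad/s  -> 1 mrad/s ticks
    TORQUE_SCALE = 100.0      # N*m    -> 0.01 N*m ticks

    def pack_joint_state(position, velocity, torque):
        """Pack one joint's feedback into a single CAN payload."""
        return struct.pack("<hhh",
                           int(position * POSITION_SCALE),
                           int(velocity * VELOCITY_SCALE),
                           int(torque * TORQUE_SCALE))

    def unpack_joint_state(payload):
        """Recover the scaled values on the receiving side."""
        p, v, t = struct.unpack("<hhh", payload)
        return (p / POSITION_SCALE, v / VELOCITY_SCALE, t / TORQUE_SCALE)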





Distributed intelligence



Starting with one arm, the researchers aim to model the robot's dynamics and then design a controller that performs the machine-level motion quickly. They will then incorporate humanoid behaviour before adding two interacting CAN buses for both arms, and finally look at incorporating more distributed sensors.



Herrmann's approach to the controller system is key to moving the robotic plant in a way that is familiar to people. Because the control loops are limited by how much information can be carried down the wires, decision-making has to happen locally. His method uses distributed intelligence and levels of abstraction to overcome these practical limitations.
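
The article does not detail BRL's design, but the standard shape of such a system is a slow central planner that sends occasional setpoints over the bus while each joint runs its own fast feedback loop beside the actuator. A sketch under that assumption (gains and rates are invented):

    class LocalJointController:
        """Fast local loop at the joint: tracks the last setpoint
        received from the bus without further central involvement."""

        def __init__(self, gain=5.0):
            self.gain = gain
            self.setpoint = 0.0
            self.position = 0.0

        def receive_setpoint(self, setpoint):
            # Called only when a small, infrequent bus packet arrives.
            self.setpoint = setpoint

        def step(self, dt):
            # Local decision-making: simple proportional tracking,
            # no bus traffic needed between setpoint updates.
            self.position += self.gain * (self.setpoint - self.position) * dt
            return self.position

    # The planner might update setpoints at 10 Hz while each joint
    # loop runs at 1 kHz - a hundredfold reduction in bus traffic.
    joint = LocalJointController()
    joint.receive_setpoint(0.4)    # one small packet over the bus
    for _ in range(1000):          # many local control steps
        joint.step(dt=0.001)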



The aim of the three-year study is to implement controllers for both arms, with some sensing in the hands, while satisfying safety requirements.



'Rather than sit 500 m back from the robot and see it act like a human, we want to interact with it,' said Melhuish. 'Our robots need to be set in context; we don't want display devices. If one person were passing something to another, you'd both move; there is compliance. If you show a child how to cut bread, you take their hand, teaching them to hold and move the knife safely. We want to show robots how to work.'