UK researchers are helping create robots that recognise emotions, in order to encourage people to accept them as part of everyday life.
Scientists from Queen Mary, Hertfordshire and Heriot-Watt universities are part of the EU-funded LIREC project to develop software that enables robots to respond to human emotions and change their behaviour accordingly.
They hope that by understanding how humans recognise emotions, they will be able to create mathematical models that robots can use to read social situations. This could then allow robots to be used more commonly in homes, schools and offices.
‘It’s a question of finding out how biology does it and trying to build systems that do the same,’ Queen Mary’s Prof Peter McOwan told The Engineer at this week’s Royal Society Summer Science Exhibition in London.
‘We want to have a better understanding of how human brains process information about faces and build those ideas into the next generation of socially aware robotic companions.’
Part of this process involves developing software that can recognise movement in people’s faces in order to understand their emotions.
The work also involves running psychological experiments on people, asking them to make judgements about pictures of faces and using the responses to understand how the brain processes facial information.
For example, the team found that humans find it hard to see unusual features or changes in a picture of a face if it is upside down.
Using a computer-generated character that mimicked people's real facial movements, they also found that viewers could recognise their friends from the way their faces moved while talking.
The first use of the technology is likely to be in education, and a team from the INESC-ID institute in Portugal is creating a robotic cat to teach children how to play chess.
‘You find people are far more likely to engage in a learning experience if they feel they are emotionally involved in it, and having those physical embodiments in the robots able to express emotion back to them helps,’ said McOwan.
The project is also studying how robots can be integrated into offices and homes, particularly for use by elderly or disabled people, for example, to remind them to take medicine.
A team from the Wroclaw University of Technology in Poland has developed a companion robot named Emys that displays emotions using a head consisting of three movable discs and two ball-shaped eyes.
‘For you to get on well with a robot in close proximity with you it’s got to understand something about the social interactions that we have,’ said McOwan.
‘For example, knowing how close they should come to you without making you uncomfortable or recognising that if you’re looking annoyed it should back off.
‘In an office, emotions can run high and it’s always useful to know who’s in a foul mood and who to avoid.’
The project is due to finish next year, by which time the team hopes to have tested robots in all three environments and developed a set of ethical guidelines setting out how the technology can be used appropriately.