Analysis of human expressions could teach androids to smile
New research led by Osaka University, Japan, has examined the mechanisms of human facial expressions to understand how robots can more effectively convey and recognise our emotions.

A robot’s ability to understand and display human emotion has long been a trope of science fiction stories, but new multi-institutional research has begun mapping the intricacies of human facial movements to bring this idea closer to reality.
Researchers used 125 tracking markers attached to a person’s face to closely examine 44 different singular facial actions, from blinking to raising the corner of the mouth.
Information gathered by this study could help researchers develop and improve artificial faces, both digitally on screens and, ultimately, physically on android robots. The study aimed to understand the tensions and compressions in human facial structure, with the hope of making artificial expressions appear more accurate and natural.
The researchers said the work could also have applications beyond robotics, such as improved facial recognition or medical diagnosis, the latter of which currently relies on a doctor’s intuition to notice abnormalities in facial movement.