Forcing a smile
A realistic robot developed at the University of California, San Diego, has learned to smile and make facial expressions through a process of self-guided learning.
The UC San Diego researchers used machine learning to enable their robot, which resembles Albert Einstein, to learn to make realistic facial expressions.
The faces of robots are becoming increasingly realistic, and the number of artificial muscles that control them is rising. In light of this trend, UC San Diego researchers from the Machine Perception Laboratory are studying the face and head of their robotic Einstein to find ways to automate the process of teaching robots to make lifelike facial expressions.
The Einstein robot head has about 30 facial muscles, each moved by a tiny servo motor connected to the muscle by a string. Today, a highly trained person must manually set up these kinds of realistic robots so that the servos pull in the right combinations to produce specific facial expressions. To begin to automate this process, the UCSD researchers looked to both developmental psychology and machine learning.
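The article does not describe the researchers' algorithm in detail, but the general idea of learning which servo combinations produce which expressions can be sketched in code. The following Python sketch is purely illustrative and is not the UCSD implementation: the servo interface (`set_servos`), the camera-based expression scorer (`measure_expression`), and the six-dimensional expression score are all assumptions made for the example.

```python
import numpy as np
from sklearn.linear_model import Ridge

N_SERVOS = 30          # the article mentions roughly 30 facial muscles/servos
N_EXPRESSION_DIMS = 6  # assumed scores, e.g. joy, anger, surprise, etc.


def set_servos(activations):
    """Hypothetical: command each servo to a position in [0, 1]."""
    raise NotImplementedError


def measure_expression():
    """Hypothetical: return expression scores from a camera plus recogniser."""
    raise NotImplementedError


def babble(n_trials=500):
    """Random exploration: try random servo settings and record what the
    face ends up expressing, building a dataset of (command, outcome) pairs."""
    commands, outcomes = [], []
    for _ in range(n_trials):
        cmd = np.random.rand(N_SERVOS)
        set_servos(cmd)
        commands.append(cmd)
        outcomes.append(measure_expression())
    return np.array(commands), np.array(outcomes)


def fit_forward_model(commands, outcomes):
    """Fit a simple servo-command -> expression-score mapping."""
    model = Ridge(alpha=1.0)
    model.fit(commands, outcomes)
    return model


def servos_for_target(model, target, n_candidates=10_000):
    """Crude search: pick the random candidate whose predicted expression
    is closest to the desired target (a real system would optimise)."""
    candidates = np.random.rand(n_candidates, N_SERVOS)
    errors = np.linalg.norm(model.predict(candidates) - target, axis=1)
    return candidates[np.argmin(errors)]
```

The point of the sketch is the workflow rather than the model choice: explore servo settings, observe the resulting expression, fit a mapping, then invert that mapping to find the servo combination for a desired expression, instead of hand-tuning each combination.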