Humanoid Robots Display Realistic Emotions, Thanks to New Research

Researchers have developed a method to generate and map realistic facial expressions for humanoid robots

Ben Wodecki, Junior Editor - AI Business

August 5, 2024


Chinese scientists have developed a way to make humanoid robots express emotions more naturally and accurately.

Researchers from Hohai University and Changzhou University created a two-stage method that helps robots produce more natural and complex facial expressions, aiming to improve how humans interact with and relate to humanoid robots.

The new AI system can generate detailed examples of facial expressions. A specially designed robot with multiple degrees of freedom for facial movements then learns to perform those expressions.

The method enabled humanoid robots to successfully perform specific facial expressions when instructed.

The work was presented at the annual meeting of the China Association for Science and Technology, and the research was published in the journal IEEE Transactions on Robotics.

Humanoid robots are currently limited in how they can display emotions: they often have only a few motors in their faces, compared with the numerous muscles in a human face, which prevents them from producing authentic expressions.

The Chinese researchers, along with scientists from the University of Manchester and the University of Leicester in the U.K., sought to overcome these limitations using Action Units (AUs).

AUs, defined under the Facial Action Coding System (FACS), describe the individual muscle movements that combine to produce a facial expression.
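
For illustration, an expression can be written as a set of AU intensities. The AU numbers below follow standard FACS conventions; the specific values and the Python representation are placeholders rather than anything described in the research:

```python
# Illustrative FACS encodings: each expression is a set of action units
# with intensities in the range 0 (absent) to 1 (maximum).
SMILE = {
    "AU6": 0.8,   # cheek raiser
    "AU12": 1.0,  # lip corner puller
}

SURPRISE = {
    "AU1": 0.9,   # inner brow raiser
    "AU2": 0.9,   # outer brow raiser
    "AU26": 0.7,  # jaw drop
}
```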

The researchers developed an AI system capable of generating detailed robot facial expression images guided by AUs. The images were then translated to motor commands for the robot's face.

The system uses the physical limitations of the robot's motors as constraints to refine the expressions and make them more realistic, mapping the robot's nine motors to 17 AUs to enable richer expressions and smoother transitions through coordinated movements. The process ensures that the robot can physically reproduce the expressions from the images generated in the first stage.
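
The article gives only this high-level description, but a minimal sketch of the second-stage idea — translating AU intensities into motor commands while respecting each motor's physical range — might look like the following. The mapping matrix, motor limits, and all names are illustrative assumptions, not details from the paper:

```python
import numpy as np

# Illustrative only: the paper's actual AU-to-motor mapping is not public.
# Assume 17 AU intensities in [0, 1] driving 9 face motors, each with a
# physical range of motion (the "constraints" the article mentions).
NUM_AUS = 17
NUM_MOTORS = 9

# Hypothetical linear mapping from AU activations to motor displacements.
# In practice this would be learned from the generated expression images.
AU_TO_MOTOR = np.random.default_rng(0).uniform(-1.0, 1.0, (NUM_MOTORS, NUM_AUS))

# Per-motor limits (e.g., servo angles in degrees) acting as hard constraints.
MOTOR_MIN = np.full(NUM_MOTORS, -30.0)
MOTOR_MAX = np.full(NUM_MOTORS, 30.0)

def aus_to_motor_commands(au_intensities: np.ndarray) -> np.ndarray:
    """Translate AU intensities into motor commands, clipped to physical limits."""
    raw = AU_TO_MOTOR @ au_intensities          # unconstrained target positions
    return np.clip(raw, MOTOR_MIN, MOTOR_MAX)   # enforce each motor's range of motion

# Example: a smile-like expression driven mainly by AU6 and AU12.
aus = np.zeros(NUM_AUS)
aus[5], aus[11] = 0.8, 1.0   # placeholder indices standing in for AU6 / AU12
print(aus_to_motor_commands(aus))
```

Clipping to each motor's range of motion stands in here for the physical constraints the article describes; the researchers' actual refinement process is not detailed in the story.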

The new two-stage process better prepares humanoid robots for a wide range of potential applications where displaying emotions is vital, such as in nursing homes, kindergartens and special education schools.

“The humanoid robots will not only assist or replace humans in completing some tasks but also bring more emotional value,” Liu Xiaofeng, a professor at Hohai University and lead author of the research, told Xinhua.

About the Author

Ben Wodecki

Junior Editor - AI Business

Ben Wodecki is the junior editor of AI Business, covering a wide range of AI content. Ben joined the team in March 2021 as assistant editor and was promoted to junior editor. He has written for The New Statesman, Intellectual Property Magazine, and The Telegraph India, among others.
