This robot was taught to react to human emotions: now it smiles back

Researchers at Columbia Engineering have used AI to train a robot to respond appropriately to human facial expressions, an ability that can build trust between humans and their robotic counterparts, according to the project website.

While facial expressions play a huge role in building trust, most robots still wear a blank, static face. As robots are increasingly deployed in settings where they must work closely with humans, from nursing homes to warehouses and factories, the need for a more responsive, lifelike robotic face is becoming more pressing.

Researchers at the Creative Machines Lab at Columbia Engineering have worked for five years to create EVA, a new autonomous robot with a soft and expressive face that responds to match the expressions of nearby people.

“The idea for EVA took shape a few years ago when my students and I began to notice that robots in our lab were looking at us through plastic eyes,” recalls Hod Lipson, professor of innovation.

Lipson noticed a similar trend at the grocery store, where he came across restocking robots wearing name badges and, in one case, a cozy hand-knitted cap. “People seemed to humanize their robot colleagues by giving them eyes, a personality or a name,” says the scientist. “It got us thinking: if eyes and clothes work, why not create a robot with a super-expressive and responsive human face?”

The first phase of the project began in Lipson’s lab a few years ago, when undergraduate student Zanwar Faraj led a team to build the robot’s physical mechanism. They designed EVA as a disembodied bust, much like the silent but animated performers of the Blue Man Group. EVA can express the six basic emotions of anger, disgust, fear, joy, sadness and surprise, as well as a range of more nuanced emotions, using artificial “muscles” that pull on specific points of the face, mimicking the movements of the more than 42 tiny muscles attached at various points to the skin and bones of the human face.
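
As a rough illustration of how expressions like these can be parameterized, the sketch below blends weighted basic emotions into a single set of actuator targets, one simple route to the subtler, intermediate expressions described above. The actuator names and activation values are purely hypothetical and do not describe EVA’s actual hardware.

```python
# Hypothetical mapping from basic emotions to facial actuator targets.
# Actuator names and activation values are illustrative only and do not
# reflect EVA's actual hardware layout.
BASIC_EMOTIONS = {
    "anger":    {"brow_inner": 0.9, "eyelid": 0.6, "jaw": 0.3, "mouth_corner": 0.1},
    "disgust":  {"brow_inner": 0.5, "nose_wrinkle": 0.8, "upper_lip": 0.7},
    "fear":     {"brow_inner": 0.8, "brow_outer": 0.9, "eyelid": 0.9, "jaw": 0.5},
    "joy":      {"mouth_corner": 0.9, "cheek": 0.8, "eyelid": 0.3},
    "sadness":  {"brow_inner": 0.7, "mouth_corner": 0.0, "eyelid": 0.5},
    "surprise": {"brow_outer": 1.0, "eyelid": 1.0, "jaw": 0.8},
}


def blend(weights: dict[str, float]) -> dict[str, float]:
    """Blend weighted basic emotions into one set of actuator targets,
    allowing subtler expressions than the six basic ones alone."""
    targets: dict[str, float] = {}
    for emotion, weight in weights.items():
        for actuator, value in BASIC_EMOTIONS[emotion].items():
            targets[actuator] = targets.get(actuator, 0.0) + weight * value
    # Clamp each actuator to its valid range.
    return {a: min(v, 1.0) for a, v in targets.items()}


# Example: a wistful half-smile, mostly joy with a touch of sadness.
print(blend({"joy": 0.6, "sadness": 0.3}))
```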

“The biggest challenge with EVA was designing a system that was compact enough to fit inside a human skull, yet functional enough to reproduce a wide range of facial expressions,” Faraj said.

Once the team was satisfied with EVA’s mechanics, they embarked on the second main phase of the project: programming the artificial intelligence that would control EVA’s facial movements. While lifelike animatronic robots have been used in theme parks and movie studios for years, Lipson’s team made two technological advances: EVA uses deep learning to “read” and then mirror the expressions on nearby human faces, and it learns to produce a wide variety of facial expressions through trial and error, by watching videos of itself.
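
A minimal sketch of how such a mimicry loop might be wired together follows. The camera interface, the landmark detector, the motor interface and the small network are all assumptions made for illustration; none of this stands in for the Creative Machines Lab’s actual code.

```python
# Hypothetical sketch of a perceive-and-mimic loop: read a camera frame,
# extract the human's facial landmarks, map them to motor commands, and
# send the commands to the robot's actuators. All interfaces here are
# assumptions, not the lab's real implementation.
import cv2
import numpy as np
import torch
import torch.nn as nn


class LandmarksToMotors(nn.Module):
    """Maps 2-D facial landmarks to normalized motor commands."""

    def __init__(self, n_landmarks: int = 68, n_motors: int = 12):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_landmarks * 2, 128),
            nn.ReLU(),
            nn.Linear(128, n_motors),
            nn.Sigmoid(),  # keep each motor command in [0, 1]
        )

    def forward(self, landmarks: torch.Tensor) -> torch.Tensor:
        # landmarks: (batch, n_landmarks, 2) -> (batch, n_motors)
        return self.net(landmarks.flatten(start_dim=1))


def mimic_loop(model: LandmarksToMotors, detect_landmarks, send_to_motors):
    """Continuously drive the robot's face toward the expression seen on camera.

    detect_landmarks(frame) -> (68, 2) array or None, and send_to_motors(cmds)
    are placeholders for a real landmark detector and motor driver.
    """
    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        landmarks = detect_landmarks(frame)
        if landmarks is not None:
            x = torch.from_numpy(np.asarray(landmarks, dtype=np.float32)).unsqueeze(0)
            with torch.no_grad():
                commands = model(x).squeeze(0).numpy()
            send_to_motors(commands)
    cap.release()
```

The trial-and-error stage the article describes would supply the training data for a network like `LandmarksToMotors`: the robot records videos of its own face while issuing motor commands, then learns which commands produce which expressions.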
