Judging Emotions with Age

Emotional states can be difficult to judge. We already know that different cultures read facial expressions differently, but what about different ages? The question is both a social one and a health one, as humanoid robots and AI-controlled avatars take over an increasing number of home care applications.

Scientists at Georgia Tech decided to test our ability to interpret a robot's "emotion" by reading its expression, and to see whether that ability differs with age. They found that older adults read a robot's face in some unexpectedly different ways from younger adults.

"Home-based assistive robots have the potential to help older adults age in place. They have the potential to keep older adults independent longer, reduce healthcare needs and provide everyday assistance," said Jenay Beer, graduate student in Georgia Tech's School of Psychology.


Graduate student Jenay Beer (left) speaks with an iCat robot (right)

Beer, along with Wendy Rogers and Arthur Fisk, professors of Engineering Psychology at Georgia Tech and directors of the Human Factors and Aging Laboratory, used an avatar to test their theories: a virtual version of the iCat robot, with the same limited facial expression control as the actual iCat. There was enough expression for younger adults to tell the emotions apart, but what about senior citizens?

The two test groups were older adults aged 65 to 75 and younger adults aged 18 to 27. The researchers had the virtual iCat display seven expressions at various levels of intensity: happiness, sadness, anger, fear, surprise, disgust and a neutral state. They then tested how well each participant could read the virtual iCat's emotions.

Existing research on how well adults recognize emotions on human faces has found that older adults are less accurate at recognizing anger, fear and sadness. The robot study found that older adults were, as expected, less accurate at recognizing anger and fear, yet they had difficulty with happiness rather than sadness. In fact, they most often confused the robot's happy expression with its neutral one.
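The kind of analysis behind a result like this is straightforward to sketch. As a purely hypothetical illustration (neither the code nor the data format comes from the Georgia Tech study), recognition trials can be tallied into a confusion matrix per age group, so that systematic mix-ups such as happiness being read as neutral stand out:

    from collections import Counter

    EXPRESSIONS = ["happiness", "sadness", "anger", "fear",
                   "surprise", "disgust", "neutral"]

    def confusion_matrix(trials):
        """Tally (shown, answered) pairs from recognition trials.

        `trials` is a list of (expression_shown, expression_answered)
        tuples -- an assumed data format, not the study's own.
        """
        counts = Counter(trials)
        return {shown: {answered: counts[(shown, answered)]
                        for answered in EXPRESSIONS}
                for shown in EXPRESSIONS}

    def accuracy_per_expression(matrix):
        """Fraction of trials where the answer matched what was shown."""
        return {shown: row[shown] / total if (total := sum(row.values())) else 0.0
                for shown, row in matrix.items()}

    # Invented example: a group that often reads "happiness" as "neutral".
    older_trials = ([("happiness", "neutral")] * 6
                    + [("happiness", "happiness")] * 4
                    + [("sadness", "sadness")] * 9
                    + [("sadness", "neutral")])
    matrix = confusion_matrix(older_trials)
    print(accuracy_per_expression(matrix)["happiness"])  # 0.4 -- frequent confusion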

Further work with more humanoid avatars is still needed, but it already appears that there is a definite generational gap in discerning emotional states on artificial faces.

Beer is continuing her work by studying whether other virtual robots show the same differences when compared with the virtual iCat and the human face. What seems clear already is that if robots are to be accepted by older adults in social situations, they need to be designed with emotion displays that are easy to recognize; some displays may need to be exaggerated to compensate for the trouble older adults have reading those emotions even on human faces.
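One way such exaggeration might work in practice, sketched here under assumed parameters (an avatar whose expressions are driven by normalized intensity values, which is not a documented iCat interface, and gain values invented for illustration), is to boost the intensity of expressions known to be hard to read before rendering them:

    # Hypothetical sketch: boost hard-to-read expressions for older users.
    # The intensity model and gain values are assumptions for illustration,
    # not parameters from the iCat or the Georgia Tech study.

    HARD_FOR_OLDER_ADULTS = {"happiness": 1.5, "anger": 1.3, "fear": 1.3}

    def display_intensity(expression: str, base: float, older_user: bool) -> float:
        """Return the intensity (0.0-1.0) to render for an expression.

        Expressions the study found older adults misread get a gain,
        clamped so the face never exceeds its maximum articulation.
        """
        gain = HARD_FOR_OLDER_ADULTS.get(expression, 1.0) if older_user else 1.0
        return min(1.0, base * gain)

    print(display_intensity("happiness", 0.5, older_user=True))   # 0.75
    print(display_intensity("happiness", 0.5, older_user=False))  # 0.5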

References

Is My Robot Happy To See Me?
