This story is from the category Artificial Intelligence
Date posted: 30/05/2008
By combining audio and visual data, Yongjin Wang from the University of Toronto and Ling Guan from Ryerson University in Toronto have developed a system that recognizes six human emotional states: happiness, sadness, anger, fear, surprise, and disgust.
Their system can recognize emotions in people from different cultures who speak different languages, with a success rate of 82%.
"Human-centered computing focuses on understanding humans, including recognition of face, emotions, gestures, speech, body movements, etc.," Wang stated. "Emotion recognition systems help the computer to understand the affective state of the user, and hence the computer can respond accordingly based on that perception."
Some vocal and facial features play an important role in characterizing certain emotions, but only a minimal role in others. For example, happiness is detected better from visual features (such as smiling), while anger is detected better from audio features (such as yelling).
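This idea of per-emotion modality weighting can be illustrated with a minimal late-fusion sketch. This is an assumption for illustration only, not the researchers' actual method: each modality is imagined to produce a score per emotion, and hypothetical weights decide how much the visual versus audio score counts for each emotion.

```python
# Minimal sketch of weighted late fusion (illustrative only; the paper's
# actual method may differ). Each modality outputs a score per emotion,
# and per-emotion weights reflect how informative that modality is.

EMOTIONS = ["happiness", "sadness", "anger", "fear", "surprise", "disgust"]

# Hypothetical per-emotion visual weights: visual cues dominate for
# happiness (smiling), audio cues dominate for anger (yelling).
VISUAL_WEIGHT = {"happiness": 0.7, "sadness": 0.5, "anger": 0.3,
                 "fear": 0.5, "surprise": 0.6, "disgust": 0.5}

def fuse(audio_scores, visual_scores):
    """Combine per-emotion scores from two classifiers; return the winner."""
    fused = {}
    for emotion in EMOTIONS:
        w = VISUAL_WEIGHT[emotion]
        fused[emotion] = (w * visual_scores[emotion]
                          + (1 - w) * audio_scores[emotion])
    return max(fused, key=fused.get)

# Example: strong visual evidence of a smile, weak audio evidence.
audio = {e: 0.1 for e in EMOTIONS}
visual = {e: 0.1 for e in EMOTIONS}
visual["happiness"] = 0.9
print(fuse(audio, visual))  # -> happiness
```

In practice the weights would be learned from labeled training data rather than set by hand, but the sketch shows why combining modalities can beat either one alone.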
"The most difficult part of enabling a computer to detect human emotion is the vast variance and diversity of vocal and facial expressions due to factors such as language, culture, and individual personality," Wang explained. "Also, as shown in our paper, there are no sharp boundaries between different emotions. The accurate identification of discriminate patterns is a challenging problem."
Ultimately, an AI that can recognize emotional states would be ideal for high-pressure selling, and for reception bots in augmented-reality company interactions, attempting, for example, to soothe and calm you before a meeting.
See the full Story via external site: physorg.com