Date posted: 03/06/2009
06 November 2009
MIT Media Lab, Boston, USA
A vital requirement for social robots and virtual agents is the ability to infer the affective and mental states of humans, so that they can engage in and behave appropriately during social interactions, for example, by ensuring the user is interested in maintaining the interaction or by providing suitable empathic responses. One major component of such a mentalizing capacity, comprising both cognitive and affective strands, is the interpretation of human behaviour from sensory input, which must always be conducted in a timely manner. Researchers in multimodal interfaces have increasingly addressed the design of systems endowed with this ability. Nevertheless, only a few attempts have been made to develop virtual agents and robots capable of inferring the user's states in real-time and engaging in meaningful interaction (e.g. towards a shared goal). This workshop will consider real-time techniques for the recognition and interpretation of human verbal and non-verbal behaviour for application in human-agent and human-robot interaction frameworks.
Target audience and expected impact
This workshop follows the successful ICMI 2008 AFFINE workshop (AFFINE - Affective Interaction in Natural Environments: Real-time affect analysis and interpretation for virtual agents and robots). An outcome of that workshop is a special issue of the Springer Journal on Multimodal User Interfaces (JMUI), to appear in December 2009 (submission deadline April 2009).
This new 2009 workshop seeks to bring together those who work on the real-time interpretation of user behaviour to produce mid- or high-level state descriptors, from basic emotions to more complex appraisals or mental states (e.g. agreement and interest, or blends of several emotions), with those who wish to apply this capacity as part of a 'social perception' module or equivalent in social robot and virtual agent interaction frameworks.
We expect the main outcome of this workshop to be the identification of open issues in real-time, affect-aware applications 'in the wild', especially in embodied interaction (robots). Issues such as natural and multimodal interaction, estimation of and adaptation to context, context-dependent processing and related databases, and HCI beyond emotion (cognition, behaviour, etc.) will be discussed in the framework of interacting with other humans, embodied conversational agents and social robots.
In addition, researchers from diverse fields (signal processing and pattern recognition, machine learning, cognition, HCI and robotics) will benefit from the cross-fertilisation of ideas, concepts and developments in the field.
A possible outcome of the workshop is a journal special issue, to appear in 2010 (possibly in the Springer Journal on Multimodal User Interfaces, www.jmui.org).
List of Topics
* Real-time recognition of affect (facial expressions, body language, speech, physiological, etc.)
* Multimodal expression of emotion by virtual characters and robots during affective interaction with users
* Perception-action loops in agents/robots
* Cognitive and affective mentalizing / theory of mind
* Visual attention / User engagement
* Theories of mind and emotion
* Emotion and cognitive state representation
* Context awareness and adaptation
* Natural HCI and HRI / multimodal and emotional corpora / emotion induction
* Social robotic platforms
* Evaluation of affective interaction and user-centred design
* Applications: interactive games, empathic interfaces, social robots, etc.
See the full Story via external site: homepages.feis.herts.ac.uk