11 December 2008

Virtual Emotions, Moods, Personality

A team of researchers from the University of the Balearic Islands (UIB) has developed a computer model that, for the first time, generates faces whose emotions and moods reflect personality traits. The aim of the work was to design a model that captures a person's moods and displays them on a virtual face. The researchers integrated personality, emotions and moods in the same 3-D space, aspects that had previously been dealt with separately. They point out that emotions (such as fear, joy or surprise) are almost instantaneous alterations of mood, in contrast to emotional states (such as boredom or anxiety), which last longer, and personality, which normally persists throughout a person's life. To draw up the model, the designers followed the theory of the five personality traits established by American psychologists: extraversion, neuroticism, openness, conscientiousness and agreeableness. An introverted and neurotic personality, for example, is related to an anxious emotional state. The points of the face that define these emotions can be determined mathematically, and the algorithms developed by the researchers can be used to obtain different facial expressions ‘quickly and easily’.
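
To make the idea concrete, here is a minimal sketch of how personality traits could be projected onto a point in a 3-D affect space and matched to a mood label. The axis names, weights and mood prototypes below are illustrative assumptions for this example, not values taken from the UIB model.

```python
# Illustrative sketch only: maps Big Five (OCEAN) trait scores to a default
# mood point in a 3-D affect space and labels it with the nearest prototype
# mood. The trait-to-axis weights and mood prototypes are invented for
# illustration; they are not the coefficients used in the UIB model.
import numpy as np

# Rows: pleasure, arousal, dominance axes; columns: O, C, E, A, N scores in [-1, 1].
TRAIT_TO_AFFECT = np.array([
    [0.1, 0.1, 0.4, 0.5, -0.4],   # pleasure
    [0.2, 0.0, 0.3, -0.1, 0.5],   # arousal
    [0.3, 0.2, 0.5, -0.3, -0.4],  # dominance
])

# Prototype moods as points in the same 3-D space (again, illustrative).
MOOD_PROTOTYPES = {
    "exuberant": np.array([0.7, 0.7, 0.6]),
    "relaxed":   np.array([0.6, -0.5, 0.4]),
    "bored":     np.array([-0.6, -0.6, -0.4]),
    "anxious":   np.array([-0.5, 0.6, -0.6]),
}

def default_mood(ocean):
    """ocean: (openness, conscientiousness, extraversion, agreeableness,
    neuroticism) scores in [-1, 1]. Returns the 3-D mood point and the
    label of the nearest prototype mood."""
    point = TRAIT_TO_AFFECT @ np.asarray(ocean, dtype=float)
    label = min(MOOD_PROTOTYPES,
                key=lambda m: np.linalg.norm(point - MOOD_PROTOTYPES[m]))
    return point, label

if __name__ == "__main__":
    # An introverted (low extraversion), highly neurotic personality lands
    # nearest the "anxious" prototype, matching the example in the article.
    point, label = default_mood([0.0, 0.0, -0.7, 0.0, 0.9])
    print(point, label)
```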

The system, which uses the MPEG-4 video coding standard to create the images, makes it possible to display the six basic emotions (anger, disgust, fear, joy, sadness, surprise) as well as intermediate expressions. The results of the method were assessed objectively, through an automatic recognizer that correctly identified 82% of the expressions generated, and subjectively, through a survey of 75 university students. The students correctly recognised 86% of the emotions and 73% of the emotional states shown on the computer. Even so, the researchers found that some emotions, such as fear and surprise, are difficult to tell apart, with context helping to differentiate between the two. The team is already working along these lines and has prepared a virtual storyteller that enriches its narration by using its face to express the emotions generated by the story being told. The researchers believe the model could be applied in educational settings (virtual tutors and presenters with personality traits) as well as in video-game characters and interactive stories with their own emotion engine.
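
The sketch below illustrates the general idea of blending expression profiles in an MPEG-4-style pipeline: each basic emotion is stored as a vector of facial animation parameter displacements, and intermediate expressions are obtained by interpolating between them. The parameter names and numbers are placeholder assumptions, not the profiles used by the researchers.

```python
# Minimal sketch of expression blending in the spirit of MPEG-4 facial
# animation: each basic expression is a vector of facial animation parameter
# (FAP) displacements, and intermediate expressions are produced by linear
# interpolation. Names and values are placeholders, not the UIB profiles.
import numpy as np

FAP_NAMES = ["open_jaw", "raise_l_eyebrow", "raise_r_eyebrow",
             "stretch_l_lipcorner", "stretch_r_lipcorner", "close_t_eyelids"]

BASIC_EXPRESSIONS = {
    "joy":      np.array([0.2, 0.1, 0.1, 0.8, 0.8, 0.1]),
    "surprise": np.array([0.9, 0.9, 0.9, 0.0, 0.0, -0.3]),
    "fear":     np.array([0.5, 0.7, 0.7, -0.2, -0.2, -0.2]),
    "neutral":  np.zeros(6),
}

def blend(expr_a, expr_b, t):
    """Interpolate between two basic expressions: t = 0 gives expr_a,
    t = 1 gives expr_b, values in between give intermediate expressions
    such as a pleasantly surprised face."""
    a, b = BASIC_EXPRESSIONS[expr_a], BASIC_EXPRESSIONS[expr_b]
    return (1.0 - t) * a + t * b

def scale_by_mood(fap_vector, intensity):
    """Attenuate or exaggerate an expression according to the current mood,
    e.g. a subdued mood damping the intensity of a smile."""
    return np.clip(intensity * fap_vector, -1.0, 1.0)

if __name__ == "__main__":
    half_surprised_joy = blend("joy", "surprise", 0.5)
    for name, value in zip(FAP_NAMES, scale_by_mood(half_surprised_joy, 0.7)):
        print(f"{name:>20s}: {value:+.2f}")
```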

More information:

http://www.sciencedaily.com/releases/2008/12/081204133855.htm