Tuesday 9 December 2008

Smart, responsive facial expressions for avatars in the 3D web and virtual worlds have been developed


The University of the Balearic Islands (UIB) has developed a computer application that can generate virtual faces expressing emotions and moods.

The study has been published in the latest issue of the journal Computer Animation and Virtual Worlds.

“The aim of this work has been to design a model that reveals a person's moods and displays them on a virtual face”, Diana Arellano, one of the authors of the study and a member of the UIB’s Computer Graphics, Vision and Artificial Intelligence Unit, told SINC. “In the same 3-D space we have integrated personality, emotions and moods, which had previously been dealt with separately”, she explained.

To draw up the model, the designers followed the theories of the American psychologist Albert Mehrabian and based it on five personality traits:

• Extraversion

• Neuroticism

• Openness

• Conscientiousness

• Agreeableness


“Every personality can be considered an emotional state by default”, Arellano pointed out.

An introverted and neurotic personality is therefore related to an anxious emotional state.
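As a concrete illustration, the sketch below shows one way a default mood could be computed from a personality profile, assuming a pleasure–arousal–dominance (PAD) mood space in the spirit of Mehrabian's work. The trait ranges, the weights and the default_mood function are placeholders for illustration, not the formulation published by the UIB team.

```python
# Illustrative sketch only: deriving a default (pleasure, arousal, dominance)
# mood from Big Five personality traits. The weights below are invented and
# are NOT the coefficients used in the published model.

from dataclasses import dataclass

@dataclass
class Personality:
    openness: float          # every trait expressed in [-1, 1]
    conscientiousness: float
    extraversion: float
    agreeableness: float
    neuroticism: float

def default_mood(p: Personality) -> tuple[float, float, float]:
    """Return a (pleasure, arousal, dominance) triple, each roughly in [-1, 1]."""
    pleasure  = 0.5 * p.extraversion + 0.4 * p.agreeableness - 0.3 * p.neuroticism
    arousal   = 0.3 * p.openness + 0.6 * p.neuroticism
    dominance = 0.6 * p.extraversion + 0.2 * p.conscientiousness - 0.3 * p.neuroticism
    return pleasure, arousal, dominance

# An introverted, neurotic profile lands in the low-pleasure, high-arousal,
# low-dominance corner of PAD space, i.e. an "anxious" default mood.
print(default_mood(Personality(0.0, 0.0, -0.8, 0.0, 0.8)))
```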

The points of the face that define these emotions can be determined mathematically, and the algorithms developed by the researchers can be used to obtain different facial expressions “quickly and easily”.
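To give a flavour of what “determined mathematically” can mean in practice, the hypothetical sketch below offsets a neutral face by expression-specific displacement vectors scaled by an intensity value; the point names, coordinates and scaling rule are invented, not the algorithms described in the paper.

```python
# Hypothetical sketch: facial feature points for an expression obtained by
# scaling neutral-to-target displacements with an intensity in [0, 1].
# All point names and numbers are invented for illustration.

NEUTRAL = {
    "inner_brow_left":    (0.30, 0.70),
    "inner_brow_right":   (0.70, 0.70),
    "mouth_corner_left":  (0.35, 0.25),
    "mouth_corner_right": (0.65, 0.25),
}

# Displacements applied at full intensity of a "joy" expression.
JOY = {
    "inner_brow_left":    ( 0.00, 0.02),
    "inner_brow_right":   ( 0.00, 0.02),
    "mouth_corner_left":  (-0.03, 0.04),
    "mouth_corner_right": ( 0.03, 0.04),
}

def expression_points(displacement, intensity):
    """Move every neutral point along its displacement, scaled by intensity."""
    return {
        name: (x + intensity * displacement[name][0],
               y + intensity * displacement[name][1])
        for name, (x, y) in NEUTRAL.items()
    }

print(expression_points(JOY, 0.5))   # a half-intensity smile
```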

The system, which uses the MPEG-4 coding standard to create and animate the facial images, makes it possible to display the six basic emotions (anger, disgust, fear, joy, sadness, surprise) as well as intermediate expressions.
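One simple way to obtain intermediate expressions of this kind is to interpolate between the animation-parameter vectors of two basic emotions. The sketch below does this with invented parameter names and values; an MPEG-4 implementation would drive the standard's facial animation parameters instead.

```python
# Illustrative sketch: an intermediate expression as a weighted blend of two
# basic emotions. Parameter names and values are invented for illustration.

import numpy as np

PARAMS = ["raise_inner_brows", "stretch_mouth_corners", "open_jaw", "squint_eyes"]

BASIC = {
    "joy":      np.array([ 0.2,  0.8, 0.1, 0.3]),
    "surprise": np.array([ 0.9,  0.0, 0.7, 0.0]),
    "sadness":  np.array([-0.4, -0.5, 0.0, 0.2]),
}

def blend(emotion_a, emotion_b, weight):
    """Linearly interpolate between two basic emotions (weight in [0, 1])."""
    return (1.0 - weight) * BASIC[emotion_a] + weight * BASIC[emotion_b]

# A face halfway between joy and surprise ("pleasantly surprised").
mixed = blend("joy", "surprise", 0.5)
print(dict(zip(PARAMS, mixed.round(2))))
```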

“Our next step is to leave the MPEG-4 standard aside and concentrate on a high-quality generic mesh, which will allow us to include wrinkles as well as eye, eyelid and head movements, and to synthesize the voice”, the researcher concluded.