Human Centered Interfaces
Computers increasingly try to interpret human characteristics, such as facial expressions, eye gaze, body gait and speech, in order to react more appropriately. Many applications, such as virtual reality, videoconferencing, user profiling, customer satisfaction studies for broadcast and web services, and interfaces for people with special needs, require efficient facial expression recognition in order to achieve the desired results.
The six basic facial expressions are anger, disgust, fear, happiness, sadness and surprise. Each of them can be described as a combination of facial muscle movements, the Facial Action Units (FAUs), which together form the Facial Action Coding System (FACS).
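As an illustration of how FAUs relate to the basic expressions, the sketch below maps detected FAUs to an expression label using commonly cited EMFACS-style prototype combinations. This is not the implementation from the work described here; the exact AU sets vary in the literature, and the mapping below is only indicative.

```python
# Illustrative sketch only -- prototypical FAU combinations for the six
# basic expressions, following commonly cited EMFACS-style mappings.
# The exact AU sets differ between sources.
PROTOTYPES = {
    "anger":     {4, 5, 7, 23},        # brow lowerer, lid and lip tighteners
    "disgust":   {9, 15},              # nose wrinkler, lip corner depressor
    "fear":      {1, 2, 4, 5, 20, 26},
    "happiness": {6, 12},              # cheek raiser, lip corner puller
    "sadness":   {1, 4, 15},
    "surprise":  {1, 2, 5, 26},        # brow raisers, upper lid raiser, jaw drop
}

def classify_expression(active_aus):
    """Map a set of detected FAUs to the basic expression whose prototype
    is fully contained in them; prefer the most specific (largest) match."""
    active = set(active_aus)
    matches = [e for e, aus in PROTOTYPES.items() if aus <= active]
    return max(matches, key=lambda e: len(PROTOTYPES[e])) if matches else None
```

Preferring the largest matching prototype matters because some prototypes are subsets of others (the surprise AUs are contained in the fear AUs above).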
Facial expressions are generally hard to recognize automatically.
To address this, a novel method has been developed that performs facial expression recognition using a fusion of texture and shape information.
By introducing fusion, certain confusions are resolved. For example, when the gaze changes, the geometrical information alone is not discriminative, while the texture information can capture the change more reliably.
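One simple way to realize such a fusion is at the score level: normalize the per-class scores produced by a texture-based classifier and a shape-based classifier, then combine them with a weighted sum. The sketch below is a generic illustration under that assumption, not the fusion scheme from the cited papers; the weight `w_texture` is a hypothetical parameter.

```python
import numpy as np

def _minmax(x):
    """Scale a score vector to [0, 1]; a constant vector maps to zeros."""
    x = np.asarray(x, dtype=float)
    rng = x.max() - x.min()
    return (x - x.min()) / rng if rng > 0 else np.zeros_like(x)

def fuse_and_classify(texture_scores, shape_scores, w_texture=0.5):
    """Score-level fusion sketch (assumed scheme, not the authors' method):
    normalize each modality's per-class scores, combine them with a
    weighted sum, and return the index of the winning class."""
    fused = (w_texture * _minmax(texture_scores)
             + (1.0 - w_texture) * _minmax(shape_scores))
    return int(np.argmax(fused))
```

Raising `w_texture` shifts trust toward the texture modality, which matches the intuition above: in situations such as a gaze change, where geometry is uninformative, the texture scores should dominate the decision.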
The achieved accuracy is equal to 94.5% for facial expression recognition and 92.1% for FAU detection.
I. Kotsia and I. Pitas, "Facial expression recognition using shape and texture information", in Proc. of Int. Federation for Information Processing Conf. on Artificial Intelligence (IFIPAI 2006), Santiago, Chile, 21-24 August, 2006.
I. Kotsia, N. Nikolaidis and I. Pitas, "Fusion of geometrical and texture information for facial expression recognition", in Proc. of Int. Conf. on Image Processing (ICIP 2006), Atlanta, GA, USA, 8-11 October, 2006.
SIMILAR - The European research taskforce creating human-machine interfaces SIMILAR to human-human communication, IST, FP6