Affiliation:
1. Unitat de Gràfics i Visió per Ordinador i IA, Department of Mathematics and Computer Science, University of the Balearic Islands, 07122 Palma, Spain
Abstract
This paper proposes a model-based method for real-time automatic mood estimation in video sequences. The approach is customized by learning person-specific facial parameters, which are transformed into facial Action Units (AUs). A mapping model represents moods in terms of the PAD space: Pleasure, Arousal, and Dominance. The intersection of these dimensions yields eight octants that represent fundamental mood categories. In the experimental evaluation, a stimulus video, randomly selected from a set prepared to elicit different moods, was played to each participant while their facial expressions were recorded. The experiment showed that Dominance is the dimension least affected by facial expression, so it was removed from the mood categorization. Four categories corresponding to the quadrants of the Pleasure–Arousal (PA) plane, “Exalted”, “Calm”, “Anxious” and “Bored”, were then defined, together with two categories for the “Positive” and “Negative” signs of the Pleasure (P) dimension. Results showed 73% agreement in the PA categorization and 94% in the P dimension, demonstrating that facial expressions can be used to estimate moods within these categories and provide cues for assessing users’ subjective states in real-world applications.
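As a minimal sketch of the categorization described above, the following Python snippet maps a (Pleasure, Arousal) point to one of the four PA-quadrant mood categories and to the sign of the Pleasure dimension. The quadrant-to-label assignment follows the conventional Mehrabian-style reading of the PA plane; the function names, thresholds, and value ranges are assumptions for illustration, not the authors’ implementation.

```python
# Hypothetical sketch of the PA-quadrant mood categorization (not the paper's code).

def categorize_pa(pleasure: float, arousal: float) -> str:
    """Map a (pleasure, arousal) point, each assumed to lie in [-1, 1],
    to one of the four PA-quadrant mood categories named in the abstract."""
    if pleasure >= 0:
        return "Exalted" if arousal >= 0 else "Calm"
    return "Anxious" if arousal >= 0 else "Bored"


def pleasure_sign(pleasure: float) -> str:
    """Binary categorization on the Pleasure dimension alone."""
    return "Positive" if pleasure >= 0 else "Negative"


if __name__ == "__main__":
    print(categorize_pa(0.4, -0.2))   # -> "Calm"
    print(pleasure_sign(-0.7))        # -> "Negative"
```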