Author:
Gjoreski Martin, Kiprijanovska Ivana, Stankoski Simon, Mavridou Ifigeneia, Broulidakis M. John, Gjoreski Hristijan, Nduka Charles
Abstract
Using a novel wearable surface electromyography (sEMG) device, we investigated induced affective states by measuring the activation of facial muscles traditionally associated with positive expressions (left/right orbicularis and left/right zygomaticus) and negative expressions (the corrugator muscle). In a sample of 38 participants who watched 25 affective videos in a virtual reality environment, we found that sEMG amplitude varied significantly with video content across each of the three variables examined: subjective valence, subjective arousal, and objective valence measured via the validated video types (positive, neutral, and negative). sEMG amplitude from the “positive muscles” increased when participants were exposed to positively valenced stimuli compared with negatively valenced stimuli. In contrast, activation of the “negative muscle” was elevated following exposure to negatively valenced stimuli compared with positively valenced stimuli. High-arousal videos increased muscle activation compared with low-arousal videos in all measured muscles except the corrugator. In line with previous research, the relationship between sEMG amplitude and subjective valence was V-shaped.
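The V-shaped relationship reported above means that muscle activation rises as valence moves away from neutral in either direction. A minimal toy sketch of such a relation (the function name, baseline, and slope are illustrative assumptions, not values from the paper):

```python
def semg_amplitude(valence: float, baseline: float = 1.0, slope: float = 0.5) -> float:
    """Toy V-shaped model: amplitude grows with distance from neutral valence.

    valence: subjective valence on a symmetric scale (e.g. -1 = very negative,
             0 = neutral, +1 = very positive). Parameters are hypothetical.
    """
    return baseline + slope * abs(valence)
```

Under this sketch, equally positive and equally negative stimuli produce equal amplitude, and neutral stimuli produce the minimum, which is the qualitative pattern a V-shape describes.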
Publisher
Springer Science and Business Media LLC
Cited by: 20 articles.