Determining the Optimal Window Duration to Enhance Emotion Recognition Based on Galvanic Skin Response and Photoplethysmography Signals
Published: 2024-08-22
Volume: 13, Issue: 16, Page: 3333
ISSN: 2079-9292
Container-title: Electronics
Language: en

Authors:
Marcos F. Bamonte 1, Marcelo Risk 2, Victor Herrero 1

Affiliations:
1. Laboratorio de Investigación, Desarrollo y Transferencia de la Universidad Austral (LIDTUA), Facultad de Ingeniería, Universidad Austral, Mariano Acosta 1611, Pilar B1629WWA, Argentina
2. Instituto de Medicina Traslacional e Ingeniería Biomédica (IMTIB), CONICET-IUHI-HIBA, Potosí 4240, Buenos Aires C1199ACL, Argentina
Abstract
Automatic emotion recognition using portable sensors is gaining attention due to its potential use in real-life scenarios. Existing studies have not explored Galvanic Skin Response and Photoplethysmography sensors exclusively for emotion recognition using nonlinear features with machine learning (ML) classifiers such as Random Forest, Support Vector Machine, Gradient Boosting Machine, K-Nearest Neighbor, and Decision Tree. In this study, we proposed a genuine window sensitivity analysis on a continuous annotation dataset to determine the window duration and percentage of overlap that optimize classification performance using ML algorithms and nonlinear features, namely, the Lyapunov Exponent, Approximate Entropy, and Poincaré indices. We found an optimum window duration of 3 s with 50% overlap and achieved accuracies of 0.75 and 0.74 for arousal and valence, respectively. In addition, we proposed a Strong Labeling Scheme that kept only the extreme values of the labels, which raised the accuracy score to 0.94 for arousal. Under the conditions described, traditional ML models offer a good compromise between performance and low computational cost. Our results suggest that well-known ML algorithms can still contribute to the field of emotion recognition, provided that window duration, overlap percentage, and nonlinear features are carefully selected.
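The two preprocessing steps the abstract describes, fixed-length windowing with fractional overlap and a Strong Labeling Scheme that keeps only extreme label values, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the sampling rate, and the quantile thresholds used to define "extreme" labels are assumptions, since the abstract does not specify them.

```python
import numpy as np

def segment_signal(signal, fs, win_s=3.0, overlap=0.5):
    """Split a 1-D signal into fixed-length windows with fractional overlap.

    fs      -- sampling rate in Hz
    win_s   -- window duration in seconds (3 s is the reported optimum)
    overlap -- fraction of overlap between consecutive windows (0.5 = 50%)
    """
    win = int(win_s * fs)
    step = int(win * (1.0 - overlap))
    windows = [signal[start:start + win]
               for start in range(0, len(signal) - win + 1, step)]
    return np.asarray(windows)

def strong_label_mask(labels, low_q=0.25, high_q=0.75):
    """Boolean mask keeping only samples whose continuous label is extreme.

    The quantile cutoffs are illustrative assumptions; the paper only states
    that extreme label values are retained.
    """
    lo, hi = np.quantile(labels, [low_q, high_q])
    return (labels <= lo) | (labels >= hi)

# 10 s of a synthetic GSR-like trace sampled at 4 Hz
fs = 4
x = np.sin(np.linspace(0, 2 * np.pi, 10 * fs))
w = segment_signal(x, fs)   # 3 s windows, 50% overlap
print(w.shape)              # (number_of_windows, samples_per_window)
```

With 40 samples, 12-sample windows, and a 6-sample step, this yields five windows; the same windows would then be described by the nonlinear features (Lyapunov Exponent, Approximate Entropy, Poincaré indices) before classification.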
Funder
Universidad Austral