EEG–fNIRS-Based Emotion Recognition Using Graph Convolution and Capsule Attention Network

Published: 2024-08-16
Volume: 14, Issue: 8, Page: 820
ISSN: 2076-3425
Container-title: Brain Sciences
Language: en
Short-container-title: Brain Sciences
Authors:
Chen Guijun 1 (ORCID), Liu Yue 1, Zhang Xueying 1
Affiliation:
1. College of Electronic Information and Optical Engineering, Taiyuan University of Technology, Taiyuan 030024, China
Abstract
Electroencephalogram (EEG) and functional near-infrared spectroscopy (fNIRS) can objectively reflect a person’s emotional state and have been widely studied in emotion recognition. However, effective feature fusion and discriminative feature learning from EEG–fNIRS data remain challenging. To improve the accuracy of emotion recognition, a graph convolution and capsule attention network model (GCN-CA-CapsNet) is proposed. First, EEG–fNIRS signals are collected from 50 subjects whose emotions are induced by emotional video clips. Then, EEG and fNIRS features are extracted and fused by graph convolution with a Pearson correlation adjacency matrix to generate higher-quality primary capsules. Finally, a capsule attention module is introduced to assign different weights to the primary capsules, so that higher-quality primary capsules are selected to generate better classification capsules in the dynamic routing mechanism. We validate the efficacy of the proposed method on our emotional EEG–fNIRS dataset with an ablation study. Extensive experiments demonstrate that GCN-CA-CapsNet achieves more satisfactory performance than state-of-the-art methods, with average accuracy gains of 3–11%.
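The abstract mentions fusing EEG and fNIRS features by graph convolution over a Pearson correlation adjacency matrix. The paper's exact construction is not given here, so the following is only an illustrative sketch, assuming one feature vector per channel and using the absolute Pearson correlation between channel feature vectors as edge weights:

```python
import numpy as np

def pearson_adjacency(features: np.ndarray) -> np.ndarray:
    """Illustrative Pearson-correlation adjacency matrix (not the paper's code).

    features: (n_channels, n_features) array, one feature vector per
    EEG/fNIRS channel. Returns an (n_channels, n_channels) symmetric
    adjacency matrix whose entries are absolute Pearson correlations,
    with unit self-loops on the diagonal.
    """
    adj = np.abs(np.corrcoef(features))  # pairwise Pearson correlation
    np.fill_diagonal(adj, 1.0)           # self-loops
    return adj

# Hypothetical example: 8 channels, 32 features per channel
rng = np.random.default_rng(0)
feats = rng.standard_normal((8, 32))
A = pearson_adjacency(feats)
```

A graph convolution layer would then propagate channel features through `A` (typically after degree normalization) before forming the primary capsules.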
Funders:
- National Natural Science Foundation of China
- Fundamental Research Program of Shanxi Province, China
- Research Project of Shanxi Scholarship Council, China