Temporal Convolutional Network-Enhanced Real-Time Implicit Emotion Recognition with an Innovative Wearable fNIRS-EEG Dual-Modal System
Published: 2024-03-31
Volume: 13
Issue: 7
Page: 1310
ISSN: 2079-9292
Container-title: Electronics
Short-container-title: Electronics
Language: en
Authors:
Chen Jiafa 1, Yu Kaiwei 1, Wang Fei 1, Zhou Zhengxian 2, Bi Yifei 1, Zhuang Songlin 1, Zhang Dawei 1,3
Affiliations:
1. Engineering Research Center of Optical Instrument and System, Ministry of Education, and Shanghai Key Lab of Modern Optical System, University of Shanghai for Science and Technology, Shanghai 200093, China
2. Anhui Province Key Laboratory of Optoelectric Materials Science and Technology, Anhui Normal University, Wuhu 241002, China
3. Shanghai Environmental Biosafety Instruments and Equipment Engineering Technology Research Center, University of Shanghai for Science and Technology, Shanghai 200093, China
Abstract
Emotion recognition remains an intricate task at the crossroads of psychology and artificial intelligence, requiring the real-time, accurate discernment of implicit emotional states. Here, we introduce a wearable dual-modal device that synergizes functional near-infrared spectroscopy (fNIRS) and electroencephalography (EEG) to meet this demand. The first-of-its-kind fNIRS-EEG ensemble exploits a temporal convolutional network (TC-ResNet) that takes 24 fNIRS and 16 EEG channels as input for the extraction and recognition of emotional features. The system is portable, battery-efficient, wireless, and built on a scalable architecture, and it offers a real-time visual interface for observing cerebral electrical and hemodynamic changes, tailored to a variety of real-world scenarios. Our approach is a comprehensive emotion detection strategy, with new designs in system architecture and deployment and improvements in signal processing and interpretation. We examine the interplay of emotions and physiological responses to elucidate the cognitive processes of emotion regulation. An extensive evaluation of 30 subjects under four emotion induction protocols demonstrates the bimodal system's excellence in detecting emotions, with a classification accuracy of 99.81%, and its ability to reveal the interconnection between fNIRS and EEG signals. Compared with the latest unimodal identification methods, the bimodal approach yields accuracy gains of 0.24% over EEG alone and 8.37% over fNIRS alone. Moreover, the proposed TC-ResNet-driven temporal convolutional fusion technique outperforms conventional EEG-fNIRS fusion methods, improving recognition accuracy by 0.7% to 32.98%. This research advances affective computing by combining biological engineering and artificial intelligence.
Our integrated solution facilitates nuanced and responsive affective intelligence in practical applications, with far-reaching impacts on personalized healthcare, education, and human–computer interaction paradigms.
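To make the fusion idea in the abstract concrete, the following is a minimal sketch of input-level fusion feeding a residual temporal-convolutional block. Only the channel counts (24 fNIRS, 16 EEG) and the four-class setting come from the text above; the trial length, kernel sizes, weights, and the specific block layout are illustrative assumptions, not the authors' TC-ResNet implementation.

```python
import numpy as np

# Hypothetical sketch: channel counts (24 fNIRS, 16 EEG) and the 4-class
# setting come from the paper; all layer sizes, kernel widths, and random
# weights below are illustrative assumptions.
rng = np.random.default_rng(0)

T = 128                                # time samples per trial (assumed)
fnirs = rng.standard_normal((24, T))   # 24 fNIRS channels
eeg = rng.standard_normal((16, T))     # 16 EEG channels

# Early fusion: stack both modalities along the channel axis -> (40, T)
x = np.concatenate([fnirs, eeg], axis=0)

def temporal_conv(x, w, b):
    """1-D convolution over time with 'same' padding.
    x: (C_in, T), w: (C_out, C_in, K), b: (C_out,) -> (C_out, T)."""
    c_out, c_in, k = w.shape
    pad = k // 2
    xp = np.pad(x, ((0, 0), (pad, pad)))
    out = np.zeros((c_out, x.shape[1]))
    for o in range(c_out):
        for t in range(x.shape[1]):
            out[o, t] = np.sum(w[o] * xp[:, t:t + k]) + b[o]
    return out

# One residual temporal block (the "TC-ResNet" idea, heavily simplified):
# conv -> ReLU -> conv, plus an identity skip connection.
w1 = rng.standard_normal((40, 40, 3)) * 0.05
w2 = rng.standard_normal((40, 40, 3)) * 0.05
b = np.zeros(40)

h = np.maximum(temporal_conv(x, w1, b), 0.0)   # conv + ReLU
y = x + temporal_conv(h, w2, b)                # residual (skip) connection

# Global average pooling over time, then a linear read-out for 4 emotion classes
pooled = y.mean(axis=1)                        # (40,)
w_cls = rng.standard_normal((4, 40)) * 0.05
logits = w_cls @ pooled
print(y.shape, logits.shape)                   # (40, 128) (4,)
```

The residual (skip) connection is what lets deeper temporal stacks train stably on noisy physiological signals; a trained model would learn `w1`, `w2`, and `w_cls` by backpropagation rather than sampling them randomly as done here.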
Funder
National Natural Science Foundation of China
Cited by: 3 articles.