Deep Learning Based on CNN for Emotion Recognition Using EEG Signal
Author:
Ahmad Isah Salim1, Zhang Shuai1, Saminu Sani1, Wang Lingyue1, Isselmou Abd El Kader1, Cai Ziliang1, Javaid Imran1, Kamhi Souha1, Kulsum Ummay1
Affiliation:
1. State Key Laboratory of Reliability and Intelligence of Electrical Equipment, Hebei University of Technology, Tianjin, 300130, P.R. China.
Abstract
Emotion recognition based on brain-computer interfaces (BCI) has attracted significant research attention despite its difficulty. Emotion plays a vital role in human cognition and decision-making. Many researchers use electroencephalogram (EEG) signals to study emotion because they are easy and convenient to acquire. Deep learning has been employed in emotion recognition systems, which classify emotion from single- or multi-modal data, typically with visual or musical stimuli shown on a screen. In this article, a convolutional neural network (CNN) model is introduced to simultaneously learn features and recognize positive, neutral, and negative emotional states from pure EEG signals (a single modality), based on the SJTU Emotion EEG Dataset (SEED), using ResNet50 and the Adam optimizer. The dataset is shuffled, divided into training and testing sets, and then fed to the CNN model. Negative emotion achieves the highest accuracy at 94.86%, followed by neutral emotion at 94.29% and positive emotion at 93.25%, for an average accuracy of 94.13%. The results show the excellent classification ability of the model, which can improve emotion recognition.
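The data-handling steps described in the abstract (shuffle the dataset, split it into training and testing sets, then average the per-class accuracies) can be sketched as follows. This is a minimal illustration, not the authors' code: the feature shapes, random labels, and 80/20 split ratio are assumptions, and the ResNet50 training step is omitted. Only the three per-class accuracies come from the paper.

```python
import numpy as np

# Hypothetical stand-in for SEED EEG features: 600 trials of
# 62 channels x 200 samples. Real SEED trials differ; these
# shapes are assumptions for illustration only.
rng = np.random.default_rng(0)
X = rng.standard_normal((600, 62, 200))
y = rng.integers(0, 3, size=600)   # 0 = negative, 1 = neutral, 2 = positive

# Shuffle, then split into training and testing sets, as in the abstract.
idx = rng.permutation(len(X))
X, y = X[idx], y[idx]
split = int(0.8 * len(X))          # 80/20 split ratio is an assumption
X_train, X_test = X[:split], X[split:]
y_train, y_test = y[:split], y[split:]
# (X_train, y_train) would then be fed to the ResNet50-based CNN,
# trained with the Adam optimizer; that step is omitted here.

# Per-class accuracies reported in the paper, and their mean.
acc = {"negative": 94.86, "neutral": 94.29, "positive": 93.25}
average = sum(acc.values()) / len(acc)
print(round(average, 2))           # 94.13, matching the reported average
```

Averaging the three per-class accuracies reproduces the reported overall accuracy of 94.13%, confirming the figures in the abstract are internally consistent.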
Publisher
World Scientific and Engineering Academy and Society (WSEAS)
Subject
Computer Networks and Communications, Computer Vision and Pattern Recognition, Signal Processing, Software
Cited by
15 articles.