Author:
Perry Fordson Hayford, Xing Xiaofen, Guo Kailing, Xu Xiangmin
Abstract
Emotion recognition from affective brain-computer interfaces (aBCI) has garnered considerable attention in human-computer interaction. Electroencephalographic (EEG) signals, collected and stored in a single database, have been widely used because they detect brain activity in real time and are reliable. Nevertheless, large individual differences in EEG occur across subjects, making it difficult for models to share information between them. New labeled data must be collected and trained separately for each new subject, which costs considerable time. In addition, during EEG data collection, different stimuli are presented to subjects across databases. Audio-visual stimulation (AVS) is commonly used to study subjects' emotional responses. In this article, we propose a brain region aware domain adaptation (BRADA) algorithm that treats features from auditory and visual brain regions differently, which effectively tackles subject-to-subject variation and mitigates distribution mismatch across databases. BRADA is a new framework that works with existing transfer learning methods. We apply BRADA to both cross-subject and cross-database settings. The experimental results indicate that our proposed transfer learning method improves valence-arousal emotion recognition.
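The abstract does not specify BRADA's internals, but its core idea of treating auditory and visual brain-region features differently during domain adaptation can be illustrated with a hedged sketch. The example below is purely hypothetical: it assumes region-grouped EEG feature indices and uses a simple linear-kernel maximum mean discrepancy (MMD), weighted per region, as the cross-domain alignment loss; the function names, region groupings, and weights are illustrative, not the authors' implementation.

```python
import numpy as np

def linear_mmd(src, tgt):
    # Squared MMD with a linear kernel: squared distance between
    # the mean feature vectors of the source and target domains.
    return float(np.sum((src.mean(axis=0) - tgt.mean(axis=0)) ** 2))

def region_aware_mmd(source, target, regions, weights=None):
    """Weighted sum of per-region MMD terms (illustrative sketch).

    source, target: (n_samples, n_features) feature matrices from
        two domains (e.g., two subjects or two databases).
    regions: dict mapping a region name (e.g., "auditory") to the
        list of feature-column indices belonging to that region.
    weights: optional dict of per-region weights; unlisted regions
        default to 1.0, so regions can be emphasized differently.
    """
    weights = weights or {}
    total = 0.0
    for name, idx in regions.items():
        total += weights.get(name, 1.0) * linear_mmd(source[:, idx],
                                                     target[:, idx])
    return total

# Toy usage: six features split into two hypothetical region groups.
rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, size=(200, 6))   # "source subject" features
tgt = rng.normal(0.5, 1.0, size=(200, 6))   # "target subject" features
regions = {"auditory": [0, 1, 2], "visual": [3, 4, 5]}
loss = region_aware_mmd(src, tgt, regions,
                        weights={"auditory": 1.0, "visual": 2.0})
```

In a full pipeline, such a region-weighted discrepancy term would typically be added to the training loss of an existing transfer learning method, which matches the abstract's claim that BRADA works alongside existing approaches rather than replacing them.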
Funder
National Natural Science Foundation of China
Cited by
3 articles.