Affiliation:
1. State Key Laboratory of Media Convergence and Communication, Communication University of China, Dingfuzhuang, Beijing 100024, China
Abstract
As a branch of sentiment analysis, emotion recognition in conversation (ERC) aims to uncover a speaker's hidden emotions by analyzing the sentiment of each utterance. Multimodal ERC extends this to conversational data that include the text of an utterance together with its corresponding acoustic and visual signals; by integrating features from these modalities, the emotion of an utterance can be predicted more accurately. ERC research faces challenges in context construction, speaker-dependency design, and the fusion of heterogeneous multimodal features. This review therefore begins by defining the ERC task, tracing the development of the field, and introducing the commonly used datasets in detail. We then analyze and evaluate existing work on conversational context modeling, speaker dependency, and methods for fusing multimodal information. Finally, the review discusses the research and application challenges of ERC as well as its opportunities.
Funder
National Key R&D Program of China
Fundamental Research Funds for the Central Universities
Subject
Electrical and Electronic Engineering, Computer Networks and Communications, Hardware and Architecture, Signal Processing, Control and Systems Engineering
Cited by
1 article.