Improving Abstractive Dialogue Summarization Using Keyword Extraction

Published: 2023-08-29
Container-title: Applied Sciences
Volume: 13, Issue: 17, Page: 9771
ISSN: 2076-3417
Language: en

Authors:
Chongjae Yoo 1, Hwanhee Lee 2

Affiliations:
1. LG Electronics, Seoul 06772, Republic of Korea
2. Department of Artificial Intelligence, Chung-Ang University, Seoul 06974, Republic of Korea
Abstract
Abstractive dialogue summarization aims to generate a short passage that captures the important content of a dialogue spoken by multiple speakers. In abstractive dialogue summarization systems, capturing the subject of the dialogue is challenging owing to the properties of colloquial text, and such systems often generate uninformative summaries. In this paper, we propose a novel keyword-aware dialogue summarization system (KADS) that alleviates these problems by capturing the subject of the dialogue through the efficient use of keywords. Specifically, we first extract keywords from the input dialogue using a pre-trained keyword extractor. KADS then efficiently injects this keyword information into a transformer-based dialogue summarization model. Extensive experiments on three benchmark datasets show that the proposed method outperforms the baseline systems. Additionally, we demonstrate that KADS yields large performance gains in low-resource conditions, where the number of training examples is highly limited.
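The pipeline described in the abstract (extract keywords, then feed them to the summarizer alongside the dialogue) can be sketched as follows. This is a minimal illustration, not the paper's implementation: KADS uses a pre-trained keyword extractor and a specific injection mechanism, whereas here a simple frequency-based extractor stands in for it, and the keywords are assumed to be prepended to the dialogue as the summarizer's input text.

```python
# Sketch of keyword-aware input construction for a dialogue summarizer.
# Assumption: keywords are prepended to the dialogue text; the paper's
# pre-trained keyword extractor is replaced by a frequency heuristic.
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "is", "are", "i", "you", "to", "and",
             "of", "it", "that", "in", "for", "on", "we", "do", "me"}

def extract_keywords(dialogue: str, k: int = 3) -> list:
    """Return the k most frequent non-stopword tokens as stand-in keywords."""
    tokens = re.findall(r"[a-z']+", dialogue.lower())
    counts = Counter(t for t in tokens if t not in STOPWORDS and len(t) > 2)
    return [word for word, _ in counts.most_common(k)]

def build_model_input(dialogue: str, sep: str = " | ") -> str:
    """Prepend the extracted keywords so the summarizer sees them first."""
    keywords = extract_keywords(dialogue)
    return "keywords: " + ", ".join(keywords) + sep + dialogue

dialogue = (
    "Amanda: Do you have Betty's number? "
    "Hannah: Lemme check. Sorry, I can't find Betty's number. "
    "Amanda: Ask Larry, he called Betty last time we were at the park."
)
print(build_model_input(dialogue))
```

The keyword-prefixed string would then be tokenized and passed to a sequence-to-sequence summarization model in place of the raw dialogue.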
Funder
Institute for Information and Communications Technology Promotion & Evaluation
Subject
Fluid Flow and Transfer Processes, Computer Science Applications, Process Chemistry and Technology, General Engineering, Instrumentation, General Materials Science
Cited by: 1 article.