Abstract
Neural network-based encoder–decoder (ED) models are widely used for abstractive text summarization: the encoder reads the source document and encodes its salient information, and the decoder generates the summary word by word from that encoding. A drawback of the basic ED model, however, is that it treats all words and sentences equally, without distinguishing the most relevant ones from the rest. Many researchers have investigated this problem and proposed different solutions. In this paper, we define a sentence-level attention mechanism based on the well-known PageRank algorithm to identify the relevant sentences, and then propagate the resulting scores into a second, word-level attention layer. We tested the proposed model on the well-known CNN/DailyMail dataset and found that it generates summaries with much higher abstractive power than state-of-the-art models, despite a slight, and unavoidable, decrease in ROUGE scores.
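As a minimal illustration of the two-level mechanism the abstract describes, the Python sketch below first scores sentences with a power-iteration PageRank over a precomputed sentence-similarity matrix, and then biases word-level attention logits by the score of each word's sentence before renormalizing. The function names (pagerank, propagate_to_words) and the log-score propagation rule are hypothetical stand-ins: the abstract does not specify the paper's exact formulation, so this is a sketch under those assumptions, not the authors' implementation.

import numpy as np

def pagerank(sim, d=0.85, tol=1e-6, max_iter=100):
    """Power-iteration PageRank over a sentence-similarity matrix."""
    n = sim.shape[0]
    # Row-normalize similarities into a stochastic transition matrix;
    # guard against isolated sentences with all-zero rows.
    row_sums = sim.sum(axis=1, keepdims=True)
    row_sums = np.where(row_sums == 0.0, 1.0, row_sums)
    trans = sim / row_sums
    scores = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        new = (1.0 - d) / n + d * (trans.T @ scores)
        if np.abs(new - scores).sum() < tol:
            break
        scores = new
    return scores

def propagate_to_words(word_logits, sent_ids, sent_scores):
    """Bias each word's attention logit by the log PageRank score of
    its enclosing sentence, then renormalize with a softmax."""
    biased = word_logits + np.log(sent_scores[sent_ids] + 1e-12)
    e = np.exp(biased - biased.max())
    return e / e.sum()

# Toy example: 3 sentences, 6 words (two per sentence).
sim = np.array([[0.0, 0.8, 0.1],
                [0.8, 0.0, 0.3],
                [0.1, 0.3, 0.0]])
sent_scores = pagerank(sim)
word_logits = np.zeros(6)                 # uniform word-level attention
sent_ids = np.array([0, 0, 1, 1, 2, 2])   # sentence index of each word
attn = propagate_to_words(word_logits, sent_ids, sent_scores)

Note that adding the log of the sentence score to the logits is equivalent to multiplying the softmaxed word attention by the sentence score and renormalizing, which is one natural way to let sentence relevance reweight word-level attention.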
Subject
Fluid Flow and Transfer Processes, Computer Science Applications, Process Chemistry and Technology, General Engineering, Instrumentation, General Materials Science
Cited by: 1 article.