Affiliation:
1. VRAIN: Valencian Research Institute for Artificial Intelligence, Universitat Politècnica de València, Camí de Vera s/n, València, Spain
Abstract
In this paper, we present an extractive approach to document summarization: the Siamese Hierarchical Transformer Encoders system, based on Siamese neural networks and transformer encoders extended in a hierarchical way. The system, trained for binary classification, assigns an attention score to each sentence in the document; these scores are then used to select the most relevant sentences to build the summary. The main novelty of our proposal is the use of self-attention mechanisms at the sentence level for document summarization, instead of attention at the word level only. Experiments carried out on the CNN/DailyMail summarization corpus show promising results in line with the state of the art.
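The sentence-level scoring-and-selection step described above can be sketched in a minimal, self-contained form. This is an illustrative assumption, not the paper's implementation: learned query/key projections are replaced by identity maps, and sentence embeddings (which the paper obtains from hierarchical word-level transformer encoders) are taken as given inputs.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sentence_attention_scores(sent_embs):
    """Single-head self-attention over sentence embeddings.

    Returns one relevance score per sentence: the average attention
    mass that sentence receives from all sentences in the document.
    (Identity projections stand in for learned W_Q / W_K matrices.)
    """
    d = sent_embs.shape[-1]
    q, k = sent_embs, sent_embs
    attn = softmax(q @ k.T / np.sqrt(d), axis=-1)  # (n_sents, n_sents)
    return attn.mean(axis=0)                       # (n_sents,)

def extractive_summary(sentences, sent_embs, k=2):
    """Select the k highest-scoring sentences, kept in document order."""
    scores = sentence_attention_scores(sent_embs)
    top = sorted(np.argsort(scores)[::-1][:k])
    return [sentences[i] for i in top]
```

In the actual system the scores come from a model trained for binary classification (summary-worthy vs. not); the sketch only shows how per-sentence attention scores translate into an extractive summary.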
Subject
Artificial Intelligence, General Engineering, Statistics and Probability
Cited by: 2 articles.