Affiliation:
1. Shanghai University of Engineering Science, Songjiang District, Shanghai, China
2. Shanghai Jiao Tong University, Minhang District, Shanghai, China
Abstract
Text summarization, with its ability to filter information, has become a significant component of search engines and question-answering systems. However, existing models that incorporate the copy mechanism often fail to extract important fragments, so the generated content suffers from thematic deviation and insufficient generalization. In particular, traditional generation methods for Chinese automatic summarization often lose semantics because they rely on word lists. To address these issues, we propose a novel BioCopy mechanism for the summarization task. By training the tags of predicted words and narrowing the probability distribution over the vocabulary, we enhance the model's ability to generate continuous segments, which effectively alleviates the above problems. Additionally, we apply reinforced canonicality to the inputs to obtain better results, making the model share sub-network weight parameters and sparsifying the model output to reduce the search space for prediction. To further improve performance, we calculate the bilingual evaluation understudy (BLEU) score on the English CNN/DailyMail dataset to filter thresholds, reducing the difficulty of word segmentation and the output's dependence on the word list. We fully fine-tune the model on the LCSTS dataset for the Chinese summarization task and conduct small-sample experiments on the CSL dataset, along with ablation experiments on the Chinese data. The experimental results demonstrate that the optimized model learns the semantic representation of the original text better than comparable models and performs well with small sample sizes.
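The threshold-filtering step above relies on BLEU as the scoring signal. The paper does not give its exact scoring code, but a minimal sentence-level BLEU (modified n-gram precision with a brevity penalty, add-one smoothing) can be sketched as follows; the function and variable names here are illustrative, not from the paper:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU with uniform weights over 1..max_n grams.

    Uses add-one smoothing so a missing higher-order n-gram does not
    zero out the whole score, and the standard brevity penalty to
    discourage overly short candidates.
    """
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(candidate, n))
        ref_counts = Counter(ngrams(reference, n))
        # clipped overlap: each candidate n-gram counts at most as
        # often as it appears in the reference
        overlap = sum((cand_counts & ref_counts).values())
        total = max(sum(cand_counts.values()), 1)
        precisions.append((overlap + 1) / (total + 1))
    # brevity penalty: penalize candidates shorter than the reference
    if len(candidate) > len(reference):
        bp = 1.0
    else:
        bp = math.exp(1 - len(reference) / max(len(candidate), 1))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)
```

A summary whose score falls below a chosen threshold would then be filtered out; in practice a library implementation such as `nltk.translate.bleu_score.sentence_bleu` offers more smoothing options than this sketch.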
Funder
Scientific and Technological Innovation 2030: Major Project of New Generation Artificial Intelligence
Publisher
Association for Computing Machinery (ACM)