Text Summarization Method Based on Gated Attention Graph Neural Network

Author:

Huang Jingui¹, Wu Wenya¹, Li Jingyi¹, Wang Shengchun¹

Affiliation:

1. College of Information Science and Engineering, Hunan Normal University, Changsha 410081, China

Abstract

Text summarization is an information-compression technique for extracting the important content of a long text, and it has become a challenging research direction in natural language processing. Deep-learning-based summarization models have achieved good results, but how to model the relationships between words more effectively, extract feature information more accurately, and eliminate redundant information remains an open problem. This paper proposes GA-GNN, a graph neural network model based on gated attention, which effectively improves the accuracy and readability of text summaries. First, words are encoded with a concatenated sentence encoder to generate deeper vectors that contain both local and global semantic information. Second, gated attention units eliminate locally irrelevant information, improving the extraction of key features. Finally, the loss function is optimized from three aspects (contrastive learning, confidence calculation for important sentences, and graph feature extraction) to improve the robustness of the model. Experiments on the CNN/Daily Mail and MR datasets show that the proposed model outperforms existing methods.
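The gated attention unit described in the abstract can be sketched in rough form as follows. This is an illustrative NumPy sketch, not the authors' implementation: the weight names (`Wq`, `Wk`, `Wv`, `Wg`), the scaled dot-product scoring, and the exact sigmoid-gate formulation are all assumptions made for demonstration. The key idea shown is that a learned gate modulates the attention output so that locally irrelevant features are suppressed before they propagate through the graph.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_attention(H, Wq, Wk, Wv, Wg, bg):
    """One illustrative gated-attention step over node features H (n, d).

    Standard scaled dot-product attention aggregates neighbour
    information; a sigmoid gate computed from the node state and the
    attended context then decides how much of that context to keep,
    filtering out locally irrelevant information.
    """
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    d = Q.shape[-1]
    attn = softmax(Q @ K.T / np.sqrt(d))       # (n, n) attention weights
    context = attn @ V                         # aggregated neighbour info
    gate = sigmoid(np.concatenate([H, context], axis=-1) @ Wg + bg)
    return gate * context + (1.0 - gate) * H   # gated feature update

# Toy usage with random weights.
rng = np.random.default_rng(0)
n, d = 4, 8
H = rng.standard_normal((n, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
Wg = rng.standard_normal((2 * d, d)) * 0.1
bg = np.zeros(d)
out = gated_attention(H, Wq, Wk, Wv, Wg, bg)
print(out.shape)
```

Because the gate lies in (0, 1) per feature, the update interpolates between the original node state and the attended context, which is one common way such gating units discard redundant information.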

Funder

National Natural Science Foundation of China

Research and development projects

Publisher

MDPI AG

Subject

Electrical and Electronic Engineering, Biochemistry, Instrumentation, Atomic and Molecular Physics, and Optics, Analytical Chemistry


Cited by 7 articles.