Author:
Lakshmi S. Sudha, Rani M. Usha
Abstract
Text summarization is the process of using a system to condense a document or a collection of documents into brief paragraphs or sentences. This paper presents text categorization with BERT, a state-of-the-art deep learning language model that significantly outperforms previous language models, to improve the summarization task. Multi-document summarization (MDS) is bottlenecked by the lack of training data and the varied categories of documents. To address this, the proposed hybrid summarization framework B-HEATS (BERT-based Hybrid Extractive Abstractive Text Summarization) combines an extractive summary obtained via categorization with an abstractive summary generated by an RNN-LSTM-CNN deep learning architecture used to fine-tune BERT, which yields a qualitative summary for multiple documents and mitigates the out-of-vocabulary (OOV) problem. The output layer of BERT is replaced with the RNN-LSTM-CNN architecture during fine-tuning, which improves the summarization model. The proposed automatic text summarization is compared against existing models in terms of ROUGE metrics and achieves high scores on the benchmark DUC datasets: R1 43.61, R2 22.64, R3 44.95, and RL 44.27.
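The abstract's central architectural idea is replacing BERT's output layer with an RNN-LSTM-CNN stack for fine-tuning. Below is a minimal sketch of what such a head might look like in PyTorch with Hugging Face Transformers; the abstract does not give exact layer sizes or wiring, so the hidden dimensions, kernel size, and the per-token scoring objective are illustrative assumptions rather than the authors' implementation.

```python
# A minimal sketch (not the authors' exact code) of fine-tuning BERT with an
# RNN-LSTM-CNN head, assuming PyTorch and Hugging Face Transformers.
# Layer sizes and the per-token scoring objective are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import BertModel

class BertLstmCnnSummarizer(nn.Module):
    def __init__(self, bert_name="bert-base-uncased", hidden=256):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        # Bi-LSTM over BERT's token embeddings (768-dim for bert-base).
        self.lstm = nn.LSTM(768, hidden, batch_first=True, bidirectional=True)
        # 1-D CNN over LSTM outputs to capture local n-gram features.
        self.conv = nn.Conv1d(2 * hidden, hidden, kernel_size=3, padding=1)
        # Score each token position for inclusion in the summary.
        self.classifier = nn.Linear(hidden, 1)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        seq = out.last_hidden_state                # (B, T, 768)
        seq, _ = self.lstm(seq)                    # (B, T, 2*hidden)
        seq = self.conv(seq.transpose(1, 2))       # (B, hidden, T)
        seq = torch.relu(seq).transpose(1, 2)      # (B, T, hidden)
        return self.classifier(seq).squeeze(-1)    # (B, T) selection logits
```

In such a setup the whole stack, including the pre-trained BERT weights, would typically be trained end-to-end on summary-labeled data, with the LSTM-CNN head standing in for BERT's original output layer as the abstract describes.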
Publisher
Universidad Tecnica de Manabi
Subject
Education, General Nursing
Cited by
2 articles.
1. Question Categorizer: An Automated Approach For Organizing Queries;2024 2nd International Conference on Networking and Communications (ICNWC);2024-04-02
2. Abstractive Text Summarization Using BERT for Feature Extraction and Seq2Seq Model for Summary Generation;2023 International Conference on Modeling & E-Information Research, Artificial Learning and Digital Applications (ICMERALDA);2023-11-24