Abstract
Text generation, a subfield of Natural Language Processing (NLP), combines artificial intelligence and computational linguistics to produce new text. Recent advances have made it possible to generate human-like text, with models such as LSTM, GPT, and BART reshaping the field, and applications now span news, reviews, social networks, and poetry composition, among others. In our proposed work, automatic text generation involves training a model that takes input data and generates fresh content related to the subject of that input. We evaluate the output with BERTScore, an evaluation metric for language generation that computes the similarity between two sentences, here the input and the generated text, as a sum of cosine similarities between their token embeddings. Compared with earlier metrics, BERTScore correlates better with human judgments and offers superior model-selection performance. The generated text is then preprocessed, and the cleaned data is used for sentiment classification. Sentiment analysis has drawn considerable attention as a prominent application of natural language processing, and since the rise of deep learning, models such as BERT have proven highly effective at capturing the contextual details of text. In this paper, we explore the application of BERT to sentiment analysis on text generated by a language model, test different methods for fine-tuning BERT, and assess the performance of our models on news datasets. Our findings show that BERT can accurately categorize sentiment in generated text: sentiment classification accuracy reaches 94% for GPT-2-generated text and 96% for BART-generated text.
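To make the evaluation pipeline concrete, the following is a minimal sketch, not the authors' implementation, of the two steps described above, using the open-source bert_score and Hugging Face transformers packages; the classifier checkpoint and the example sentences are illustrative assumptions standing in for the paper's trained models and news data.

    # Minimal sketch (assumed setup, not the paper's code): score a generated
    # sentence against its source with BERTScore, then classify its sentiment
    # with a BERT-family model.
    from bert_score import score
    from transformers import pipeline

    source = "The central bank raised interest rates to curb inflation."         # assumed example
    generated = "Interest rates were raised by the central bank to fight inflation."  # assumed example

    # BERTScore matches the token embeddings of the two sentences by cosine
    # similarity and aggregates the matches into precision/recall/F1 tensors.
    P, R, F1 = score([generated], [source], lang="en")
    print(f"BERTScore F1: {F1.item():.4f}")

    # Sentiment classification of the generated text; a public fine-tuned
    # checkpoint stands in for the models trained in the paper.
    classifier = pipeline("sentiment-analysis",
                          model="distilbert-base-uncased-finetuned-sst-2-english")
    print(classifier(generated))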
Publisher
Research Square Platform LLC