Affiliation:
1. Department of Information Engineering and Computer Science, Feng Chia University, Taichung 407, Taiwan
2. Department of Information Management, Chaoyang University of Technology, Taichung 413, Taiwan
Abstract
Text classification is an important research field in text mining and natural language processing, and it has gained momentum with the growth of social networks. Despite the accuracy gains achieved by deep learning models, existing graph neural network-based methods often overlook the implicit class information within texts. To address this gap, we propose a graph neural network model named LaGCN that improves classification accuracy. LaGCN exploits the latent class information in texts, treating it as explicit class labels, and refines the graph convolution process by adding label-aware nodes that capture document–word, word–word, and word–class correlations. Our experiments on the Ohsumed, Movie Review, 20 Newsgroups, and R8 datasets compare LaGCN with leading-edge models such as HDGCN and BERT and demonstrate its superiority: LaGCN outperformed existing methods, with average accuracy improvements of 19.47%, 10%, 4.67%, and 0.4%, respectively. These results underscore the importance of integrating class information into graph neural networks and set a new benchmark for text classification tasks.
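To make the label-aware graph idea concrete, the sketch below builds a toy heterogeneous text graph with document, word, and class nodes and runs a plain two-layer GCN forward pass. It is only an illustration of the general construction described in the abstract: the toy corpus, the raw-count edge weights (standing in for TF-IDF and PMI), the class-node wiring, and all hyperparameters are assumptions of this sketch, not the paper's actual LaGCN formulation.

```python
import numpy as np

# Hypothetical toy corpus: 2 documents, small vocabulary, 2 classes (assumed data).
docs = ["good movie great acting", "bad plot boring movie"]
labels = [1, 0]                                  # training labels for the documents
vocab = sorted({w for d in docs for w in d.split()})
n_doc, n_word, n_cls = len(docs), len(vocab), 2

# Node ordering: [documents | words | class nodes]; the class nodes play the
# role of the "label-aware" nodes mentioned in the abstract.
N = n_doc + n_word + n_cls
A = np.zeros((N, N))

def wid(w):
    return n_doc + vocab.index(w)

# Document-word edges (raw term frequency stands in for TF-IDF weighting).
for d, text in enumerate(docs):
    for w in text.split():
        A[d, wid(w)] += 1.0

# Word-word edges (within-document co-occurrence stands in for PMI weighting).
for text in docs:
    ws = text.split()
    for i in range(len(ws)):
        for j in range(i + 1, len(ws)):
            if ws[i] != ws[j]:
                A[wid(ws[i]), wid(ws[j])] += 1.0

# Word-class edges: co-occurrence of a word with a training label.
for d, text in enumerate(docs):
    c = n_doc + n_word + labels[d]
    for w in set(text.split()):
        A[wid(w), c] += 1.0

A = A + A.T                                      # undirected graph
A_hat = A + np.eye(N)                            # add self-loops
d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(1))
A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

# Two-layer GCN forward pass: H1 = ReLU(A_norm H W1), Z = softmax(A_norm H1 W2).
rng = np.random.default_rng(0)
H = np.eye(N)                                    # one-hot node features
W1 = rng.normal(scale=0.1, size=(N, 16))
W2 = rng.normal(scale=0.1, size=(16, n_cls))
H1 = np.maximum(A_norm @ H @ W1, 0.0)
logits = A_norm @ H1 @ W2
probs = np.exp(logits) / np.exp(logits).sum(1, keepdims=True)
print(probs[:n_doc])                             # class distributions for the document nodes
```

In this sketch the weights are random and untrained; a real model would learn W1 and W2 by minimizing cross-entropy on the labeled document nodes.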
Funder
National Science and Technology Council