Abstract
When reading texts for text classification, many words are irrelevant to the classification decision, and the traditional self-attention mechanism suffers from limitations in how it distributes weights. To address these problems, this paper proposes SA-SGRU, a text classification model that combines an improved self-attention mechanism with a Skip-GRU (skip-gated recurrent unit) network. First, Skip-GRU, an enhanced variant of the GRU (gated recurrent unit), skips content that is unimportant for classification while reading texts and captures only the effective global information. Then, the improved self-attention mechanism is introduced to redistribute the weights of the deep text sequences. Next, an optimized CNN (convolutional neural network) is combined to extract the local features of the texts. Finally, a Softmax classifier produces the classification results for the sample labels. Experimental results show that the proposed method achieves better performance on three public datasets than other baseline methods, and ablation experiments demonstrate the effectiveness of each module in the proposed model.
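To make the described pipeline concrete, below is a minimal sketch in PyTorch. Everything in it is an assumption for illustration only: the layer sizes, the soft skip gate (a stand-in for Skip-GRU's token-skipping behavior), the standard multi-head attention (a stand-in for the paper's improved self-attention), and the single 1-D convolution (a stand-in for the optimized CNN) are not the authors' exact implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SASGRU(nn.Module):
    """Illustrative sketch of the SA-SGRU pipeline from the abstract.
    All components are hypothetical approximations, not the paper's code."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128, num_classes=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Skip gate (assumed): scores each token so that low-scoring tokens
        # are suppressed before the recurrent pass, a soft form of "skipping".
        self.skip_gate = nn.Linear(embed_dim, 1)
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        # Standard multi-head self-attention approximates the paper's
        # improved self-attention: it redistributes weights over the sequence.
        self.attn = nn.MultiheadAttention(2 * hidden_dim, num_heads=4, batch_first=True)
        # A single 1-D convolution stands in for the "optimized CNN",
        # picking up local n-gram features from the attended sequence.
        self.conv = nn.Conv1d(2 * hidden_dim, hidden_dim, kernel_size=3, padding=1)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):                      # (batch, seq_len)
        x = self.embed(token_ids)                      # (batch, seq_len, embed_dim)
        gate = torch.sigmoid(self.skip_gate(x))        # (batch, seq_len, 1)
        h, _ = self.gru(x * gate)                      # gated tokens -> global context
        a, _ = self.attn(h, h, h)                      # reweight the deep sequence
        c = F.relu(self.conv(a.transpose(1, 2)))       # (batch, hidden_dim, seq_len)
        pooled = c.max(dim=2).values                   # max-pool the local features
        return F.log_softmax(self.fc(pooled), dim=1)   # class log-probabilities

# Usage sketch: classify a dummy batch of 8 sequences of length 50.
model = SASGRU(vocab_size=10000)
logits = model(torch.randint(0, 10000, (8, 50)))
print(logits.shape)  # torch.Size([8, 4])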
Funder
National Natural Science Foundation of China
Natural Science Foundation of Hebei Province
Handan Science and Technology Bureau Foundation
Subject
Fluid Flow and Transfer Processes, Computer Science Applications, Process Chemistry and Technology, General Engineering, Instrumentation, General Materials Science
Cited by 12 articles.