Authors:
Li Xianyong, Ding Li, Du Yajun, Fan Yongquan, Shen Fashan
Abstract
Aspect-level sentiment classification (ASC) is a challenging research task that identifies the sentiment polarities of aspect words in sentences. Previous attention-based methods rarely consider the position information of aspect and context words. For an aspect word in a sentence, its adjacent words should receive more attention than distant words. Based on this consideration, this article designs a position influence vector to represent the position information between an aspect word and its context. By combining the position influence vector, a multi-head self-attention mechanism, and a bidirectional gated recurrent unit (BiGRU), a position-enhanced multi-head self-attention network based on BiGRU (PMHSAT-BiGRU) is proposed. To verify the effectiveness of the proposed model, this article conducts extensive experiments on the SemEval2014 restaurant, SemEval2014 laptop, SemEval2015 restaurant, and SemEval2016 restaurant data sets. The experimental results show that the proposed PMHSAT-BiGRU model clearly outperforms the baselines. Specifically, compared with the original LSTM model, the Accuracy of the proposed PMHSAT-BiGRU model on the four data sets improves by 5.72, 6.06, 4.52, and 3.15%, respectively.
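The core idea of the position influence vector is that words near the aspect term should contribute more than distant ones. A minimal sketch of one common distance-based weighting scheme is shown below; the function names and the linear-decay formula are illustrative assumptions, not the paper's exact definition.

```python
import numpy as np

def position_weights(seq_len, aspect_idx):
    # Hypothetical position influence: the aspect word gets weight 1.0,
    # and weights decay linearly with distance from the aspect
    # (one common choice; the paper's exact formula may differ).
    idx = np.arange(seq_len)
    return 1.0 - np.abs(idx - aspect_idx) / seq_len

def apply_position_influence(embeddings, aspect_idx):
    # Scale each word embedding by its position weight before feeding
    # the sequence into the BiGRU / multi-head self-attention layers.
    w = position_weights(embeddings.shape[0], aspect_idx)
    return embeddings * w[:, None]

# Example: 6 tokens with 4-dimensional embeddings, aspect at position 2.
emb = np.random.rand(6, 4)
weighted = apply_position_influence(emb, aspect_idx=2)
```

Under this sketch, the weighted embeddings retain their original dimensionality, so the position influence can be injected without changing the downstream network architecture.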
Funder
National Natural Science Foundation of China
Cited by: 4 articles.