Affiliation:
1. School of Automation and Electronic Information, Xiangtan University, Xiangtan, China
2. National CIMS Engineering Research Center, Tsinghua University, Beijing, China
Abstract
The long short-term memory (LSTM) network, coupled with an attention mechanism and a memory module, is widely applied in aspect-level sentiment classification. Although this architecture achieves good results, it cannot extract the global and local information of the context at the same time, and it models only the semantic relatedness between an aspect and its corresponding context words while neglecting their syntactic dependencies. This paper proposes an aspect-level sentiment classification approach that combines a convolutional neural network (CNN) with a proximity-weighted convolution network (PWCN), together with a new method for calculating the proximity weight. Contextualized word vectors, used as text features, are obtained from corpora with the bidirectional encoder representations from transformers (BERT) model. The CNN extracts sequence features from the text and thus takes its sequential information into account, while the PWCN captures the syntactic dependencies within sentences. The BERT model can also represent complex properties of words, such as their syntactic and semantic variation across linguistic contexts. Experiments on the SemEval 2014 benchmark demonstrate that the proposed approach is more effective than well-established baselines.
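To illustrate the proximity-weighting idea mentioned in the abstract, the sketch below assigns each context token a weight that decays with its distance from the aspect term and scales the token embeddings before a 1-D convolution. This is a minimal, position-based illustration only: the paper's own weighting scheme (e.g., whether it uses positional or syntactic-dependency distance) and all function names here are assumptions, not the authors' implementation.

```python
import numpy as np

def proximity_weights(seq_len, aspect_start, aspect_end):
    """Position-based proximity weights in [0, 1]: tokens closer to the
    aspect span get weights nearer to 1.  (Illustrative formula; a
    syntax-aware variant would use dependency-tree distance instead.)"""
    w = np.zeros(seq_len)
    for i in range(seq_len):
        if i < aspect_start:
            d = aspect_start - i
        elif i > aspect_end:
            d = i - aspect_end
        else:
            d = 0  # token lies inside the aspect span
        w[i] = 1.0 - d / seq_len
    return w

def proximity_weighted_conv(embeddings, weights, kernel):
    """Scale each token embedding by its proximity weight, then slide a
    (k, dim) kernel over the sequence (valid padding), producing one
    scalar feature per window."""
    x = embeddings * weights[:, None]            # (seq_len, dim)
    k = kernel.shape[0]
    return np.array([(x[i:i + k] * kernel).sum()
                     for i in range(len(x) - k + 1)])

# Toy usage: 5 tokens, aspect is token 2, 4-dim embeddings of ones.
w = proximity_weights(5, 2, 2)                   # [0.6, 0.8, 1.0, 0.8, 0.6]
out = proximity_weighted_conv(np.ones((5, 4)), w, np.ones((3, 4)))
```

In the full model described by the abstract, the all-ones embeddings would be replaced by BERT's contextualized word vectors, and the convolution output would feed the downstream sentiment classifier.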
Funder
National Key Research and Development Project
Subject
Applied Mathematics, Control and Optimization, Instrumentation