1. Attention is all you need;A Vaswani;Advances in Neural Information Processing Systems,2017
2. BERT: Pre-training of deep bidirectional transformers for language understanding;J Devlin;Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies,2019
3. TopicBERT: A topicenhanced neural language model fine-tuned for sentiment classification;Y Zhou;IEEE Transactions on Neural Networks and Learning Systems,2021
4. Attentionemotion-enhanced convolutional LSTM for sentiment analysis;F Huang;IEEE Transactions on Neural Networks and Learning Systems,2021
5. Pre-training with whole word masking for Chinese BERT;Y Cui;IEEE/ACM Transactions on Audio, Speech, and Language Processing,2021