Affiliation:
1. Karpagam College of Engineering, India
Abstract
In recent years, sentiment analysis has become a focal point in natural language processing. Cross-lingual sentiment analysis is a particularly demanding yet essential task that seeks to build models capable of analyzing sentiment effectively across many languages. The primary motivation behind this research is to bridge the gap in current techniques, which often perform poorly on low-resource languages because of the scarcity of large annotated datasets and these languages' unique linguistic characteristics. In light of these challenges, we propose a novel Multi-Stage Deep Learning Architecture (MSDLA) for cross-lingual sentiment analysis of Tamil, a low-resource language. Our approach uses transfer learning from a resource-rich source language to overcome data limitations. The proposed model significantly outperforms existing methods on the Tamil Movie Review dataset, achieving an accuracy, precision, recall, and F1-score of 0.8772, 0.8614, 0.8825, and 0.8718, respectively. ANOVA-based statistical comparison shows that MSDLA's improvements over other models, including mT5, XLM, mBERT, ULMFiT, BiLSTM, LSTM with Attention, and ALBERT with Hugging Face English Embedding, are significant, with p-values all below 0.005. Ablation studies confirm the importance of both cross-lingual semantic attention and domain adaptation in our architecture: without these components, the model's accuracy drops to 0.8342 and 0.8043, respectively. Furthermore, MSDLA demonstrates robust cross-domain performance on the Tamil News Classification and Thirukkural datasets, achieving accuracies of 0.8551 and 0.8624, respectively, significantly outperforming the baseline models. These findings illustrate the robustness and efficacy of our approach and represent a significant contribution to cross-lingual sentiment analysis techniques, especially for low-resource languages.
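For readers who want a concrete picture of the components named in the abstract, the sketch below is a minimal, hypothetical illustration (not the authors' implementation) of how a pretrained multilingual encoder, a cross-lingual attention layer, and an adversarial domain-adaptation head could be wired together in PyTorch. The class names, layer sizes, the choice of bert-base-multilingual-cased, and the use of gradient reversal for domain adaptation are all illustrative assumptions, since the abstract does not specify these details.

import torch
import torch.nn as nn
from transformers import AutoModel  # assumed encoder backend, not specified in the paper

class GradientReversal(torch.autograd.Function):
    """Reverses gradients on the backward pass (standard DANN-style domain adaptation)."""
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.clone()

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None

class MSDLASketch(nn.Module):
    """Illustrative multi-stage model: multilingual encoder -> cross-lingual
    attention -> sentiment head, plus an adversarial domain head."""
    def __init__(self, encoder_name="bert-base-multilingual-cased",
                 hidden=768, num_classes=2, num_domains=2):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        # Stand-in for the cross-lingual semantic attention stage: plain
        # multi-head self-attention over token states; the paper's exact
        # formulation is not given in the abstract.
        self.xl_attn = nn.MultiheadAttention(hidden, num_heads=8, batch_first=True)
        self.sentiment_head = nn.Linear(hidden, num_classes)
        self.domain_head = nn.Linear(hidden, num_domains)

    def forward(self, input_ids, attention_mask, grl_lambda=1.0):
        states = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state
        attended, _ = self.xl_attn(states, states, states,
                                   key_padding_mask=~attention_mask.bool())
        pooled = attended.mean(dim=1)  # simple mean pooling over tokens (a simplification)
        sentiment_logits = self.sentiment_head(pooled)
        domain_logits = self.domain_head(
            GradientReversal.apply(pooled, grl_lambda))
        return sentiment_logits, domain_logits

In such a setup, the sentiment head would be trained on labeled source-language (and any available Tamil) data, while the domain head with gradient reversal encourages the pooled representation to be indistinguishable across source and target domains, which is one common way to realize the transfer-learning and domain-adaptation components described above.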
Publisher
Association for Computing Machinery (ACM)
Cited by
4 articles.