Affiliations:
1. Computer Science Department, King Abdulaziz University, Jeddah, Saudi Arabia
2. Information Systems Department, King Abdulaziz University, Jeddah, Saudi Arabia
Abstract
Multitask learning (MTL) is a machine learning paradigm in which a single model is trained to perform several tasks simultaneously. Despite the considerable amount of research on MTL, most of it has centered on English, while other languages such as Arabic have received far less attention. Most existing Arabic NLP techniques rely on single-task learning or on multitask learning that shares only a small number of tasks, typically two or three. To address this gap, we present ArMT-TNN, an Arabic multitask learning model built on a transformer neural network and designed for Arabic natural language understanding (ANLU) tasks. Our approach shares learned information across eight ANLU tasks, allowing a single model to solve all of them. We achieve this by fine-tuning on all tasks simultaneously, using pre-trained bidirectional transformer language models, such as BERT, that are specifically designed for Arabic. Additionally, we explore the effectiveness of various Arabic language models (LMs) pre-trained on different types of Arabic text, such as Modern Standard Arabic (MSA) and Arabic dialects. Our approach outperformed all current models on four test sets of the ALUE benchmark, namely MQ2Q, OOLD, SVREG, and SEC, by margins of 3.9%, 3.8%, 10.1%, and 3.7%, respectively. However, it did not perform as well on the remaining tasks, owing to negative transfer of knowledge, a finding that highlights the importance of careful task selection when constructing a benchmark. Our experiments also show that LMs pretrained on text types that differ from those of the fine-tuned tasks can still perform well.
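The setup described above is hard parameter sharing: one transformer encoder is shared by all tasks and fine-tuned jointly, with a lightweight output head per task. The paper's exact implementation is not given here, so the following is only a minimal sketch of that pattern in PyTorch with Hugging Face Transformers; the encoder checkpoint (aubmindlab/bert-base-arabertv02), the class name, and the task label counts are illustrative assumptions, not details taken from the paper.

    # Minimal sketch of hard parameter sharing for Arabic multitask learning.
    # Checkpoint, task names, and label counts below are assumptions.
    import torch
    import torch.nn as nn
    from transformers import AutoModel, AutoTokenizer

    class MultiTaskArabicBERT(nn.Module):
        """One shared BERT-style encoder plus one small head per task."""

        def __init__(self, model_name, task_num_labels):
            super().__init__()
            self.encoder = AutoModel.from_pretrained(model_name)  # shared by all tasks
            hidden = self.encoder.config.hidden_size
            # Task-specific linear heads; only the encoder parameters are shared.
            self.heads = nn.ModuleDict(
                {task: nn.Linear(hidden, n) for task, n in task_num_labels.items()}
            )

        def forward(self, task, input_ids, attention_mask):
            out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
            cls = out.last_hidden_state[:, 0]  # [CLS] token representation
            return self.heads[task](cls)       # task-specific logits or score

    # Hypothetical label counts for four ALUE tasks (SVREG as regression -> 1 output).
    task_num_labels = {"MQ2Q": 2, "OOLD": 2, "SEC": 11, "SVREG": 1}
    name = "aubmindlab/bert-base-arabertv02"   # one Arabic LM choice among several
    model = MultiTaskArabicBERT(name, task_num_labels)
    tokenizer = AutoTokenizer.from_pretrained(name)

    # Joint fine-tuning: each step draws a mini-batch from one task and updates
    # both the shared encoder and that task's head with the task's own loss.
    batch = tokenizer(["جملة عربية للتجربة"], return_tensors="pt", padding=True)
    logits = model("OOLD", batch["input_ids"], batch["attention_mask"])

Under this kind of setup, a joint training loop would cycle mini-batches across all eight tasks, applying a per-task loss (e.g., cross-entropy for classification, mean squared error for SVREG-style regression) so that every task's gradients flow through the shared encoder.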