Authors:
Yagi Sane Mo, Mansour Youssef, Kamalov Firuz, Elnagar Ashraf
Cited by: 5 articles.