Publisher: Springer Nature Switzerland
Cited by: 1 article