1. Wissam Antoun. 2020. AraBERT: Transformer-based model for Arabic language understanding. arXiv preprint arXiv:2003.00104 (2020).
2. Andrew Bawitlung, Sandeep Kumar Dash, Robert Lalramhluna, and Alexander Gelbukh. 2024. An approach to Mizo language news classification using machine learning. In Data Science and Network Engineering, Suyel Namasudra, Munesh Chandra Trivedi, Ruben Gonzalez Crespo, and Pascal Lorenz (Eds.). Springer Nature Singapore, Singapore, 165–180.
3. Jereemi Bentham, Partha Pakray, Goutam Majumder, Sunday Lalbiaknia, and Alexander Gelbukh. 2016. Identification of rules for recognition of named entity classes in Mizo language. In Proceedings of the 2016 15th Mexican International Conference on Artificial Intelligence (MICAI’16). IEEE, 8–13.
4. Tom B. Brown, Benjamin Mann, Nick Ryder, Melanie Subbiah, Jared Kaplan, Prafulla Dhariwal, Arvind Neelakantan, Pranav Shyam, Girish Sastry, Amanda Askell, Sandhini Agarwal, Ariel Herbert-Voss, Gretchen Krueger, Tom Henighan, Rewon Child, Aditya Ramesh, Daniel M. Ziegler, Jeffrey Wu, Clemens Winter, Christopher Hesse, Mark Chen, Eric Sigler, Mateusz Litwin, Scott Gray, Benjamin Chess, Jack Clark, Christopher Berner, Sam McCandlish, Alec Radford, Ilya Sutskever, and Dario Amodei. 2020. Language models are few-shot learners. arXiv:2005.14165 [cs.CL] (2020).
5. Branden Chan, Stefan Schweter, and Timm Möller. 2020. German's next language model. In Proceedings of the 28th International Conference on Computational Linguistics (COLING'20). 6788–6796.