1. OpenAI. Introducing ChatGPT. https://openai.com/blog/chatgpt/, 2022.
2. Zeng, A., Liu, X., Du, Z., Wang, Z., Lai, H., Ding, M., Yang, Z., Xu, Y., Zheng, W., Xia, X., Tam, W. L., Ma, Z., Xue, Y., Zhai, J., Chen, W., Liu, Z., Zhang, P., Dong, Y., and Tang, J. GLM-130B: An open bilingual pre-trained model. In The Eleventh International Conference on Learning Representations, pp. 1--56, 2023.
3. Du, Z., Qian, Y., Liu, X., Ding, M., Qiu, J., Yang, Z., et al. GLM: General language model pretraining with autoregressive blank infilling. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1), pp. 320--335, 2022.
4. Touvron, H., Lavril, T., Izacard, G., Martinet, X., Lachaux, M.-A., Lacroix, T., Rozière, B., Goyal, N., Hambro, E., Azhar, F., et al. LLaMA: Open and efficient foundation language models. arXiv preprint arXiv:2302.13971, 2023.
5. Yang, A., Xiao, B., Wang, B., Zhang, B., Bian, C., Yin, C., Lv, C., Pan, D., Wang, D., Yan, D., et al. Baichuan 2: Open large-scale language models. arXiv preprint arXiv:2309.10305, 2023.