Authors:
Ren Xiaoyu, Bai Yuanchen, Duan Huiyu, Fan Lei, Fei Erkang, Wu Geer, Ray Pradeep, Hu Menghan, Yan Chenyuan, Zhai Guangtao
Publisher:
Springer Nature Singapore
References (67 articles):
1. OpenAI: ChatGPT (2022). https://chat.openai.com/
2. OpenAI: GPT-4 technical report (2023)
3. Zeng, A., et al.: GLM-130B: an open bilingual pre-trained model. arXiv preprint arXiv:2210.02414 (2022)
4. Du, Z., et al.: GLM: general language model pretraining with autoregressive blank infilling. In: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 320–335 (2022)
5. Touvron, H., et al.: LLaMA: open and efficient foundation language models. arXiv preprint arXiv:2302.13971 (2023)