1. Brown, T., Mann, B., Ryder, N., Subbiah, M., Kaplan, J., Dhariwal, P., Neelakantan, A., Shyam, P., Sastry, G., and Askell, A. (2020). Language Models are Few-Shot Learners. Adv. Neural Inf. Process. Syst. 33.
2. Scao, T.L., Fan, A., Akiki, C., Pavlick, E., Ilić, S., Hesslow, D., Castagné, R., Luccioni, A.S., Yvon, F., and Gallé, M. (2022). Bloom: A 176b-parameter open-access multilingual language model. arXiv.
3. OpenAI (2023). GPT-4 Technical Report. arXiv.
4. Radford, A., Wu, J., Child, R., Luan, D., Amodei, D., and Sutskever, I. (2019). Language Models are Unsupervised Multitask Learners. OpenAI Blog, Available online: https://insightcivic.s3.us-east-1.amazonaws.com/language-models.pdf.
5. Keskar, N.S., McCann, B., Varshney, L.R., Xiong, C., and Socher, R. (2019). CTRL: A Conditional Transformer Language Model for Controllable Generation. arXiv.