1. Radford A, Narasimhan K, Salimans T, Sutskever I. Improving language understanding by generative pre-training [Internet]. [cited 2023 Dec 26]. Available from: https://www.mikecaptain.com/resources/pdf/GPT-1.pdf
2. Radford A, Wu J, Child R, Luan D, Amodei D, Sutskever I. Language Models are Unsupervised Multitask Learners [Internet]. [cited 2023 Dec 26]. Available from: https://insightcivic.s3.us-east-1.amazonaws.com/language-models.pdf
3. Brown T, Mann B, Ryder N, et al. Language models are few-shot learners. Adv Neural Inf Process Syst 2020.
4. OpenAI, Achiam J, et al. GPT-4 Technical Report [Internet]. arXiv [cs.CL]. 2023. Available from: http://arxiv.org/abs/2303.08774
5. Touvron H, Lavril T, Izacard G, et al. LLaMA: Open and Efficient Foundation Language Models [Internet]. arXiv [cs.CL]. 2023. Available from: http://arxiv.org/abs/2302.13971