1. Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN, Kaiser Ł, Polosukhin I. Attention Is All You Need. In: Proceedings of the 31st Conference on Neural Information Processing Systems (NIPS 2017); 2017 Dec 4–9; Long Beach, CA, USA. https://arxiv.org/abs/1706.03762
2. OpenAI. GPT-4 Technical Report. arXiv preprint. 2023. https://doi.org/10.48550/arXiv.2303.08774
3. Brown TB, Mann B, Ryder N, Subbiah M, Kaplan J, Dhariwal P, Neelakantan A, Shyam P, Sastry G, Askell A, Agrawal S, Herbert-Voss A, Krueger G, Henighan T, Child R, et al. Language Models are Few-Shot Learners. In: Proceedings of the 34th Conference on Neural Information Processing Systems (NeurIPS 2020); 2020 Dec 6–12; Vancouver, Canada.
4. Raffel C, Shazeer N, Roberts A, Lee K, Narang S, Matena M, Zhou Y, Li W, Liu PJ. Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer. Journal of Machine Learning Research. 2020;21(140):1–67.
5. Lee J, Yoon W, Kim S, Kim D, Kim S, So CH, Kang J. BioBERT: a pre-trained biomedical language representation model for biomedical text mining. Bioinformatics. 2020;36(4):1234–40.