1. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2019), vol. 1, pp. 4171–4186 (2019)
2. Korngiebel, D.M., Mooney, S.D.: Considering the possibilities and pitfalls of generative pre-trained transformer 3 (GPT-3) in healthcare delivery. NPJ Digit. Med. 4, 1–3 (2021). https://doi.org/10.1038/s41746-021-00464-x
3. Ajayi, D.: How BERT and GPT models change the game for NLP. IBM Watson Blog
4. OpenAI. https://openai.com/. Accessed 20 Feb 2022
5. Koleck, T.A., Dreisbach, C., Bourne, P.E., Bakken, S.: Natural language processing of symptoms documented in free-text narratives of electronic health records: a systematic review. J. Am. Med. Inform. Assoc. 26, 364–379 (2019). https://doi.org/10.1093/jamia/ocy173