1. Chiu, B., Baker, S.: Word embeddings for biomedical natural language processing: a survey. Language and Linguistics Compass 14(12), e12402 (2020)
2. Peters, M. E., Neumann, M., Zettlemoyer, L., Yih, W.: Dissecting contextual word embeddings: architecture and representation. arXiv preprint arXiv:1808.08949 (2018)
3. Min, B., Ross, H., Sulem, E., Veyseh, A. P. B., Nguyen, T. H., Sainz, O., Agirre, E., Heintz, I., Roth, D.: Recent advances in natural language processing via large pre-trained language models: a survey. ACM Comput. Surv. 56(2), 1–40 (2023)
4. Kanade, A., Maniatis, P., Balakrishnan, G., Shi, K.: Pre-trained contextual embedding of source code (2019)
5. Kanade, A., Maniatis, P., Balakrishnan, G., Shi, K.: Learning and evaluating contextual embedding of source code. In: International conference on machine learning, pp. 5110–5121. PMLR (2020)