1. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Burstein, J., Doran, C., Solorio, T. (eds.) Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (Jun 2019)
2. Gordon, M.D., Dumais, S.: Using latent semantic indexing for literature based discovery. J. Am. Soc. Inf. Sci. 49(8), 674–685 (1998)
3. Grootendorst, M.: Keybert: minimal keyword extraction with bert (2020)
4. Grootendorst, M.: BERTopic: neural topic modeling with a class-based TF-IDF procedure (2022)
5. Hofstätter, S., Althammer, S., Schröder, M., Sertkan, M., Hanbury, A.: Improving efficient neural ranking models with cross-architecture knowledge distillation (2020)