1. Enriching Word Vectors with Subword Information; Bojanowski et al., 2017
2. Does BERT Make Any Sense? Interpretable Word Sense Disambiguation with Contextualized Embeddings; Wiedemann et al., 2019
3. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding; Devlin et al.; Proceedings of the Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2019
4. BERTopic: Neural Topic Modeling with a Class-based TF-IDF Procedure; Grootendorst, 2022
5. Topic Models for Taxonomies