1. BERT: Pre-training of deep bidirectional transformers for language understanding;devlin;Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1, Minneapolis, Minnesota, Association for Computational Linguistics,2019
2. Pre-train, prompt, and predict: A systematic survey of prompting methods in natural language processing;liu;ACM Comput Surv,2022
3. The INCEpTION platform: Machine-assisted and knowledge-oriented interactive annotation;klie;Proceedings of the 27th International Conference on Computational Linguistics: System Demonstrations, Association for Computational Linguistics,2018
4. CamemBERT: a Tasty French Language Model;martin;Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics,2020
5. Named entity recognition for French medieval charters;aguilar;Workshop on Natural Language Processing for Digital Humanities (NLP4DH),2021