1. Choudhry, A., et al.: Transformer-based named entity recognition for French using adversarial adaptation to similar domain corpora (2022). https://doi.org/10.48550/ARXIV.2212.03692. Accessed 11 Jan 2023
2. Copara, J., Knafou, J., Naderi, N., Moro, C., Ruch, P., Teodoro, D.: Contextualized French language models for biomedical named entity recognition. In: Actes de la 6e conférence conjointe Journées d’Études sur la Parole (JEP, 33e édition), Traitement Automatique des Langues Naturelles (TALN, 27e édition), Rencontre des Étudiants Chercheurs en Informatique pour le Traitement Automatique des Langues (RÉCITAL, 22e édition). Atelier DÉfi Fouille de Textes. pp. 36–48. ATALA et AFCP, Nancy, France (2020). https://aclanthology.org/2020.jeptalnrecital-deft.4. Accessed 11 Jan 2023
3. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers). pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423. Accessed 11 Jan 2023
4. Ganin, Y., Lempitsky, V.: Unsupervised domain adaptation by backpropagation (2014). https://doi.org/10.48550/ARXIV.1409.7495. Accessed 11 Jan 2023
5. Ganin, Y., et al.: Domain-adversarial training of neural networks. J. Mach. Learn. Res. 17(59), 1–35 (2016). http://jmlr.org/papers/v17/15-239.html. Accessed 11 Jan 2023