1. Arnold, S., van Aken, B., Grundmann, P., Gers, F.A., Löser, A.: Learning contextualized document representations for healthcare answer retrieval. In: WWW ’20: Proceedings of the Web Conference 2020, pp. 1332–1343 (2020)
2. Beltagy, I., Peters, M.E., Cohan, A.: Longformer: the long-document transformer. arXiv preprint arXiv:2004.05150 (2020)
3. Costa, W.M., Pedrosa, G.V.: A textual representation based on bag-of-concepts and thesaurus for legal information retrieval. In: Proceedings of the 10th Symposium on Knowledge Discovery, Mining and Learning, pp. 114–121. SBC (2022)
4. Cui, Y., Che, W., Liu, T., Qin, B., Wang, S., Hu, G.: Revisiting pre-trained models for Chinese natural language processing. In: Findings of the Association for Computational Linguistics: EMNLP 2020, pp. 657–668 (2020)
5. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, vol. 1 (Long and Short Papers), pp. 4171–4186 (2019)