1. Sentence-t5: Scalable sentence encoders from pre-trained text-to-text models;ni;ArXiv Preprint,2021
2. Long-former: The long-document transformer;beltagy;ArXiv Preprint,2020
3. A summary of the coliee 2019 competition;rabelo;New Frontiers in Artificial Intelligence JSAI-isAI International Workshops JURISIN AI-Biz LENLS,2019
4. Llama: Open and efficient foundation language models;touvron;ArXiv Preprint,2023
5. BERT: Pre-training of deep bidirectional transformers for language understanding;devlin;NAACL,2019