1. Brown, T., et al.: Language models are few-shot learners. In: Advances in Neural Information Processing Systems, vol. 33, pp. 1877–1901 (2020)
2. Debbarma, R., Prawar, P., Chakraborty, A., Bedathur, S.: IITDLI: legal case retrieval based on lexical models. In: Workshop of the Tenth Competition on Legal Information Extraction/Entailment (COLIEE 2023), held in conjunction with the 19th International Conference on Artificial Intelligence and Law (ICAIL) (2023)
3. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805 (2018)
4. Goebel, R., Kano, Y., Kim, M.Y., Rabelo, J., Satoh, K., Yoshioka, M.: Summary of the competition on legal information, extraction/entailment (COLIEE) 2023. In: Proceedings of the Nineteenth International Conference on Artificial Intelligence and Law, ICAIL 2023, pp. 472–480. Association for Computing Machinery, New York (2023). https://doi.org/10.1145/3594536.3595176
5. Hoang, L., Bui, T., Nguyen, C., Nguyen, L.M.: AIEPU at ALQAC 2023: deep learning methods for legal information retrieval and question answering. In: 2023 15th International Conference on Knowledge and Systems Engineering (KSE), pp. 1–6 (2023). https://doi.org/10.1109/KSE59128.2023.10299426