1. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, vol. 1, pp. 4171–4186 (2019)
2. Governatori, G., Bench-Capon, T., Verheij, B., Araszkiewicz, M., Francesconi, E., Grabmair, M.: Thirty years of artificial intelligence and law: the first decade. Artif. Intell. Law 30(4), 481–519 (2022)
3. He, P., Liu, X., Gao, J., Chen, W.: DeBERTa: decoding-enhanced BERT with disentangled attention. In: Proceedings of the 9th International Conference on Learning Representations (2021)
4. Huang, Q., Luo, X.: State-of-the-art and development trend of artificial intelligence combined with law. Comput. Sci. 45(12), 1–11 (2018)
5. Jin, D., Gao, S., Kao, J.Y., Chung, T., Hakkani-Tur, D.: MMM: multi-stage multi-task learning for multi-choice reading comprehension. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 8010–8017 (2020)