1. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of NAACL (2019)
2. Ministry of Education, Culture, Sports, Science and Technology of Japan: The courses of study (2017 version) (2017). https://www.mext.go.jp/a_menu/shotou/new-cs/youryou/eiyaku/1298356.html
3. Ehara, Y.: No meaning left unlearned: predicting learners’ knowledge of atypical meanings of words from vocabulary tests for their typical meanings. In: Proceedings of Educational Data Mining (short paper) (2022)
4. Fujishiro, N., Otaki, Y., Kawachi, S.: Accuracy of the sentence-BERT semantic search system for a Japanese database of closed medical malpractice claims. Appl. Sci. 13(6), 4051 (2023). https://doi.org/10.3390/app13064051
5. He, P., Liu, X., Gao, J., Chen, W.: DeBERTa: decoding-enhanced BERT with disentangled attention. arXiv preprint arXiv:2006.03654 (2020)