1. R. Iida et al., “Intra-Sentential Subject Zero Anaphora Resolution Using Multi-Column Convolutional Neural Network,” Proc. of the 2016 Conf. on Empirical Methods in Natural Language Processing, pp. 1244-1254, 2016. https://doi.org/10.18653/v1/D16-1132
2. R. Sasano and S. Kurohashi, “A Discriminative Approach to Japanese Zero Anaphora Resolution with Large-Scale Lexicalized Case Frames,” Proc. of the 5th Int. Joint Conf. on Natural Language Processing, pp. 758-766, 2011.
3. H. J. Levesque et al., “The Winograd Schema Challenge,” Proc. of the 13th Int. Conf. on Principles of Knowledge Representation and Reasoning, pp. 552-561, 2012.
4. J. Devlin et al., “BERT: Pre-Training of Deep Bidirectional Transformers for Language Understanding,” Proc. of the 2019 Conf. of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Vol.1 (Long and Short Papers), pp. 4171-4186, 2019.
5. Y. Liu et al., “RoBERTa: A Robustly Optimized BERT Pretraining Approach,” arXiv:1907.11692, 2019.