1. Z. Zhang, Y. Wu, J. Zhou, S. Duan, H. Zhao, R. Wang, SG-Net: syntax-guided machine reading comprehension, in: Proceedings of the AAAI Conference on Artificial Intelligence, 2020.
2. Z. Yang, Z. Dai, Y. Yang, J. Carbonell, R.R. Salakhutdinov, Q.V. Le, XLNet: generalized autoregressive pretraining for language understanding, in: Advances in Neural Information Processing Systems, 2019, pp. 5753–5763.
3. J. Devlin, M.-W. Chang, K. Lee, K. Toutanova, BERT: pre-training of deep bidirectional transformers for language understanding, arXiv preprint arXiv:1810.04805, 2018.
4. P. Rajpurkar, J. Zhang, K. Lopyrev, P. Liang, SQuAD: 100,000+ questions for machine comprehension of text, in: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, 2016, pp. 2383–2392.
5. A. Talmor, J. Berant, MultiQA: an empirical investigation of generalization and transfer in reading comprehension, in: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 2019.