1. Beltagy, I., Peters, M.E., Cohan, A.: Longformer: the long-document transformer. CoRR abs/2004.05150 (2020)
2. Chen, D., Bolton, J., Manning, C.D.: A thorough examination of the CNN/Daily Mail reading comprehension task. In: ACL (2016)
3. Chen, Z., Cui, Y., Ma, W., Wang, S., Hu, G.: Convolutional spatial attention model for reading comprehension with multiple-choice questions. In: AAAI, pp. 6276–6283 (2019)
4. Devlin, J., Chang, M., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: NAACL-HLT, pp. 4171–4186 (2019)
5. Duan, L., Gao, J., Li, A.: A study on solution strategy of option-problems in machine reading comprehension. J. Chin. Inf. Process. 33(10), 81–89 (2019)