Affiliation:
1. Northeastern University, Heping District, Shenyang, China
Abstract
Most existing multi-document machine reading comprehension models focus mainly on the interactions between the input question and the documents, but ignore two further kinds of understanding. First, understanding the semantic meaning of words in the question and the documents from the perspective of each other. Second, understanding the supporting cues for a correct answer from both intra-document and inter-document perspectives. Overlooking these two kinds of understanding causes models to miss information that can be helpful for finding correct answers. To overcome this deficiency, we propose a deep-understanding-based model for multi-document machine reading comprehension. It consists of three cascaded deep understanding modules, designed respectively to understand the accurate semantic meaning of words, the interactions between the question and the documents, and the supporting cues for the correct answer. We evaluate our model on two large-scale benchmark datasets, TriviaQA Web and DuReader. Extensive experiments show that our model achieves state-of-the-art results on both datasets.
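The cascade described in the abstract can be sketched as a pipeline of three stages. The sketch below is purely illustrative: all class names, interfaces, and the token-overlap scoring are assumptions for exposition, not the paper's published implementation (which would use neural cross-attention and learned scoring).

```python
# Illustrative sketch (assumed names, not the paper's actual code) of a
# three-stage cascade for multi-document reading comprehension.

class SemanticUnderstanding:
    """Stage 1: refine word representations of the question and documents
    from the perspective of each other (placeholder pass-through here)."""
    def __call__(self, question, documents):
        return question, documents

class InteractionUnderstanding:
    """Stage 2: model interactions between the question and each document."""
    def __call__(self, question, documents):
        return [(question, doc) for doc in documents]

class SupportingCueUnderstanding:
    """Stage 3: gather supporting cues and select an answer span/document.
    Here a toy token-overlap score stands in for learned cue aggregation."""
    def __call__(self, pairs):
        def overlap(pair):
            question, doc = pair
            return len(set(question.split()) & set(doc.split()))
        return max(pairs, key=overlap)[1]

def answer(question, documents):
    # Cascade the three understanding modules, as in the abstract.
    q, docs = SemanticUnderstanding()(question, documents)
    pairs = InteractionUnderstanding()(q, docs)
    return SupportingCueUnderstanding()(pairs)

docs = ["Paris is the capital of France.", "Berlin is in Germany."]
print(answer("What is the capital of France?", docs))
```

The cascade structure means each module consumes the previous module's output, so the supporting-cue stage operates on question-aware document representations rather than raw text.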
Funder
National Natural Science Foundation of China
Fundamental Research Funds for the Central Universities
Publisher
Association for Computing Machinery (ACM)
Cited by: 2 articles.