Funder
National Natural Science Foundation of China
Science and Technology Planning Project of Shenzhen Municipality
Development and Reform Commission of Shenzhen Municipality
Publisher
Springer Science and Business Media LLC
Cited by 1 article.