Affiliation:
1. Chinese Academy of Sciences & University of Chinese Academy of Sciences, Beijing, China
Funders:
National Key R&D Program of China
Youth Innovation Promotion Association CAS
Foundation and Frontier Research Key Program of Chongqing Science and Technology Commission
Beijing Academy of Artificial Intelligence (BAAI)
National Natural Science Foundation of China (NSFC)
References: 47 articles.
1. Danqi Chen. 2018. Neural Reading Comprehension and Beyond. Ph.D. Dissertation. Stanford University.
2. Christopher Clark and Matt Gardner. 2017. Simple and effective multi-paragraph reading comprehension. arXiv preprint arXiv:1710.10723 (2017).
3. Zihang Dai, Qizhe Xie, and Eduard Hovy. 2018. From Credit Assignment to Entropy Regularization: Two New Algorithms for Neural Sequence Prediction.
4. Maha Elbayad, Laurent Besacier, and Jakob Verbeek. 2018. Token-level and sequence-level loss smoothing for RNN language models. arXiv preprint arXiv:1805.05062 (2018).
Cited by: 1 article.