Funders
National Natural Science Foundation of China
National Key Research and Development Program of China
Major Science and Technology Project of Hainan Province