Subject
Artificial Intelligence, Information Systems and Management, Management Information Systems, Software
References: 53 articles.
1. B.Y. Lin, W. Zhou, M. Shen, P. Zhou, C. Bhagavatula, Y. Choi, X. Ren, CommonGen: A Constrained Text Generation Challenge for Generative Commonsense Reasoning, in: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: Findings, 2020, pp. 1823–1840, http://dx.doi.org/10.18653/v1/2020.findings-emnlp.165.
2. A. Radford, Improving Language Understanding by Generative Pre-Training, 2018.
3. A. Radford, Language Models are Unsupervised Multitask Learners, OpenAI Blog, 2019.
4. M. Lewis, Y. Liu, N. Goyal, M. Ghazvininejad, A. Mohamed, O. Levy, V. Stoyanov, L. Zettlemoyer, BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension, in: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 2020, pp. 7871–7880, http://dx.doi.org/10.18653/v1/2020.acl-main.703.
5. C. Raffel, Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer, J. Mach. Learn. Res., 2020.
Cited by: 5 articles.