Subject
Artificial Intelligence, Information Systems and Management, Management Information Systems, Software