Funder: National Natural Science Foundation of China
Cited by: 1 article.