Author:
Wang Yong, Chen Linjun, Gao Cuiyun, Fang Yingtao, Li Yong
Funder:
Key Project of Anhui University Natural Science Foundation
National Natural Science Foundation of China
Anhui Province Scientific Research Planning Project
The University Synergy Innovation Program of Anhui Province
Key Project of Natural Science Research of Higher Education Institutions of Anhui Province of China
Publisher:
Springer Science and Business Media LLC
References (43 articles):
1. Brown, T., Mann, B., Ryder, N., Subbiah, M., Kaplan, J.D., Dhariwal, P., Neelakantan, A., Shyam, P., Sastry, G., Askell, A., et al.: Language models are few-shot learners. Adv. Neural Inf. Process. Syst. 33, 1877–1901 (2020)
2. Chen, Y., Gao, C., Ren, X., Peng, Y., Xia, X., Lyu, M.R.: API usage recommendation via multi-view heterogeneous graph representation learning. IEEE Trans. Softw. Eng. 49(5), 3289–3304 (2023)
3. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805 (2018)
4. Fowkes, J., Sutton, C.: Parameter-free probabilistic API mining across GitHub. In: Proceedings of the 2016 24th ACM SIGSOFT International Symposium on Foundations of Software Engineering, pp. 254–265 (2016)
5. Gao, T., Fisch, A., Chen, D.: Making pre-trained language models better few-shot learners. arXiv preprint arXiv:2012.15723 (2020)