Publisher: Springer Nature Singapore
References (40 articles)
1. Niu, C., Li, C., Ng, V., et al.: SPT-Code: sequence-to-sequence pre-training for learning the representation of source code. arXiv preprint arXiv:2201.01549 (2022)
2. Feng, Z., Guo, D., Tang, D., et al.: CodeBERT: a pre-trained model for programming and natural languages. arXiv preprint arXiv:2002.08155 (2020)
3. Guo, D., Ren, S., Lu, S., et al.: GraphCodeBERT: pre-training code representations with data flow. arXiv preprint arXiv:2009.08366 (2020)
4. Sun, F.K., Ho, C.H., Lee, H.Y.: LAMOL: language modeling for lifelong language learning. arXiv preprint arXiv:1909.03329 (2019)
5. Schwartz, R., Dodge, J., Smith, N.A., et al.: Green AI. Commun. ACM 63(12), 54–63 (2020)