1. Language models are few-shot learners; Brown; Advances in Neural Information Processing Systems, 2020
2. BERT: Pre-training of deep bidirectional transformers for language understanding; Devlin; arXiv preprint, 2018
3. Evaluating large language models trained on code; Chen; arXiv preprint, 2021
4. Code completion with neural attention and pointer networks; Li; arXiv preprint, 2017