1. Wasi Uddin Ahmad, Saikat Chakraborty, Baishakhi Ray, and Kai-Wei Chang. 2021. Unified Pre-training for Program Understanding and Generation. arXiv preprint arXiv:2103.06333 (2021).
2. Kyunghyun Cho, Bart Van Merriënboer, Dzmitry Bahdanau, and Yoshua Bengio. 2014. On the properties of neural machine translation: Encoder-decoder approaches. arXiv preprint arXiv:1409.1259 (2014).
4. Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. 2018. BERT: Pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805 (2018).
5. Yangruibo Ding, Luca Buratti, Saurabh Pujar, Alessandro Morari, Baishakhi Ray, and Saikat Chakraborty. 2021. Contrastive Learning for Source Code with Structural and Functional Properties. arXiv preprint arXiv:2110.03868 (2021).