1. Anton Barua, Stephen W. Thomas, and Ahmed E. Hassan. 2014. What are developers talking about? An analysis of topics and trends in Stack Overflow. Empirical Software Engineering 19, 3 (2014), 619-654.
2. Iz Beltagy, Kyle Lo, and Arman Cohan. 2019. SciBERT: A Pretrained Language Model for Scientific Text. arXiv:1903.10676 [cs.CL]
3. Samuel R. Bowman, Gabor Angeli, Christopher Potts, and Christopher D. Manning. 2015. A large annotated corpus for learning natural language inference. In Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing (EMNLP). arXiv:1508.05326 [cs.CL]
4. Luca Buratti, Saurabh Pujar, Mihaela Bornea, Scott McCarley, Yunhui Zheng, Gaetano Rossiello, Alessandro Morari, Jim Laredo, Veronika Thost, Yufan Zhuang, et al. 2020. Exploring software naturalness through neural language models. arXiv preprint arXiv:2006.12641 (2020).
5. Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. 2018. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. CoRR abs/1810.04805 (2018). arXiv:1810.04805 http://arxiv.org/abs/1810.04805