1. Acunetix. 2021. Spring 2021 Edition: Acunetix Web Vulnerability Report. Retrieved January 15, 2024, from https://www.acunetix.com/white-papers/acunetix-web-application-vulnerability-report-2021/.
2. Kaspersky. 2021. PHP language source code compromise attempt. Retrieved January 15, 2024, from https://www.kaspersky.com/blog/php-git-backdor/39191/.
3. Feng, Z., Guo, D., Tang, D., Duan, N., Feng, X., Gong, M., Shou, L., Qin, B., Liu, T., and Jiang, D. 2020. CodeBERT: A Pre-Trained Model for Programming and Natural Languages. arXiv preprint arXiv:2002.08155.
4. Devlin, J., Chang, M.-W., Lee, K., and Toutanova, K. 2018. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. arXiv preprint arXiv:1810.04805.
5. Liu, Y., Ott, M., Goyal, N., Du, J., Joshi, M., Chen, D., Levy, O., Lewis, M., Zettlemoyer, L., and Stoyanov, V. 2019. RoBERTa: A Robustly Optimized BERT Pretraining Approach. arXiv preprint arXiv:1907.11692.