1. Creative Commons license - Wikipedia. https://en.wikipedia.org/wiki/Creative_Commons_license. Accessed 1 Apr 2022
2. Stack Overflow - Wikipedia. https://en.wikipedia.org/wiki/Stack_Overflow. Accessed 4 Apr 2022
3. Vaswani, A., et al.: Attention is all you need. In: Advances in Neural Information Processing Systems, vol. 30 (2017)
4. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805 (2018)
5. Liu, Y., et al.: RoBERTa: a robustly optimized BERT pretraining approach. arXiv preprint arXiv:1907.11692 (2019)