1. Anikin, A., Kultsova, M., Zhukova, I., Sadovnikova, N., Litovkin, D.: Knowledge based models and software tools for learning management in open learning network. In: Kravets, A., Shcherbakov, M., Kultsova, M., Iijima, T. (eds.) Knowledge-Based Software Engineering. JCKBSE 2014. Communications in Computer and Information Science, vol. 466, pp. 156–171. Springer, Cham (2014). https://doi.org/10.1007/978-3-319-11854-3_15
2. Anikin, A.: Advances in Intelligent Systems and Computing (2019)
3. Conneau, A., et al.: Unsupervised cross-lingual representation learning at scale. arXiv preprint arXiv:1911.02116 (2019)
4. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186 (2019)
5. Karpukhin, V., Baranchukov, A., Burtsev, M., Tsetlin, Y., Gusev, G.: ruGPT-3: large-scale Russian language models with few-shot learning capabilities. In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP) (2021)