1. Bird, S., Klein, E., and Loper, E. (2009). Natural language processing with Python: analyzing text with the natural language toolkit. Sebastopol, CA: O’Reilly Media, Inc., 509 p.
2. Chen, T., and Guestrin, C. (2016). XGBoost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 785–794, San Francisco, CA. Accessible at: https://doi.org/10.1145/2939672.2939785.
3. Devlin, J., Chang, M. W., Lee, K., and Toutanova, K. (2019). BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 4171–4186, Minneapolis, Minnesota. Accessible at: https://doi.org/10.18653/v1/N19-1423.
4. Dorogush, A., Ershov, V., and Gulin, A. (2018). CatBoost: gradient boosting with categorical features support. arXiv preprint, 7 p. Accessible at: https://arxiv.org/abs/1810.11363.
5. Fares, M., Kutuzov, A., Oepen, S., and Velldal, E. (2017). Word vectors, reuse, and replicability: Towards a community repository of large-text resources. In Proceedings of the 21st Nordic Conference on Computational Linguistics, pages 271–276, Gothenburg, Sweden. Accessible at: https://aclanthology.org/W17-0237/.