1. Kalchbrenner, N. and Blunsom, P., Recurrent continuous translation models, in Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing (EMNLP), Association for Computational Linguistics, 2013, pp. 1700–1709.
2. Klein, G., Hernandez, F., Nguyen, V., and Senellart, J., The OpenNMT neural machine translation toolkit: 2020 edition, in Proceedings of the 14th Conference of the Association for Machine Translation in the Americas (AMTA 2020), Vol. 1: MT Research Track, 2020, pp. 102–109.
3. Li, X., Liu, L., Tu, Z., Li, G., Shi, S., and Meng, M.Q.H., Attending from foresight: A novel attention mechanism for neural machine translation, IEEE/ACM Trans. Audio, Speech, Lang. Process., 2021, vol. 29, pp. 2606–2616. https://doi.org/10.1109/TASLP.2021.3097939
4. Rubino, R., Marie, B., Dabre, R., et al., Extremely low-resource neural machine translation for Asian languages, Mach. Transl., 2020, vol. 34, pp. 347–382. https://doi.org/10.1007/s10590-020-09258-6
5. Bahdanau, D., Cho, K., and Bengio, Y., Neural machine translation by jointly learning to align and translate, in Proceedings of the 3rd International Conference on Learning Representations (ICLR), San Diego, 2015. https://doi.org/10.48550/arXiv.1409.0473