1. Rei, R., Stewart, C., Farinha, A.C. and Lavie, A. (2020). COMET: A neural framework for MT evaluation. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), Online. Association for Computational Linguistics.
2. Pawar, S., Tonmoy, S.M.T.I., Zaman, S.M.M., Jain, V., Chadha, A. and Das, A. (2024). The what, why, and how of context length extension techniques in large language models – A detailed survey.
3. Kobus, C., Crego, J. and Senellart, J. (2017). Domain control for neural machine translation. In Proceedings of the International Conference Recent Advances in Natural Language Processing (RANLP 2017), Varna, Bulgaria.
4. Kocmi, T. and Federmann, C. (2023). GEMBA-MQM: Detecting translation quality error spans with GPT-4. In Proceedings of the Eighth Conference on Machine Translation (WMT 2023), Singapore. Association for Computational Linguistics.
5. Song, K., Zhang, Y., Yu, H., Luo, W., Wang, K. and Zhang, M. (2019). Code-switching for enhancing NMT with pre-specified translation. In Burstein, J., Doran, C. and Solorio, T. (eds), Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), Minneapolis, Minnesota. Association for Computational Linguistics, pp. 449–459.