Neural Machine Translation for Low-resource Languages: A Survey

Authors:

Surangika Ranathunga [1], En-Shiun Annie Lee [2], Marjana Prifti Skenduli [3], Ravi Shekhar [4], Mehreen Alam [5], Rishemjit Kaur [6]

Affiliations:

1. University of Moratuwa, Katubedda, Sri Lanka

2. University of Toronto, Toronto, Canada

3. University of New York Tirana, Tirana, Albania

4. Queen Mary University of London, London, UK

5. National University of Computer and Emerging Sciences, Pakistan

6. CSIR-Central Scientific Instruments Organisation, Chandigarh, India

Abstract

Neural Machine Translation (NMT) has seen tremendous growth in less than ten years and has already entered a mature phase. While considered the most widely used solution for Machine Translation, its performance on low-resource language pairs remains sub-optimal compared to that on high-resource pairs, due to the unavailability of large parallel corpora. Therefore, the implementation of NMT techniques for low-resource language pairs has been receiving the spotlight recently, leading to substantial research on this topic. This article presents a detailed survey of research advancements in low-resource language NMT (LRL-NMT), along with a quantitative analysis to identify the most popular techniques. Based on our findings, we provide guidelines for selecting a suitable NMT technique for a given LRL data setting. We also present a holistic view of the LRL-NMT research landscape and provide recommendations to further enhance the research efforts.

Publisher

Association for Computing Machinery (ACM)

Subject

General Computer Science, Theoretical Computer Science

References: 217 articles (first five listed below).

1. Mostafa Abdou, Vladan Glončák, and Ondřej Bojar. 2017. Variable mini-batch sizing and pre-trained embeddings. In Proceedings of the 2nd Conference on Machine Translation. 680–686.

2. Haluk Açarçiçek, Talha Çolakoğlu, Pınar Ece Aktan Hatipoğlu, Chong Hsuan Huang, and Wei Peng. 2020. Filtering noisy parallel corpus using transformers with proxy task learning. In Proceedings of the 5th Conference on Machine Translation. 940–946.

3. Roee Aharoni, Melvin Johnson, and Orhan Firat. 2019. Massively multilingual neural machine translation. In Proceedings of the North American Chapter of the Association for Computational Linguistics (NAACL’19). 3874–3884.

4. Alham Fikri Aji, Nikolay Bogoychev, Kenneth Heafield, and Rico Sennrich. 2020. In neural machine translation, what does transfer learning transfer? In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (ACL’20). 7701–7710.

5. Nabeel T. Alsohybe, Neama Abdulaziz Dahan, and Fadl Mutaher Ba-Alwi. 2017. Machine-translation history and evolution: Survey for Arabic-English translations. arXiv preprint arXiv:1709.04685 (2017).

Cited by 29 articles.
