Authors:
Liu Ding, Li Yachao, Zhu Dengyun, Liu Xuan, Ma Ning, Zhu Ao
Abstract
In recent years, the emergence of neural networks has provided new ideas for natural language processing, and neural machine translation has become the leading approach to machine translation. For low-resource languages, bilingual data are sparse, the model requires large amounts of high-quality data, and translation quality falls short of the desired level. In this paper, attention-based neural machine translation experiments are conducted on the Tibetan-Chinese language pair, and transfer learning combined with back-translation is used to alleviate the shortage of Tibetan-Chinese parallel corpora. Experimental results show that the proposed combination of transfer learning and back-translation is simple and effective, and that the translation quality is significantly improved over traditional translation methods. Analysis of the output shows that Tibetan-Chinese neural machine translation produces more fluent translations, a clear improvement over translation without back-translation. At the same time, deficiencies common to neural machine translation remain, such as under-translation and low faithfulness to the source.
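To make the data-augmentation step concrete, below is a minimal sketch of the back-translation idea described in the abstract: a reverse (Chinese to Tibetan) model translates monolingual Chinese sentences into synthetic Tibetan, and the resulting synthetic pairs are mixed with the scarce genuine parallel corpus used to train the Tibetan-to-Chinese model. The `reverse_translate` callable and the dummy stand-in model are hypothetical placeholders for illustration only, not the authors' implementation.

```python
# Sketch of back-translation data augmentation for low-resource NMT.
# Assumption: a Chinese -> Tibetan reverse model already exists and is exposed
# here as a simple string-to-string callable (hypothetical placeholder).

from typing import Callable, List, Tuple


def back_translate(
    monolingual_zh: List[str],
    reverse_translate: Callable[[str], str],
) -> List[Tuple[str, str]]:
    """Return synthetic (Tibetan source, Chinese target) sentence pairs."""
    pairs = []
    for zh_sentence in monolingual_zh:
        synthetic_bo = reverse_translate(zh_sentence)  # synthetic Tibetan source
        pairs.append((synthetic_bo, zh_sentence))      # genuine Chinese target
    return pairs


def augment_corpus(
    genuine_pairs: List[Tuple[str, str]],
    synthetic_pairs: List[Tuple[str, str]],
) -> List[Tuple[str, str]]:
    """Mix genuine and synthetic pairs for training the Tibetan -> Chinese model."""
    return genuine_pairs + synthetic_pairs


if __name__ == "__main__":
    # Stand-in reverse model for illustration only; a real system would load a
    # trained Chinese -> Tibetan NMT model here.
    dummy_reverse = lambda zh: "<synthetic Tibetan for: %s>" % zh
    synthetic = back_translate(["今天天气很好。"], dummy_reverse)
    print(augment_corpus([], synthetic))
```

In this scheme only the source side of the added pairs is synthetic, while the target (Chinese) side remains human-written text, which is why back-translation can improve fluency of the Tibetan-to-Chinese output without requiring additional genuine parallel data.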
Subject
General Physics and Astronomy
Cited by
1 article.
1. Using Machine learning Techniques to Improve the Accuracy and Fluency of Japanese Translation;2023 International Conference on Internet of Things, Robotics and Distributed Computing (ICIRDC);2023-12-29