Neural Machine Translation of Electrical Engineering Based on Integrated Convolutional Neural Networks
Published: 2023-08-25
Volume: 12
Issue: 17
Page: 3604
ISSN: 2079-9292
Container-title: Electronics
Short-container-title: Electronics
Language: en
Author:
Liu Zikang 1,2, Chen Yuan 3, Zhang Juwei 1,2
Affiliation:
1. School of Information Engineering, Henan University of Science and Technology, Luoyang 471023, China
2. Henan Province New Energy Vehicle Power Electronics and Power Transmission Engineering Research Center, Luoyang 471023, China
3. School of Foreign Languages, Henan University of Science and Technology, Luoyang 471023, China
Abstract
Research has shown that neural machine translation performs poorly on low-resource and domain-specific parallel corpora. In this paper, we focus on neural machine translation in the field of electrical engineering. To address mistranslations caused by the Transformer model's limited ability to extract feature information from certain sentences, we propose two new models that integrate a convolutional neural network as a feature extraction layer into the Transformer. The feature information extracted by the CNN is fused on the source side and the target side, respectively, which strengthens the Transformer's feature extraction, optimizes model performance, and improves translation quality. On an electrical engineering dataset, the proposed source-side and target-side models improve BLEU scores by 1.63 and 1.12 percentage points, respectively, over the baseline model. In addition, both models learn rich semantic knowledge without relying on auxiliary knowledge such as part-of-speech tagging or named entity recognition, which reduces labor and time costs.
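The abstract describes fusing CNN-extracted features into a Transformer on the source (encoder) side or target (decoder) side. The sketch below is only an illustration of the source-side idea, not the authors' exact architecture: the layer sizes, kernel width, and simple additive fusion are assumptions for demonstration purposes.

```python
# Illustrative sketch (assumed architecture, not the paper's exact model):
# a 1D CNN extracts local n-gram features from source embeddings, and the
# result is fused with the embeddings before a standard Transformer encoder.
import torch
import torch.nn as nn


class CNNFusedEncoder(nn.Module):
    def __init__(self, vocab_size, d_model=512, n_heads=8, n_layers=6, kernel_size=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # 1D convolution over the sequence dimension captures local features.
        self.conv = nn.Conv1d(d_model, d_model, kernel_size, padding=kernel_size // 2)
        encoder_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, n_layers)

    def forward(self, src_tokens):            # src_tokens: (batch, seq_len)
        x = self.embed(src_tokens)             # (batch, seq_len, d_model)
        # Conv1d expects (batch, channels, seq_len), so transpose around the conv.
        conv_feats = self.conv(x.transpose(1, 2)).transpose(1, 2)
        # Fuse CNN features with the embeddings (assumed residual-style addition).
        fused = x + torch.relu(conv_feats)
        return self.encoder(fused)             # contextualized source representations


if __name__ == "__main__":
    model = CNNFusedEncoder(vocab_size=1000)
    tokens = torch.randint(0, 1000, (2, 10))   # 2 toy sentences, 10 tokens each
    print(model(tokens).shape)                 # torch.Size([2, 10, 512])
```

A target-side variant would apply the same kind of CNN fusion to the decoder input embeddings instead; the fusion operator and placement in the actual models may differ from this sketch.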
Subject
Electrical and Electronic Engineering, Computer Networks and Communications, Hardware and Architecture, Signal Processing, Control and Systems Engineering