Sequence-to-sequence neural machine translation for English-Malay
Published: 2022-06-01
Issue: 2
Volume: 11
Page: 658
ISSN: 2252-8938
Container-title: IAES International Journal of Artificial Intelligence (IJ-AI)
Short-container-title: IJ-AI
Author: Phua Yeong Tsann, Navaratnam Sujata, Kang Chon-Moy, Che Wai-Seong
Abstract
Machine translation aims to translate text from one language into another using computer software. In this work, we performed neural machine translation with an attention mechanism on an English-Malay parallel corpus, and attempted to improve model performance with rectified linear unit (ReLU) attention alignment. Several sequence-to-sequence models were trained: long short-term memory (LSTM), gated recurrent unit (GRU), bidirectional LSTM (Bi-LSTM), and bidirectional GRU (Bi-GRU). In the experiments, both bidirectional models, Bi-LSTM and Bi-GRU, converged in under 30 epochs. Our study shows that ReLU attention alignment improves the bilingual evaluation understudy (BLEU) score by between 0.26 and 1.12 across all models compared to the original tanh models.
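The abstract does not spell out the attention equations, but the described change is plausibly a swap of the activation in an additive (Bahdanau-style) alignment score, score_t = v · act(W1·h_t + W2·s), from tanh to ReLU. The following NumPy sketch illustrates that substitution; all weight shapes and names here are illustrative assumptions, not the authors' actual implementation:

```python
import numpy as np

def additive_attention_scores(enc_states, dec_state, W1, W2, v, activation=np.tanh):
    """Additive alignment: score_t = v . activation(W1 h_t + W2 s).

    enc_states: (T, d_h) encoder hidden states
    dec_state:  (d_s,)   current decoder state
    """
    proj = enc_states @ W1.T + dec_state @ W2.T  # (T, d_att), broadcast over time
    return activation(proj) @ v                  # (T,) one score per source position

def softmax(x):
    e = np.exp(x - x.max())  # shift for numerical stability
    return e / e.sum()

# Toy dimensions and random parameters (illustrative only)
rng = np.random.default_rng(0)
T, d_h, d_s, d_att = 5, 8, 8, 6
enc = rng.normal(size=(T, d_h))
dec = rng.normal(size=(d_s,))
W1 = rng.normal(size=(d_att, d_h))
W2 = rng.normal(size=(d_att, d_s))
v = rng.normal(size=(d_att,))

relu = lambda x: np.maximum(x, 0.0)
# Original tanh alignment vs. the ReLU variant studied in the paper
weights_tanh = softmax(additive_attention_scores(enc, dec, W1, W2, v, np.tanh))
weights_relu = softmax(additive_attention_scores(enc, dec, W1, W2, v, relu))
```

Both variants produce a probability distribution over the source positions; only the activation inside the alignment score differs.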
Publisher
Institute of Advanced Engineering and Science
Subject
Electrical and Electronic Engineering,Artificial Intelligence,Information Systems and Management,Control and Systems Engineering
Cited by
1 article.