Abstract
Neural machine translation (NMT) is a fast-evolving MT paradigm that has shown good results, particularly in large-training-data settings, for several language pairs. In this paper, we develop a neural machine translation system for the Sanskrit-to-Malayalam language pair. An attention-based mechanism was exploited in particular for building the translation system. Word sense disambiguation (WSD) is the task of disambiguating text so that the machine can infer the proper sense of a particular word. Sequential deep learning approaches such as the recurrent neural network (RNN), the gated recurrent unit (GRU), long short-term memory (LSTM), and bi-directional LSTM (BLSTM) were used to analyze the tagged data. By adding morphological elements and evolutionary word sense disambiguation, the proposed common character-word embedding-based NMT model achieves a BLEU score of 38.58, which is higher than that of the other models.
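For readers unfamiliar with the attention-based encoder-decoder setup mentioned above, the following is a minimal, hypothetical sketch (not the authors' code) of a BLSTM encoder paired with an attentional LSTM decoder, written in PyTorch. All names, dimensions, and vocabulary sizes are illustrative assumptions, and the combined character-word embedding and WSD components of the paper are not reproduced here.

```python
# Minimal attention-based encoder-decoder sketch (assumed architecture, illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F


class BLSTMEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=256, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True, bidirectional=True)

    def forward(self, src):                              # src: (batch, src_len)
        outputs, _ = self.lstm(self.embed(src))          # (batch, src_len, 2*hid_dim)
        return outputs


class AttentionDecoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=256, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim + 2 * hid_dim, hid_dim, batch_first=True)
        self.attn = nn.Linear(hid_dim, 2 * hid_dim)      # bilinear-style attention scoring
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, tgt, enc_outputs):
        emb = self.embed(tgt)                            # (batch, tgt_len, emb_dim)
        hidden, logits = None, []
        for t in range(emb.size(1)):
            # Query = previous decoder state (zeros at the first step).
            query = hidden[0][-1] if hidden is not None else emb.new_zeros(
                emb.size(0), self.attn.in_features)
            # Score each encoder state, normalize, and build a context vector.
            scores = torch.bmm(enc_outputs, self.attn(query).unsqueeze(2)).squeeze(2)
            context = torch.bmm(F.softmax(scores, dim=1).unsqueeze(1), enc_outputs)
            step_in = torch.cat([emb[:, t:t + 1, :], context], dim=2)
            out, hidden = self.lstm(step_in, hidden)
            logits.append(self.out(out))
        return torch.cat(logits, dim=1)                  # (batch, tgt_len, vocab_size)


# Toy usage with random token ids (shapes only; real data would be Sanskrit/Malayalam tokens).
src = torch.randint(0, 1000, (2, 7))
tgt = torch.randint(0, 1200, (2, 5))
logits = AttentionDecoder(1200)(tgt, BLSTMEncoder(1000)(src))   # (2, 5, 1200)
```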
Publisher
Institute of Advanced Engineering and Science
Subject
Electrical and Electronic Engineering, Control and Optimization, Computer Networks and Communications, Hardware and Architecture, Information Systems, Signal Processing
Cited by
4 articles.