Affiliation:
1. School of Computer Science, Wuhan University, China
2. Department of Computer Science and Engineering, The Hong Kong University of Science and Technology, China
Abstract
With the wide application of machine translation, the testing of Machine Translation Systems (MTSs) has attracted much attention. Recent works apply Metamorphic Testing (MT) to address the oracle problem in MTS testing. Existing MT methods for MTSs generally follow a workflow of input transformation and output relation comparison: they generate a follow-up input sentence by mutating the source input, and then compare the source and follow-up output translations to detect translation errors. These methods use various input transformations to generate test case pairs and have successfully triggered numerous translation errors. However, they are limited in performing fine-grained and rigorous output relation comparison, and thus may report many false alarms and miss many true errors. In this paper, we propose a word closure-based output comparison method to address the limitations of the existing MTS MT methods. We first propose word closure as a new comparison unit, where each closure includes a group of correlated input and output words in the test case pair. Word closures link each fragment in the source output translation to its counterpart in the follow-up output for comparison. Next, we compare semantics at the level of word closures to identify translation errors. In this way, we perform a fine-grained and rigorous semantic comparison of the outputs and thus achieve more effective violation identification. We evaluate our method with the test cases generated by five existing input transformations and the translation outputs from three popular MTSs. Results show that our method significantly outperforms the existing works in violation identification by improving both precision and recall, achieving an average increase of 29.9% in F1 score. It also increases the F1 score of translation error localization by 35.9%.
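The core idea in the abstract — grouping correlated input and output words into closures and then comparing the outputs closure by closure — can be sketched in a few lines. The sketch below is a hypothetical simplification, not the paper's implementation: word correlations are given as precomputed alignment pairs (a real system would derive them from word-alignment and mutation information), closures are built with union-find, and a toy token-overlap score stands in for the paper's semantic similarity measure.

```python
from collections import defaultdict

# Word ids are (side, index) tuples; sides are assumed labels:
# 'si' = source input, 'so' = source output,
# 'fi' = follow-up input, 'fo' = follow-up output.

def _find(parent, x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path halving
        x = parent[x]
    return x

def build_word_closures(pairs, words):
    """Union-find over correlated word pairs -> list of closures (sets)."""
    parent = {w: w for w in words}
    for a, b in pairs:
        ra, rb = _find(parent, a), _find(parent, b)
        if ra != rb:
            parent[rb] = ra
    groups = defaultdict(set)
    for w in words:
        groups[_find(parent, w)].add(w)
    return list(groups.values())

def token_similarity(frag_a, frag_b):
    """Toy bag-of-words overlap standing in for semantic similarity."""
    sa, sb = set(frag_a), set(frag_b)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

def detect_violations(closures, src_out, fol_out, threshold=0.5):
    """Flag closures whose source-output and follow-up-output fragments
    disagree (here: low token overlap below a chosen threshold)."""
    violations = []
    for cl in closures:
        frag_src = [src_out[i] for side, i in sorted(cl) if side == 'so']
        frag_fol = [fol_out[i] for side, i in sorted(cl) if side == 'fo']
        if (frag_src or frag_fol) and \
                token_similarity(frag_src, frag_fol) < threshold:
            violations.append((frag_src, frag_fol))
    return violations

# Toy example: a meaning-preserving mutation, so each closure's two
# output fragments should agree; "football"/"basketball" should not.
src_out = ["he", "plays", "football"]
fol_out = ["he", "plays", "basketball"]
words = [(s, i) for s in ('si', 'so', 'fi', 'fo') for i in range(3)]
pairs = (
    [(('si', i), ('so', i)) for i in range(3)]   # source alignment
    + [(('fi', i), ('fo', i)) for i in range(3)]  # follow-up alignment
    + [(('si', i), ('fi', i)) for i in range(3)]  # input correspondence
)
closures = build_word_closures(pairs, words)
violations = detect_violations(closures, src_out, fol_out)
```

With these inputs, exactly one closure (the one containing "football" and "basketball") falls below the similarity threshold and is reported as a violation; the closures for "he" and "plays" agree and pass.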
Publisher
Association for Computing Machinery (ACM)
References: 69 articles.