Abstract
Grammatical error correction (GEC) is a major task in natural language processing (NLP) that has recently attracted considerable attention from researchers. GEC performance for high-resource languages such as English and Chinese has improved significantly, largely owing to powerful neural network models and pretrained language models. Motivated by the strong results achieved for these languages and the lack of research on GEC for low-resource languages, especially Indonesian, this paper proposes an automatic Indonesian grammar-correction model based on the Transformer architecture that can also be applied to texts in other low-resource languages. Furthermore, we build a large Indonesian corpus that can be used to evaluate future Indonesian GEC systems. We evaluate our models on this dataset, and the results show that the Transformer-based automatic error-correction model achieves significant improvements over previous research models.
Funder
Guangzhou Key Area R&D Program
Subject
Fluid Flow and Transfer Processes, Computer Science Applications, Process Chemistry and Technology, General Engineering, Instrumentation, General Materials Science
References: 42 articles.
Cited by: 10 articles.