Identification and Correction of Grammatical Errors in Ukrainian Texts Based on Machine Learning Technology
-
Published:2023-02-10
Issue:4
Volume:11
Page:904
-
ISSN:2227-7390
-
Container-title:Mathematics
-
language:en
-
Short-container-title:Mathematics
Author:
Lytvyn Vasyl (1), Pukach Petro (2), Vysotska Victoria (1,3), Vovk Myroslava (2), Kholodna Nataliia (1)
Affiliation:
1. Information Systems and Networks Department, Lviv Polytechnic National University, 12 Bandera Str., 79013 Lviv, Ukraine
2. Institute of Applied Mathematics and Fundamental Sciences, Lviv Polytechnic National University, 12 Bandera Str., 79013 Lviv, Ukraine
3. Institute of Computer Science, Osnabrück University, 1 Friedrich-Janssen-Str., 49076 Osnabrück, Germany
Abstract
A machine learning model for correcting errors in Ukrainian texts has been developed. It was established that the neural network is able to correct simple sentences written in Ukrainian; however, a full-fledged system additionally requires dictionary-based spell checking and rule checking, covering both simple rules and rules based on dependency parsing or other features. To save computing resources, a pre-trained neural network of the BERT (Bidirectional Encoder Representations from Transformers) type was used. Such neural networks have half as many parameters as other pre-trained models and show satisfactory results in correcting grammatical and stylistic errors. Among the ready-made models, the pre-trained mT5 model (a multilingual variant of T5, the Text-to-Text Transfer Transformer) showed the best performance according to the BLEU (bilingual evaluation understudy) and METEOR (metric for evaluation of translation with explicit ordering) metrics.
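As a rough illustration of the text-to-text correction setup described in the abstract (a hypothetical sketch, not the authors' code), the Python snippet below loads a multilingual T5 (mT5) checkpoint with the Hugging Face Transformers library and generates a corrected variant of a Ukrainian sentence. The checkpoint name "google/mt5-small", the correct() helper, and the generation settings are illustrative assumptions; a checkpoint fine-tuned on Ukrainian error/correction pairs would be needed for meaningful output.

# Hypothetical sketch: applying an mT5 encoder-decoder model to Ukrainian
# grammatical error correction in a text-to-text fashion.
from transformers import MT5ForConditionalGeneration, MT5Tokenizer

# Assumption: a base checkpoint is used here for illustration; a model
# fine-tuned on Ukrainian error/correction sentence pairs would replace it.
model_name = "google/mt5-small"
tokenizer = MT5Tokenizer.from_pretrained(model_name)
model = MT5ForConditionalGeneration.from_pretrained(model_name)

def correct(sentence: str) -> str:
    # Encode the (possibly erroneous) input sentence and let the seq2seq
    # model generate a corrected version with beam search.
    inputs = tokenizer(sentence, return_tensors="pt")
    outputs = model.generate(**inputs, max_length=128, num_beams=4)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

print(correct("Я пішов до школа."))  # a fine-tuned model would be expected to return "Я пішов до школи."

System outputs produced this way could then be scored against reference corrections with BLEU and METEOR using standard tools such as sacrebleu or nltk, matching the evaluation metrics named in the abstract.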
Subject
General Mathematics, Engineering (miscellaneous), Computer Science (miscellaneous)
Cited by
4 articles.