Affiliation:
1. ANLP Research Group, MIRACL Lab., University of Sfax, Tunisia
2. ANLP Research Group, MIRACL Lab., ISIMa, University of Monastir, Tunisia
Abstract
Tokenization is the process of segmenting a piece of text into smaller units called tokens. Since Arabic is an agglutinative language by nature, this step is crucial preprocessing for many Natural Language Processing (NLP) applications, such as morphological analysis, parsing, machine translation, and information extraction. In this article, we investigate the word tokenization task combined with a rewriting process that restores the orthography of the stem. For this task, we use Tunisian Arabic (TA) text. To the best of the researchers' knowledge, this is the first study to address word tokenization for TA. We therefore start by collecting and preparing various TA corpora from different sources. We then compare three character-based tokenizers built on Conditional Random Fields (CRF), Support Vector Machines (SVM), and Deep Neural Networks (DNN). The best proposed model, using CRF, achieved an F-measure of 88.9%.
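The character-based tokenizers described in the abstract can be framed as sequence labeling: each character of a word receives a label marking whether it begins a new token, and the classifier (CRF, SVM, or DNN) predicts these labels. A minimal sketch of this encoding and decoding, using a common B/I labeling scheme and a hypothetical transliterated example word (not taken from the paper's data), might look like:

```python
def chars_to_labels(tokens):
    """Encode a pre-segmented word as per-character labels:
    'B' marks the first character of a token, 'I' marks the rest."""
    labels = []
    for token in tokens:
        labels.append("B")
        labels.extend("I" * (len(token) - 1))
    return labels

def labels_to_tokens(word, labels):
    """Decode a predicted B/I label sequence back into tokens."""
    tokens = []
    for ch, lab in zip(word, labels):
        if lab == "B" or not tokens:
            tokens.append(ch)       # start a new token
        else:
            tokens[-1] += ch        # continue the current token
    return tokens

# Hypothetical transliterated example: a clitic "w" attached to a stem "ktbt".
print(chars_to_labels(["w", "ktbt"]))             # ['B', 'B', 'I', 'I', 'I']
print(labels_to_tokens("wktbt", ["B", "B", "I", "I", "I"]))  # ['w', 'ktbt']
```

In this framing, training data is built by running `chars_to_labels` over gold segmentations, and the model's per-character predictions are turned back into tokens with `labels_to_tokens`; the paper's additional rewriting step for stem orthography would then apply on top of the decoded tokens.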
Publisher
Association for Computing Machinery (ACM)
Cited by: 1 article.