Digit recognition using decimal coding and artificial neural network
-
Published:2021-12-02
Issue:1
Volume:49
-
ISSN:2307-4108
-
Container-title:Kuwait Journal of Science
-
Description:KJS publishes peer-reviewed articles in Mathematics, Computer Science, Physics, Statistics, Biology, Chemistry, and Earth & Environmental Sciences.
Author:
Toufik Datsi, Khalid Aznag, Ahmed El Oirrak
Abstract
Current artificial neural network image recognition techniques use all the pixels of an image as input. In this paper, we present an efficient method for handwritten digit recognition that extracts the characteristics of a digit image by coding each row of the image as a decimal value, i.e., by transforming the row's binary representation into a decimal value. This method is called the decimal coding of rows. The set of decimal values computed from the initial image is arranged as a vector and normalized; these values serve as the inputs to the artificial neural network. The approach proposed in this work uses a multilayer perceptron neural network for the classification, recognition, and prediction of handwritten digits from 0 to 9. In this study, a dataset of 1797 samples was obtained from the digit database included in the Scikit-learn library. Backpropagation was used as the learning algorithm to train the multilayer perceptron neural network. The results show that the proposed approach achieves better performance than two other schemes in terms of recognition accuracy and execution time.
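The row-coding step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the binarization threshold (here 8, the midpoint of Scikit-learn's 0-16 pixel range) and the normalization constant (255, the maximum 8-bit row value) are assumptions, since the abstract does not specify them.

```python
# Sketch of "decimal coding of rows" on the Scikit-learn digits dataset.
# Assumptions (not stated in the abstract): binarization threshold = 8,
# normalization by 255 (the largest value an 8-bit row can encode).
import numpy as np
from sklearn.datasets import load_digits

def decimal_row_coding(image, threshold=8):
    """Binarize an 8x8 digit image, encode each row's bit pattern
    as a decimal value, and normalize the result to [0, 1]."""
    binary = (image >= threshold).astype(int)        # 8x8 matrix of 0/1
    weights = 2 ** np.arange(binary.shape[1])[::-1]  # [128, 64, ..., 1]
    decimals = binary @ weights                      # one decimal per row
    return decimals / 255.0                          # normalized 8-vector

digits = load_digits()  # 1797 samples, each an 8x8 grayscale image
features = np.array([decimal_row_coding(img) for img in digits.images])
print(features.shape)   # (1797, 8): 8 inputs per digit instead of 64 pixels
```

The resulting 8-dimensional vectors would then feed a multilayer perceptron (e.g. one trained with backpropagation), reducing the input layer from 64 pixel units to 8 row codes.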
Publisher
Kuwait Journal of Science
Subject
Multidisciplinary