Smart Home Automation-Based Hand Gesture Recognition Using Feature Fusion and Recurrent Neural Network
Authors:
Alabdullah, Bayan Ibrahim 1; Ansar, Hira 2; Al Mudawi, Naif 3; Alazeb, Abdulwahab 3; Alshahrani, Abdullah 4; Alotaibi, Saud S. 5; Jalal, Ahmad 2
Affiliations:
1. Department of Information Systems, College of Computer and Information Sciences, Princess Nourah bint Abdulrahman University, P.O. Box 84428, Riyadh 11671, Saudi Arabia
2. Department of Computer Science, Air University, E-9, Islamabad 44000, Pakistan
3. Department of Computer Science, College of Computer Science and Information System, Najran University, Najran 55461, Saudi Arabia
4. Department of Computer Science and Artificial Intelligence, College of Computer Science and Engineering, University of Jeddah, Jeddah 21589, Saudi Arabia
5. Information Systems Department, Umm Al-Qura University, Makkah 24382, Saudi Arabia
Abstract
Gestures have long served as a means of nonverbal communication, and human–computer interaction (HCI) via gestures is becoming increasingly common. To achieve higher recognition rates, traditional interfaces rely on devices such as gloves, physical controllers, and markers. This study presents a new markerless technique for capturing gestures without physical barriers or expensive hardware. In this paper, dynamic gestures are first converted into frames; noise is removed and intensity is adjusted in preparation for feature extraction. The hand is then detected in the images, and its skeleton is computed mathematically. From the skeleton, three types of features are extracted: joint color cloud, neural gas, and directional active model. The features are then optimized, and a selected feature set is passed to a recurrent neural network (RNN) classifier to obtain classification results with higher accuracy. The proposed model is trained and experimentally assessed on three datasets: HaGRI, EgoGesture, and Jester, achieving accuracies of 92.57%, 91.86%, and 91.57%, respectively. To check the model's reliability, the proposed method was also tested on the WLASL dataset, attaining 90.43% accuracy. This paper also includes a comparison with state-of-the-art recognition methods. Our model achieves a higher accuracy rate with a markerless approach, saving money and time while classifying gestures for better interaction.
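The classification stage described in the abstract can be illustrated with a minimal sketch: a vanilla RNN consumes one fused feature vector per frame and emits a gesture class after the last frame. All dimensions, parameter names, and the random weights below are illustrative assumptions, not the paper's actual architecture or trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: each frame yields one fused feature vector
# (standing in for the joint color cloud, neural gas, and directional
# active model features described in the abstract).
FEAT_DIM, HIDDEN, N_CLASSES, SEQ_LEN = 48, 32, 18, 30

# Randomly initialised parameters; a trained system would load real weights.
W_xh = rng.standard_normal((HIDDEN, FEAT_DIM)) * 0.1
W_hh = rng.standard_normal((HIDDEN, HIDDEN)) * 0.1
W_hy = rng.standard_normal((N_CLASSES, HIDDEN)) * 0.1
b_h = np.zeros(HIDDEN)
b_y = np.zeros(N_CLASSES)

def classify_gesture(frames: np.ndarray) -> int:
    """Run a vanilla RNN over per-frame fused features; return a class index."""
    h = np.zeros(HIDDEN)
    for x in frames:                      # one fused feature vector per frame
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
    logits = W_hy @ h + b_y
    probs = np.exp(logits - logits.max())  # softmax over gesture classes
    probs /= probs.sum()
    return int(np.argmax(probs))

# A fake sequence standing in for one preprocessed gesture clip.
clip = rng.standard_normal((SEQ_LEN, FEAT_DIM))
predicted_class = classify_gesture(clip)
```

In practice the recurrent cell would be a trained LSTM/GRU-style module and the feature vectors would come from the skeleton-based extraction pipeline; this sketch only shows the data flow from per-frame features to a single gesture label.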
Funder
Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia; Najran University
Subject
Electrical and Electronic Engineering; Biochemistry; Instrumentation; Atomic and Molecular Physics, and Optics; Analytical Chemistry