Affiliation:
1. Institute of Information and Computing Technologies CS MES RK, Almaty, KAZAKHSTAN
Abstract
This article analyzes the best-known sign languages and the relationships among them, and presents the development of a hand-gesture recognition system for a verbal robot applied to the Kazakh language. The proposed system includes a touch sensor that measures the electrical properties of the user's skin on contact, providing more accurate information for simulating and displaying the gestures of the robot hand. Within the system, the recognition speed and accuracy of each gesture of the verbal robot are calculated: the average recognition accuracy exceeded 98%, and the detection time was 3 ms on a 1.9 GHz Jetson Nano processor, which is sufficient for a robot that displays natural-language gestures. A complete finger-spelling alphabet of the Kazakh sign language for the verbal robot is also proposed. A machine learning method was used to improve the quality of gesture recognition. The operability of the developed gesture-recognition technique was tested, and computational experiments evaluated the effectiveness of the algorithms and software by which the verbal robot responds to voice commands through automatic recognition of multilingual human speech. The authors thus propose an intelligent verbal complex implemented in Python, with a communication module based on CMUSphinx, a graphical command-execution simulator based on PyOpenGL, and a robot manipulation module based on 3D models from ABB.
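The abstract reports per-gesture recognition speed and accuracy (averaging over 98%). A minimal sketch of how per-gesture accuracy could be tallied from labeled recognition trials; the gesture labels and trial data below are illustrative placeholders, not taken from the article:

```python
from collections import defaultdict

def per_gesture_accuracy(trials):
    """Compute recognition accuracy for each gesture.

    trials: iterable of (true_gesture, predicted_gesture) pairs,
    one pair per recognition attempt.
    Returns a dict mapping each gesture to its accuracy in [0, 1].
    """
    hits = defaultdict(int)    # correct recognitions per gesture
    totals = defaultdict(int)  # total attempts per gesture
    for true, pred in trials:
        totals[true] += 1
        if pred == true:
            hits[true] += 1
    return {g: hits[g] / totals[g] for g in totals}

# Illustrative trials for two hypothetical finger-spelling gestures.
trials = [
    ("A", "A"), ("A", "A"), ("A", "B"),
    ("B", "B"), ("B", "B"),
]
acc = per_gesture_accuracy(trials)           # {"A": 0.666..., "B": 1.0}
overall = sum(p == t for t, p in trials) / len(trials)  # 0.8
```

Averaging such per-gesture figures over the full alphabet is one way to arrive at an overall accuracy like the 98% the article reports.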
Publisher
World Scientific and Engineering Academy and Society (WSEAS)
Subject
Artificial Intelligence,General Mathematics,Control and Systems Engineering
References (27 articles)
Cited by (1 article)
1. WiFi and BLE Fingerprinting for Smartphone Proximity Detection. 2022 6th European Conference on Electrical Engineering & Computer Science (ELECS), December 2022.