MEMS Devices-Based Hand Gesture Recognition via Wearable Computing

Author:

Wang Huihui 1, Ru Bo 2, Miao Xin 2, Gao Qin 3, Habib Masood 2, Liu Long 1,2, Qiu Sen 2

Affiliation:

1. School of Intelligence and Electronic Engineering, Dalian Neusoft University of Information, Dalian 116023, China

2. Key Laboratory of Intelligent Control and Optimization for Industrial Equipment of Ministry of Education, Dalian University of Technology, Dalian 116024, China

3. College of Aeronautical Engineering, Taizhou University, Taizhou 318000, China

Abstract

Gesture recognition has found widespread application in fields such as virtual reality, medical diagnosis, and robot interaction. Existing mainstream gesture-recognition methods fall into two categories: inertial-sensor-based and camera-vision-based. However, optical detection still suffers from limitations such as reflection and occlusion. In this paper, we investigate static and dynamic gesture-recognition methods based on miniature inertial sensors. Hand-gesture data are acquired through a data glove and preprocessed using Butterworth low-pass filtering and normalization algorithms. Magnetometer calibration is performed using an ellipsoid-fitting method. An auxiliary segmentation algorithm is employed to segment the gesture data, and a gesture dataset is constructed. For static gesture recognition, we focus on four machine learning algorithms, namely support vector machine (SVM), backpropagation neural network (BP), decision tree (DT), and random forest (RF), and evaluate their predictive performance through cross-validation. For dynamic gesture recognition, we investigate the recognition of 10 dynamic gestures using hidden Markov models (HMM) and an attention-based bidirectional long short-term memory network (Attention-BiLSTM). We analyze how recognition accuracy for complex dynamic gestures varies across feature datasets and compare the results with the predictions of a conventional long short-term memory network (LSTM). Experimental results demonstrate that the random forest algorithm achieves the highest recognition accuracy and the shortest recognition time for static gestures. Moreover, adding the attention mechanism significantly improves the recognition accuracy of the LSTM model for dynamic gestures, reaching a prediction accuracy of 98.3% on the original six-axis dataset.
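The preprocessing step described above (Butterworth low-pass filtering followed by normalization) can be sketched as follows. This is an illustrative implementation, not the authors' code: the sampling rate, cutoff frequency, and filter order are placeholder values, and min-max normalization is assumed as the normalization scheme.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess(signal, fs=100.0, cutoff=5.0, order=4):
    """Low-pass filter a 1-D sensor stream, then min-max normalize to [0, 1].

    fs, cutoff, and order are illustrative choices, not values from the paper.
    """
    # Design a Butterworth low-pass filter (cutoff given as a fraction of Nyquist).
    b, a = butter(order, cutoff / (0.5 * fs), btype="low")
    # Zero-phase filtering avoids shifting the gesture waveform in time.
    filtered = filtfilt(b, a, signal)
    lo, hi = filtered.min(), filtered.max()
    return (filtered - lo) / (hi - lo + 1e-12)  # epsilon guards a flat signal

# Example: a noisy 1 Hz sine, standing in for one axis of glove accelerometer data.
t = np.linspace(0, 2, 200)
raw = np.sin(2 * np.pi * t) + 0.3 * np.random.randn(200)
clean = preprocess(raw)
```

Zero-phase filtering (`filtfilt`) is a common choice here because a causal filter's group delay would misalign the filtered signal with the gesture segmentation boundaries.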
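The cross-validation comparison of the four static-gesture classifiers (SVM, BP, DT, RF) can be sketched with scikit-learn. The data below are synthetic stand-ins for the glove feature vectors, and all hyperparameters are illustrative defaults rather than the paper's settings; the BP network is approximated by scikit-learn's `MLPClassifier`.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Stand-in data: 500 feature vectors for 10 static gesture classes.
X, y = make_classification(n_samples=500, n_features=20, n_informative=12,
                           n_classes=10, random_state=0)

models = {
    "SVM": make_pipeline(StandardScaler(), SVC()),
    "BP":  make_pipeline(StandardScaler(),
                         MLPClassifier(max_iter=500, random_state=0)),
    "DT":  DecisionTreeClassifier(random_state=0),
    "RF":  RandomForestClassifier(n_estimators=100, random_state=0),
}

# 5-fold cross-validation gives one mean accuracy per classifier.
scores = {name: cross_val_score(m, X, y, cv=5).mean()
          for name, m in models.items()}
for name, acc in scores.items():
    print(f"{name}: {acc:.3f}")
```

Scaling is applied inside each pipeline so that the SVM and MLP never see statistics computed from their validation folds, which keeps the cross-validation estimate honest.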

Funder

National Natural Science Foundation of China

Natural Science Foundation of Liaoning Province, China

Fundamental Research Funds for the Central Universities, China

Taizhou University

Publisher

MDPI AG

Subject

Electrical and Electronic Engineering, Mechanical Engineering, Control and Systems Engineering

