Online Hand Gesture Detection and Recognition for UAV Motion Planning
Author:
Lu Cong (1), Zhang Haoyang (2,3), Pei Yu (2,3), Xie Liang (2,3), Yan Ye (2,3), Yin Erwei (2,3), Jin Jing (4,5)
Affiliation:
1. School of Information Science and Engineering, East China University of Science and Technology, Shanghai 200237, China
2. National Innovation Institute of Defense Technology, Academy of Military Sciences, Beijing 100071, China
3. Tianjin Artificial Intelligence Innovation Center, Tianjin 300450, China
4. Key Laboratory of Smart Manufacturing in Energy Chemical Process, Ministry of Education, East China University of Science and Technology, Shanghai 200237, China
5. Shenzhen Research Institute of East China University of Science and Technology, Shenzhen 518063, China
Abstract
Recent advances in hand gesture recognition have produced more natural and intuitive methods of controlling unmanned aerial vehicles (UAVs). However, supporting UAV motion planning with hand gesture interaction during complex flight tasks in unknown and cluttered environments remains a significant challenge. In this paper, a novel framework based on hand gesture interaction is proposed to support efficient and robust UAV flight. A cascading structure, comprising Gaussian Naïve Bayes (GNB) and Random Forest (RF) classifiers, was designed to classify hand gestures from the Six Degrees of Freedom (6DoF) inertial measurement units (IMUs) of a data glove. The recognized hand gestures were mapped onto flight commands corresponding to the UAV's flight directions. Experimental results across the 10 evaluated hand gestures showed high online recognition accuracy under asynchronous detection (92%) and relatively low interaction latency (an average recognition time of 7.5 ms and an average total interaction time of 3 s). The average completion time of the UAV's complex flight task was about 8 s shorter than that under synchronous hand gesture detection and recognition. The proposed framework was validated as efficient and robust through extensive benchmark comparisons in various complex real-world environments.
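The abstract does not include an implementation, but the cascade it describes can be sketched in a few lines. Below is a minimal, hypothetical Python sketch using scikit-learn: the window features, the division of labor between the GNB stage (gesture-vs-rest detection) and the RF stage (gesture identity), and the label-to-command table are all illustrative assumptions, not the authors' exact pipeline.

```python
# Minimal sketch of a cascaded GNB -> RF gesture classifier on 6DoF IMU
# windows. Features, stage roles, and labels are illustrative assumptions.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import RandomForestClassifier

REST = 0  # hypothetical label for the "no gesture" / rest state


def extract_features(window):
    """Per-axis mean and standard deviation over a 6DoF IMU window
    (array of shape (n_samples, 6)); a deliberately simple feature set."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])


class CascadedGestureClassifier:
    """Two-stage cascade: GNB decides whether any gesture is present
    (asynchronous detection); RF identifies which gesture it is."""

    def __init__(self):
        self.detector = GaussianNB()
        self.recognizer = RandomForestClassifier(n_estimators=100)

    def fit(self, windows, labels):
        X = np.stack([extract_features(w) for w in windows])
        y = np.asarray(labels)
        # Stage 1: binary gesture-vs-rest detection.
        self.detector.fit(X, (y != REST).astype(int))
        # Stage 2: gesture identity, trained on gesture windows only.
        mask = y != REST
        self.recognizer.fit(X[mask], y[mask])
        return self

    def predict(self, window):
        x = extract_features(window).reshape(1, -1)
        if self.detector.predict(x)[0] == 0:
            return REST  # no gesture detected -> issue no flight command
        return int(self.recognizer.predict(x)[0])


# Hypothetical mapping from gesture labels to directional flight commands.
COMMANDS = {1: "forward", 2: "backward", 3: "left", 4: "right",
            5: "up", 6: "down", 7: "yaw_left", 8: "yaw_right",
            9: "takeoff", 10: "land"}
```

In use, something like `COMMANDS.get(clf.predict(imu_window))` would yield no command while the hand is at rest and a direction string when a gesture fires, which is what distinguishes asynchronous detection from a synchronous scheme that classifies every window as some gesture.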
Funder
Science and Technology Innovation 2030 Major Projects; National Natural Science Foundation of China; Shanghai Municipal Science and Technology Major Project; Program of Introducing Talents of Discipline to Universities through the 111 Project; ShuGuang Project supported by the Shanghai Municipal Education Commission and the Shanghai Education Development Foundation; Ministry of Education and Science of the Russian Federation; Polish National Science Center; National Government Guided Special Funds for Local Science and Technology Development; Project of Jiangsu Province Science and Technology Plan Special Fund in 2022
Subject
Electrical and Electronic Engineering, Industrial and Manufacturing Engineering, Control and Optimization, Mechanical Engineering, Computer Science (miscellaneous), Control and Systems Engineering
Cited by
4 articles.