Abstract
Firefighters need to gather information from both inside and outside buildings in first-response emergency scenarios, and drones can help with this task. This paper presents an elicitation study that revealed firefighters’ desire to collaborate with autonomous drones. We developed a Human–Drone Interaction (HDI) method for indicating a target to a drone using 3D pointing gestures estimated solely from a monocular camera. The participant points to a window without using any wearable or body-attached device; through its front-facing camera, the drone detects the gesture and computes the target window. This work describes the process of choosing the gesture, detecting and localizing objects, and performing the transformations between coordinate systems. Our proposed 3D pointing-gesture interface improves on 2D interfaces by integrating depth information obtained with SLAM, resolving the ambiguity that arises when multiple objects are aligned on the same plane in a large-scale outdoor environment. Experimental results showed that our 3D pointing-gesture interface achieved average F1 scores of 0.85 in simulation and 0.73 in real-world experiments, and an F1 score of 0.58 at the maximum distance of 25 m between the drone and the building.
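The target-selection step described in the abstract (mapping a pointing gesture to one of several candidate windows) can be illustrated with a minimal sketch. Assuming the drone has already estimated the pointer's 3D shoulder and wrist positions and the candidate window centers in a common coordinate frame (the paper obtains these via pose estimation, object localization, and SLAM), one simple disambiguation strategy is to cast a ray from shoulder through wrist and pick the candidate with the smallest angular deviation from that ray. The function name and this specific selection rule are illustrative assumptions, not the paper's exact implementation.

```python
import math


def select_pointed_target(shoulder, wrist, candidates):
    """Pick the candidate closest (in angle) to the shoulder->wrist ray.

    shoulder, wrist: 3D points (x, y, z) in a common frame.
    candidates: list of 3D candidate target centers (e.g., window centers).
    Returns (index of best candidate, angular error in radians).
    This is an illustrative sketch, not the paper's implementation.
    """
    # Ray direction from shoulder through wrist, normalized.
    d = [w - s for s, w in zip(shoulder, wrist)]
    norm = math.sqrt(sum(c * c for c in d))
    d = [c / norm for c in d]

    best_idx, best_angle = None, float("inf")
    for i, c in enumerate(candidates):
        # Vector from shoulder to the candidate target.
        v = [ci - si for si, ci in zip(shoulder, c)]
        vnorm = math.sqrt(sum(x * x for x in v))
        # Angle between the pointing ray and the candidate direction.
        cos_a = sum(a * b for a, b in zip(d, v)) / vnorm
        angle = math.acos(max(-1.0, min(1.0, cos_a)))  # clamp for safety
        if angle < best_angle:
            best_idx, best_angle = i, angle
    return best_idx, best_angle
```

For example, with the shoulder at the origin, the wrist at (0.2, 0.1, 0.5), and three windows aligned at the same depth, the window at (4, 2, 10) lies exactly along the pointing ray and is selected with near-zero angular error; this is how a 3D ray resolves ambiguity that a 2D image-plane projection cannot.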
Funder
Khalifa University of Science, Technology and Research
Ministry of Education, Culture, Sports, Science and Technology
Office of Naval Research Global
Publisher
Springer Science and Business Media LLC
Subject
Artificial Intelligence, Control and Optimization, Mechanical Engineering, Instrumentation, Modelling and Simulation
References: 36 articles.
1. Al-Eidan RM, Al-Khalifa H, Al-Salman AM (2018) A review of wrist-worn wearable: sensors, models, and challenges. J Sens Hindawi. https://doi.org/10.1155/2018/5853917
2. Blum R (2008) Linux command line and shell scripting bible, vol 481. John Wiley & Sons
3. Bacim F, Nabiyouni M, Bowman DA (2014) Slice-n-Swipe: a free-hand gesture user interface for 3D point cloud annotation. In: IEEE symposium on 3D user interfaces (3DUI), pp 185–186
4. Jeong S, Jin J, Song T, Kwon K, Jeon JW (2012) Single-camera dedicated television control system using gesture drawing. IEEE Trans Consum Electr 58(4):1129–1137
5. Medeiros AC, Tavares TA, da Fonseca IE (2018) How to design an user interface based on gestures? In: International conference of design, user experience, and usability. Springer, Cham, pp 63–74
Cited by
9 articles.
1. Multimodal Error Correction with Natural Language and Pointing Gestures;2023 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW);2023-10-02
2. Pointing Gestures for Human-Robot Interaction with the Humanoid Robot Digit;2023 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN);2023-08-28
3. Towards Gesture-Based Cooperation with Cargo Handling Unmanned Aerial Vehicles;International Journal of Semantic Computing;2023-08-03
4. Effects of Spatial Characteristics on the Human–Robot Communication Using Deictic Gesture in Construction;Journal of Construction Engineering and Management;2023-07
5. Diver Interest via Pointing: Human-Directed Object Inspection for AUVs;2023 IEEE International Conference on Robotics and Automation (ICRA);2023-05-29