Abstract
Unmanned aerial vehicle (UAV) vision technology is becoming increasingly important, especially in wilderness rescue. For people stranded in the wilderness under poor network conditions and bad weather, this paper proposes a technique for road extraction and road-condition detection from video captured in real time by a UAV-mounted multispectral camera, or from pre-downloaded satellite multispectral images, which in turn supports optimal route planning. Additionally, depending on the UAV's flight altitude, humans can interact with the UAV through dynamic gesture recognition to signal emergencies and potential dangers, triggering emergency rescue or re-routing. The goal of this work is to detect road conditions and recognize emergency situations so that necessary and timely assistance can be provided to humans in the wild. By computing the normalized difference vegetation index (NDVI), the UAV can effectively distinguish bare-soil roads from gravel roads, refining the results of our previous route planning. For the low-altitude human–machine interaction part, we combined MediaPipe hand landmarks with machine-learning methods to build a dataset of four basic hand gestures for "Signal for Help" dynamic gesture recognition. We evaluated the dataset on several classifiers; the best model achieves 99.99% accuracy on the test set. In this proof-of-concept paper, these experimental results confirm that the proposed scheme can accomplish the intended tasks of UAV rescue and route planning.
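The NDVI used above for separating bare-soil and gravel roads is the standard ratio of near-infrared and red reflectance. A minimal sketch of the computation, with hypothetical reflectance values (the band data and thresholds here are illustrative assumptions, not values from the paper):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    A small eps avoids division by zero on dark pixels.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Hypothetical per-pixel reflectances: vegetation reflects strongly in NIR,
# while bare soil and gravel yield low NDVI values close to zero.
nir_band = np.array([0.50, 0.30, 0.25])
red_band = np.array([0.08, 0.22, 0.21])
print(np.round(ndvi(nir_band, red_band), 2))
```

In practice, the multispectral camera's NIR and red bands are registered to the same pixel grid, and surface types are separated by thresholding the resulting NDVI map.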
Funder
Hungarian National Science Foundation
Subject
General Earth and Planetary Sciences
Cited by
8 articles.