Author:
Liu Sunxiangyu, Li Guitao, Zhan Yafeng, Gao Peng
Abstract
Accurate and robust drone detection is an important and challenging task. Previous research, whether based on appearance or motion features, has not yet provided a satisfactory solution, especially against complex backgrounds. To this end, the present work proposes a motion-based method termed the Multi-Scale Space Kinematic detection method (MUSAK). It fully leverages motion patterns by extracting 3D, pseudo-3D, and 2D kinematic parameters in three scale spaces according to keypoint quality, and builds three Gated Recurrent Unit (GRU)-based detection branches for drone recognition. MUSAK is evaluated on a hybrid dataset named the multiscale UAV dataset (MUD), consisting of public datasets and self-collected data with motion labels. The experimental results show that MUSAK improves detection performance by a large margin, a 95% increase in average precision (AP), over the previous state-of-the-art (SOTA) motion-based methods, and that a hybrid variant, which integrates MUSAK with the appearance-based Faster Region-based Convolutional Neural Network (Faster R-CNN), achieves a new SOTA performance on the AP metrics (AP, APM, and APS).
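The abstract describes an architecture of three GRU-based branches, each scoring a sequence of kinematic parameters (3D, pseudo-3D, or 2D, chosen by keypoint quality). The sketch below is only an illustration of that general idea, not the authors' network: the GRU formulation is the standard textbook one, and all dimensions, the sigmoid classification head, and the max-score fusion are assumptions for demonstration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell (one common formulation; not from the paper)."""
    def __init__(self, input_dim, hidden_dim, rng):
        s = 1.0 / np.sqrt(hidden_dim)
        # Input-to-hidden (W*) and hidden-to-hidden (U*) weights for the
        # update (z), reset (r), and candidate (n) gates.
        self.Wz = rng.uniform(-s, s, (hidden_dim, input_dim))
        self.Uz = rng.uniform(-s, s, (hidden_dim, hidden_dim))
        self.Wr = rng.uniform(-s, s, (hidden_dim, input_dim))
        self.Ur = rng.uniform(-s, s, (hidden_dim, hidden_dim))
        self.Wn = rng.uniform(-s, s, (hidden_dim, input_dim))
        self.Un = rng.uniform(-s, s, (hidden_dim, hidden_dim))

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)        # update gate
        r = sigmoid(self.Wr @ x + self.Ur @ h)        # reset gate
        n = np.tanh(self.Wn @ x + self.Un @ (r * h))  # candidate state
        return (1.0 - z) * h + z * n

class KinematicBranch:
    """One detection branch: a GRU over a kinematic-parameter sequence
    followed by a (hypothetical) linear head giving a drone score in (0, 1)."""
    def __init__(self, input_dim, hidden_dim, rng):
        self.cell = GRUCell(input_dim, hidden_dim, rng)
        self.w = rng.uniform(-0.1, 0.1, hidden_dim)

    def score(self, seq):
        h = np.zeros_like(self.w)
        for x in seq:                 # one kinematic feature vector per frame
            h = self.cell.step(x, h)
        return float(sigmoid(self.w @ h))

# Three branches for 3D, pseudo-3D, and 2D kinematic parameters.
# Feature dimensions here are illustrative placeholders, not from the paper.
rng = np.random.default_rng(0)
branches = {
    "3d": KinematicBranch(6, 16, rng),
    "pseudo3d": KinematicBranch(4, 16, rng),
    "2d": KinematicBranch(2, 16, rng),
}

def detect(tracks):
    """Fuse branch scores by max; the paper instead routes each track to a
    branch according to keypoint quality, which is not modeled here."""
    return max(branches[k].score(seq) for k, seq in tracks.items())
```

With untrained random weights the fused score is of course meaningless; the sketch only shows the data flow (per-frame kinematic vectors, recurrent state, per-branch score, fusion), which is the part the abstract makes explicit.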
Subject
General Earth and Planetary Sciences
Cited by
3 articles.