Authors:
Novikov David, Sotirelis Paul, Yilmaz Alper
Abstract
We have developed a robust, novel, and cost-effective method for determining the geolocation of vehicles observed in drone camera footage. Previous studies in this area have relied on platform GPS and camera geometry to estimate the position of objects in drone footage, an approach we refer to as object-to-drone location (ODL). The performance of these techniques degrades with decreasing GPS measurement accuracy and with camera orientation problems. Our method overcomes these shortcomings and reliably geolocates objects on the ground. We refer to our approach as object-to-map localization (OML). The proposed technique determines a transformation between drone camera footage and georectified aerial images, for example, from Google Maps. This transformation is then used to calculate the positions of objects captured in the drone camera footage. We provide an ablation study of our method's configuration parameters, which are: feature extraction methods, key point filtering schemes, and types of transformations. We also conduct experiments with a simulated faulty GPS to demonstrate our method's robustness to poor estimation of the drone's position. Our approach requires only a drone with a camera and a low-accuracy estimate of its geoposition; we do not rely on markers or ground control points. As a result, our method can determine the geolocation of vehicles on the ground in an easy-to-set-up and cost-effective manner, making object geolocalization more accessible to users by decreasing the hardware and software requirements. Our code can be found on GitHub at https://github.com/OSUPCVLab/VehicleGeopositioning
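To make the image-to-map registration step concrete, the sketch below shows one possible realization of the general idea described above, using ORB features, brute-force matching, and a RANSAC homography in OpenCV. It is a minimal illustration under stated assumptions, not the authors' exact configuration: the file names, the geographic bounds of the aerial tile, and the example vehicle pixel are placeholders, and the repository linked above should be consulted for the actual feature extractors, filtering schemes, and transformation types evaluated in the paper.

```python
import cv2
import numpy as np

# --- Placeholder inputs (hypothetical paths and values) ---
drone_frame = cv2.imread("drone_frame.png", cv2.IMREAD_GRAYSCALE)
aerial_map = cv2.imread("aerial_tile.png", cv2.IMREAD_GRAYSCALE)

# Assumed geographic bounds of the georectified aerial tile (lat/lon of corners).
TILE_NORTH, TILE_SOUTH = 40.0020, 39.9980   # hypothetical
TILE_WEST, TILE_EAST = -83.0200, -83.0150   # hypothetical

# --- 1. Extract and match local features between drone frame and map tile ---
orb = cv2.ORB_create(nfeatures=4000)
kp_drone, des_drone = orb.detectAndCompute(drone_frame, None)
kp_map, des_map = orb.detectAndCompute(aerial_map, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_drone, des_map), key=lambda m: m.distance)

# --- 2. Estimate a homography from drone-frame pixels to map-tile pixels ---
src = np.float32([kp_drone[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp_map[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, ransacReprojThreshold=5.0)

# --- 3. Map a detected vehicle's pixel in the drone frame into the map tile ---
vehicle_px = np.float32([[[640.0, 360.0]]])          # hypothetical detection center
vehicle_on_map = cv2.perspectiveTransform(vehicle_px, H)[0, 0]

# --- 4. Convert map-tile pixel coordinates to geographic coordinates ---
h_map, w_map = aerial_map.shape[:2]
lon = TILE_WEST + (vehicle_on_map[0] / w_map) * (TILE_EAST - TILE_WEST)
lat = TILE_NORTH - (vehicle_on_map[1] / h_map) * (TILE_NORTH - TILE_SOUTH)

print(f"Estimated vehicle geolocation: lat={lat:.6f}, lon={lon:.6f}")
```

In this sketch the low-accuracy drone GPS would only be needed to fetch an aerial tile roughly covering the scene; the precise alignment comes from the estimated homography, which is why the approach can tolerate poor drone position estimates.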