Affiliation:
1. School of Electronic Engineering, Pedagogical and Technological University of Colombia, Sogamoso 152210, Colombia
2. School of Systems and Computing Engineering, Pedagogical and Technological University of Colombia, Sogamoso 152210, Colombia
Abstract
In surveillance and monitoring systems, mobile vehicles and unmanned aerial vehicles (UAVs), such as drones, provide access to the environment with enhanced range, maneuverability, and safety, since their omnidirectional motion lets them explore, identify, and perform security tasks. These tasks must be performed autonomously by capturing data from the environment; these data usually contain errors and uncertainties that degrade the recognition and resolution achievable when detecting and identifying objects. Acquisition resolution can be improved by sensor data fusion, in which two or more sensors measure the same physical phenomenon simultaneously. This paper uses the constant turn rate and velocity (CTRV) kinematic model of a drone, including the angular velocity not considered in previous works, as a complementary alternative for fusing LiDAR and Radar data retrieved by UAVs, and applies the extended Kalman filter (EKF) to detect moving targets. The performance of the EKF is evaluated on a dataset that jointly includes position data captured by a LiDAR and a Radar sensor for a moving object following a trajectory with sudden changes. Additive white Gaussian noise is then introduced to degrade the measurements, and the root mean square error (RMSE) is evaluated as the noise power increases. The results show an RMSE improvement of 0.4 for object detection over conventional kinematic models that do not consider significant trajectory changes.
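As a rough illustration of the models the abstract describes, the sketch below gives a CTRV prediction step and a nonlinear radar measurement function of the kind an EKF would linearize, plus the RMSE metric used to score the filter. The function names, state ordering, and small-turn-rate threshold are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def ctrv_predict(x, dt):
    """CTRV prediction: state x = [px, py, v, yaw, yaw_rate] (assumed ordering)."""
    px, py, v, yaw, yaw_rate = x
    if abs(yaw_rate) > 1e-6:
        # Turning: integrate position along a circular arc.
        px += (v / yaw_rate) * (np.sin(yaw + yaw_rate * dt) - np.sin(yaw))
        py += (v / yaw_rate) * (-np.cos(yaw + yaw_rate * dt) + np.cos(yaw))
    else:
        # Near-zero turn rate: fall back to straight-line motion.
        px += v * np.cos(yaw) * dt
        py += v * np.sin(yaw) * dt
    yaw += yaw_rate * dt
    return np.array([px, py, v, yaw, yaw_rate])

def radar_measurement(x):
    """Nonlinear radar model h(x) = [range, bearing, range rate]."""
    px, py, v, yaw, _ = x
    rho = np.hypot(px, py)
    phi = np.arctan2(py, px)
    rho_dot = (px * v * np.cos(yaw) + py * v * np.sin(yaw)) / max(rho, 1e-6)
    return np.array([rho, phi, rho_dot])

def rmse(estimates, ground_truth):
    """Per-component RMSE between filter estimates and ground truth."""
    err = np.asarray(estimates) - np.asarray(ground_truth)
    return np.sqrt(np.mean(err ** 2, axis=0))
```

A degraded measurement for the noise-power study could then be formed as, e.g., `radar_measurement(x) + rng.normal(0.0, sigma, size=3)` with `rng = np.random.default_rng()` and `sigma` a hypothetical noise level.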
Funder
NATO Science for Peace Program
Subject
General Earth and Planetary Sciences
Cited by
13 articles.