Abstract
The ability to interpret multimodal data, and to map the targets and anomalies within it, is essential for an automatic recognition system. Because annotating multimodal time-series data for the training stage is expensive and time-consuming, multimodal time-series image understanding from drone and quadruped mobile robot platforms is a challenging task in remote sensing and photogrammetry. Robust methods must therefore be computationally low-cost, given the limited data available from aerial and ground-based platforms, yet accurate enough to meet certainty requirements. In this study, a few-shot learning architecture based on a squeeze-and-attention structure is proposed for multimodal target detection, using time-series images from drone and quadruped robot platforms with a small training dataset. To make target detection robust with limited training data, the squeeze-and-attention structure is trained as an optimized method on multimodal time-series images. The proposed architecture was validated on three datasets spanning multiple modalities (e.g., red-green-blue, color-infrared, and thermal), achieving competitive results.
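The squeeze-and-attention structure mentioned above builds on the channel squeeze-and-excitation idea: a feature map is pooled globally ("squeeze"), passed through a small bottleneck MLP ("excitation"), and the resulting per-channel gates re-weight the features. The following NumPy sketch illustrates only that generic mechanism, under assumed shapes (a single `(C, H, W)` feature map, reduction ratio 4) and randomly initialized weights; it is not the authors' architecture.

```python
import numpy as np

def squeeze_excite(x, w1, w2):
    """Channel squeeze-and-excitation on a (C, H, W) feature map.

    w1: (C//r, C) and w2: (C, C//r) are bottleneck weights for an
    assumed reduction ratio r (hypothetical, for illustration only).
    """
    # Squeeze: global average pooling over spatial dimensions -> (C,)
    z = x.mean(axis=(1, 2))
    # Excitation: bottleneck MLP, ReLU then sigmoid gating
    h = np.maximum(0.0, w1 @ z)           # (C//r,)
    s = 1.0 / (1.0 + np.exp(-(w2 @ h)))   # (C,), gates in (0, 1)
    # Re-weight each channel by its attention score
    return x * s[:, None, None]

# Illustrative demo: 8-channel feature map, reduction ratio 4
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 5, 5))
w1 = rng.standard_normal((2, 8)) * 0.1
w2 = rng.standard_normal((8, 2)) * 0.1
y = squeeze_excite(x, w1, w2)
```

Because the sigmoid gates lie strictly in (0, 1), each output channel is a damped copy of its input channel; in a full network the weights are learned, so informative channels (or modalities) are amplified relative to the rest.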
Subject
Artificial Intelligence, Computer Science Applications, Aerospace Engineering, Information Systems, Control and Systems Engineering
Cited by 5 articles.