Affiliation:
1. School of Computer Science and Technology, Xidian University, Xi’an 710071, China
2. NavInfo Co., Ltd., Beijing 100094, China
Abstract
Current CNN-based methods for infrared and visible image fusion are limited by the low discrimination of extracted structural features, the adoption of uniform loss functions, and the lack of inter-modal feature interaction, which makes it difficult to obtain optimal fusion results. To alleviate these problems, a multimodal feature learning fusion framework based on a cross-attention Transformer, termed HATF, is proposed. To extract rich structural features at different scales, residual U-Nets with mixed receptive fields are adopted to capture salient object information at various granularities. A hybrid attention fusion strategy is then employed to integrate the complementary information from the input images. Finally, adaptive loss functions are designed to achieve optimal fusion results for the different modal features. The proposed framework is thoroughly evaluated on the TNO, FLIR, and LLVIP datasets, which encompass diverse scenes and varying illumination conditions. In the comparative experiments, HATF achieved competitive results on all three datasets; on the TNO dataset, its EN, SD, MI, and SSIM metrics reached the best performance, surpassing the second-best method by 2.3%, 18.8%, 4.2%, and 2.2%, respectively. These results validate the effectiveness of the proposed method, in terms of both robustness and image fusion quality, compared to several popular methods.
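The cross-attention mechanism at the heart of the framework can be sketched in a simplified, single-head form. This is a minimal illustration, not the paper's architecture: the dimensions, the random features, and the final averaging step are all illustrative assumptions standing in for HATF's residual U-Net encoders and hybrid attention fusion strategy.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query, key, value):
    # Scaled dot-product attention: tokens of one modality (query)
    # attend to tokens of the other modality (key/value).
    d = query.shape[-1]
    weights = softmax(query @ key.T / np.sqrt(d), axis=-1)
    return weights @ value

rng = np.random.default_rng(0)
n_tokens, d_model = 16, 32  # flattened feature-map tokens (illustrative sizes)
ir_feat = rng.standard_normal((n_tokens, d_model))   # infrared features
vis_feat = rng.standard_normal((n_tokens, d_model))  # visible features

# Bidirectional cross-attention: each modality is enriched with
# complementary information from the other.
ir_enriched = cross_attention(ir_feat, vis_feat, vis_feat)
vis_enriched = cross_attention(vis_feat, ir_feat, ir_feat)

# Averaging is a simple stand-in for the hybrid attention fusion strategy.
fused = 0.5 * (ir_enriched + vis_enriched)
print(fused.shape)  # (16, 32)
```

The key point the sketch makes is that, unlike per-modality self-attention, the attention weights here couple the two inputs, so the fused features depend on inter-modal similarity rather than on each modality in isolation.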
Funder
Natural Science Basic Research Program of Shaanxi
Aeronautical Science Foundation of China