Author:
Xu Zhao, Liu Gang, Tang Li Li, Li Yan Hui
Abstract
To overcome a drawback of conventional deep-learning-based image fusion methods, in which useful information extracted by the middle layers is lost, an unsupervised deep learning framework based on a Cascaded Convolutional Coding Network (C3Net) is proposed for the fusion of infrared and visible images. A Blur Regional Features (BRF) scheme is also applied during the fusion stage to preserve regional consistency. First, redundant and complementary features of the infrared and visible images are extracted by the coding layer, with the output of each convolutional layer connected to the input of the next layer in a cascading manner. Then, according to the characteristics of the redundant and complementary features, different BRF-based fusion strategies are designed to obtain the fused feature maps. Finally, the fused image is reconstructed by the decoding layer. Furthermore, the objective function of the training model is designed as a multitask loss combining Mean Square Error, Information Entropy, and Structural Similarity, to reduce the loss of original image information. Experimental results show that, compared with state-of-the-art fusion methods, C3Net achieves better overall performance in both objective evaluation and subjective visual quality.
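The abstract names three terms in the multitask training loss: Mean Square Error, Information Entropy, and Structural Similarity. The paper's exact formulation and weights are not given here, so the following is a minimal NumPy sketch under stated assumptions: a histogram-based entropy term, a simplified single-window (global-statistics) SSIM rather than the standard sliding-window variant, and hypothetical weights `w_mse`, `w_ent`, `w_ssim`.

```python
import numpy as np

def mse(a, b):
    """Mean squared error between two images (values assumed in [0, 1])."""
    return float(np.mean((a - b) ** 2))

def entropy(img, bins=256):
    """Shannon information entropy of an image's intensity histogram."""
    hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def ssim_global(a, b, c1=1e-4, c2=9e-4):
    """Simplified SSIM using global statistics (not the sliding-window form)."""
    mu_a, mu_b = a.mean(), b.mean()
    var_a, var_b = a.var(), b.var()
    cov = np.mean((a - mu_a) * (b - mu_b))
    return float(((2 * mu_a * mu_b + c1) * (2 * cov + c2)) /
                 ((mu_a ** 2 + mu_b ** 2 + c1) * (var_a + var_b + c2)))

def fusion_loss(fused, ir, vis, w_mse=1.0, w_ent=0.1, w_ssim=1.0):
    """Multitask loss sketch: fidelity to both sources via MSE and SSIM,
    plus a negative-entropy term so minimizing the loss encourages an
    information-rich fused image. Weights are illustrative assumptions."""
    l_mse = 0.5 * (mse(fused, ir) + mse(fused, vis))
    l_ent = -entropy(fused)
    l_ssim = 0.5 * ((1 - ssim_global(fused, ir)) + (1 - ssim_global(fused, vis)))
    return w_mse * l_mse + w_ent * l_ent + w_ssim * l_ssim
```

In a training setting these terms would be implemented with differentiable operators (e.g. a soft histogram for entropy and windowed SSIM); the sketch above only illustrates how the three objectives combine into one scalar loss.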
Subject
General Physics and Astronomy
References (20 articles)
1. Ma et al. "Infrared and visible image fusion methods and applications: A survey." Information Fusion, 2019.
2. Zaveri et al. "Region based image fusion for detection of Ewing sarcoma." 2009.
3. Liu et al. "Infrared and visible image fusion with convolutional neural networks." International Journal of Wavelets, Multiresolution and Information Processing, 2018.
4. Prabhakar et al. "DeepFuse: A deep unsupervised approach for exposure fusion with extreme exposure image pairs." 2017.
5. Li et al. "DenseFuse: A fusion approach to infrared and visible images." IEEE Transactions on Image Processing, 2019.
Cited by: 2 articles.