Multi-Scale Feature Fusion with Attention Mechanism Based on CGAN Network for Infrared Image Colorization
Published: 2023-04-07
Volume: 13
Issue: 8
Page: 4686
ISSN: 2076-3417
Container-title: Applied Sciences
Short-container-title: Applied Sciences
Language: en
Author:
Ai Yibo 1,2, Liu Xiaoxi 1, Zhai Haoyang 1, Li Jie 3, Liu Shuangli 4, An Huilong 3, Zhang Weidong 1
Affiliation:
1. National Center for Materials Service Safety, University of Science and Technology Beijing, Beijing 100083, China
2. Southern Marine Science and Engineering Guangdong Laboratory (Zhuhai), Zhuhai 519082, China
3. HBIS Materials Institute, No. 385 South Sports Street, Yuhua District, Shijiazhuang 050023, China
4. Hesteel Group Tangsteel Company, No. 9 Binhe Road, Tangshan 063000, China
Abstract
This paper proposes a colorization algorithm for infrared images based on a Conditional Generative Adversarial Network (CGAN) with multi-scale feature fusion and attention mechanisms, aiming to address the color leakage and unclear semantics of existing infrared image colorization methods. First, we improve the CGAN generator by incorporating a multi-scale feature extraction module into its U-Net architecture to fuse features from different scales, strengthening feature extraction and semantic understanding and thereby mitigating color leakage and blurring during colorization. Second, we improve the CGAN discriminator by introducing an attention module composed of channel attention and spatial attention, enabling it to better distinguish real from generated images and improving the semantic clarity of the colorized results. Finally, we combine both improvements, applying the multi-scale feature fusion module and the attention module jointly to the generator and discriminator. We evaluated the method on a dataset containing paired infrared and near-infrared images, which retains more detailed features while preserving the advantages of conventional infrared imagery. The experimental results show that the proposed method achieves a peak signal-to-noise ratio (PSNR) of 16.5342 dB and a structural similarity index (SSIM) of 0.6385 on an RGB-NIR (Red, Green, Blue-Near Infrared) test set, improvements of 5% and 13%, respectively, over the original CGAN. These results demonstrate that the proposed algorithm addresses the color leakage and unclear semantics of the original network. The method is applicable not only to infrared image colorization but also to the colorization of remote sensing and CT images.
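The abstract does not spell out the exact layer design of the multi-scale feature extraction module, but the idea can be illustrated with a minimal PyTorch sketch: parallel convolutions with different receptive fields, fused by a 1x1 convolution, inserted into a U-Net encoder stage. All class and parameter names below are illustrative assumptions, not the paper's code.

```python
import torch
import torch.nn as nn

class MultiScaleFusion(nn.Module):
    """Hypothetical multi-scale block: extracts features at several
    receptive fields in parallel and fuses them back to the input width."""
    def __init__(self, channels: int):
        super().__init__()
        # Three parallel branches with 1x1, 3x3, and 5x5 receptive fields.
        self.branch1 = nn.Conv2d(channels, channels, kernel_size=1)
        self.branch3 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.branch5 = nn.Conv2d(channels, channels, kernel_size=5, padding=2)
        # A 1x1 convolution fuses the concatenated multi-scale features.
        self.fuse = nn.Conv2d(3 * channels, channels, kernel_size=1)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = torch.cat(
            [self.branch1(x), self.branch3(x), self.branch5(x)], dim=1)
        return self.act(self.fuse(feats))

# Usage: drop in after a U-Net encoder stage of the generator.
block = MultiScaleFusion(channels=64)
out = block(torch.randn(1, 64, 128, 128))  # -> torch.Size([1, 64, 128, 128])
```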
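The channel plus spatial attention pair described for the discriminator matches the layout of the widely used CBAM design; whether the paper follows CBAM exactly is an assumption, so the sketch below is one plausible reading rather than the authors' implementation.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Reweights channels using global average- and max-pooled statistics."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # global average pooling
        mx = self.mlp(x.amax(dim=(2, 3)))    # global max pooling
        weight = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * weight

class SpatialAttention(nn.Module):
    """Reweights spatial positions from channel-wise average/max maps."""
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg = x.mean(dim=1, keepdim=True)   # per-pixel channel average
        mx, _ = x.max(dim=1, keepdim=True)  # per-pixel channel max
        weight = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * weight

# Applied between discriminator convolution stages, e.g.:
feat = torch.randn(1, 64, 70, 70)
feat = SpatialAttention()(ChannelAttention(64)(feat))
```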
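The reported PSNR and SSIM can be computed per image pair with their standard definitions; the sketch below assumes 8-bit RGB arrays and scikit-image (0.19 or later for the channel_axis argument), since the paper's own evaluation code is not given. Averaging over the test set would yield aggregate figures comparable to the reported 16.5342 dB and 0.6385.

```python
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def evaluate_pair(pred: np.ndarray, target: np.ndarray):
    """pred/target: HxWx3 uint8 arrays (colorized output vs. ground truth)."""
    psnr = peak_signal_noise_ratio(target, pred, data_range=255)
    ssim = structural_similarity(target, pred, channel_axis=-1, data_range=255)
    return psnr, ssim

# Synthetic usage; real evaluation would iterate over the RGB-NIR test set.
pred = np.random.randint(0, 256, (256, 256, 3), dtype=np.uint8)
target = np.random.randint(0, 256, (256, 256, 3), dtype=np.uint8)
print(evaluate_pair(pred, target))
```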
Funder
HBIS Materials Institute; Southern Marine Science and Engineering Guangdong Laboratory
Subject
Fluid Flow and Transfer Processes, Computer Science Applications, Process Chemistry and Technology, General Engineering, Instrumentation, General Materials Science
Cited by
3 articles.