A Positive-Unlabeled Generative Adversarial Network for Super-Resolution Image Reconstruction Using a Charbonnier Loss
Author:
Xu Shuhua, Qi Mingming, Wang Xianming, Zhao Hanli, Hu Zhongyi, Sun Hongyu
Abstract
Recently, the generative adversarial network (GAN) has been widely used to recover realistic high-frequency details of images, which has spurred its application in super-resolution reconstruction. However, GAN training is unstable, in part because the discriminator keeps the positive (true) and negative (false) criteria for generated samples fixed throughout the learning process, without accounting for the gradual quality improvement of the generated samples (sometimes the generated samples are even more realistic than the real ones). To address this problem, this paper proposes a super-resolution model based on a positive-unlabeled (PU) GAN with a Charbonnier loss (SRPUGAN-Charbon). The proposed model consists of a generator network that synthesizes super-resolution images and a discriminator network trained to distinguish super-resolution images from real high-resolution images. In addition, the Charbonnier loss function is adopted to handle outliers in super-resolution images and to retain their low-frequency features. Extensive experiments were conducted on three benchmark databases: BSDS500, Set5, and Set14. The results show that the proposed SRPUGAN-Charbon method is superior to state-of-the-art methods in terms of visual quality, peak signal-to-noise ratio (PSNR), and structural similarity (SSIM).
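The abstract does not reproduce the loss formulation, but a minimal sketch of the Charbonnier penalty as it is commonly used in super-resolution may help illustrate the idea; the function name, PyTorch-style implementation, and the conventional smoothing constant eps = 1e-3 are assumptions, not details taken from the paper.

```python
import torch

def charbonnier_loss(pred: torch.Tensor, target: torch.Tensor, eps: float = 1e-3) -> torch.Tensor:
    """Charbonnier penalty: sqrt(diff^2 + eps^2), averaged over all elements.

    A smooth approximation of the L1 loss that is less sensitive to outliers
    than L2 while remaining differentiable at zero. Note: this is a generic
    sketch, not the authors' exact formulation.
    """
    diff = pred - target
    return torch.sqrt(diff * diff + eps * eps).mean()

# Usage example (hypothetical tensors standing in for super-resolved and ground-truth images):
sr = torch.rand(1, 3, 128, 128)   # generator output
hr = torch.rand(1, 3, 128, 128)   # real high-resolution image
loss = charbonnier_loss(sr, hr)
```

In practice such a pixel-level term is typically combined with the adversarial (here, PU-based) loss so that the generator preserves low-frequency content while the discriminator drives high-frequency detail.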
Funder
Natural Science Foundation of Zhejiang Province
National Social Science Foundation of China
Project of Wenzhou Key Laboratory Foundation
Key scientific and technological innovation projects of Wenzhou science and technology plan
Shandong Provincial Natural Science Foundation
Publisher
International Information and Engineering Technology Association
Subject
Electrical and Electronic Engineering
Cited by
2 articles.