Super-Resolution Reconstruction of Remote Sensing Images of the China–Myanmar Pipeline Based on Generative Adversarial Network
Published: 2023-08-30
Journal: Sustainability
Volume: 15, Issue: 17, Page: 13068
ISSN: 2071-1050
Language: en
Authors:
Jiang Yuanliang 1,2, Ren Qingying 3,4, Ren Yuan 2, Liu Haipeng 1,2, Dong Shaohua 1,4, Ma Yundong 1,4
Affiliations:
1. College of Safety and Ocean Engineering, China University of Petroleum (Beijing), Beijing 102249, China
2. CNPC International Pipeline Company, Beijing 102206, China
3. College of Artificial Intelligence, China University of Petroleum (Beijing), Beijing 102249, China
4. Key Laboratory of Oil and Gas Safety and Emergency Technology, Ministry of Emergency Management, Beijing 102249, China
Abstract
The safety monitoring and early warning of overseas oil and gas pipelines are essential for China, as these pipelines serve as significant energy channels. The route of the China–Myanmar pipeline through Myanmar is mountainous, densely vegetated, subject to changeable weather and abundant rainfall, and prone to disasters, so manual route inspection is dangerous and inefficient. Satellite remote sensing technology has an advantage over traditional ground surveys because of its wide coverage and long-range capability, and can therefore aid in monitoring the safety of oil and gas transportation. To improve the resolution of remote sensing data, in this paper we propose a nonlocal dense receptive field generative adversarial network, using remote sensing images of the Muse section of the China–Myanmar pipeline as data. Building on the super-resolution generative adversarial network (SRGAN), we use a dense residual structure to increase the network depth and introduce the Softsign activation function to mitigate activation saturation and vanishing gradients. To extract deep image features, we propose a residual-in-residual nonlocal (RRNL) dense block, followed by a receptive field block (RFB) at the end of the network to extract global features. Four loss functions are combined to improve the stability of model training and the quality of the reconstructed images. The experimental results show that the peak signal-to-noise ratio (PSNR) and structural similarity index measure (SSIM) of the reconstructed remote sensing images of the Muse section reach 30.20 dB and 0.84, respectively. Compared with conventional methods and generic deep neural networks, the proposed approach improves PSNR by 8.33 dB and 1.41 dB and SSIM by 21.7% and 5.9%, respectively. The reconstructed images exhibit clearer texture and are more legible to the human eye.
This method achieves super-resolution reconstruction of remote sensing images for the Muse section of the China–Myanmar pipeline, enhances image detail, and significantly improves the efficiency of satellite remote sensing monitoring.
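Two of the building blocks named in the abstract, the Softsign activation and the PSNR metric, have standard closed-form definitions. The following is a minimal illustrative sketch of both using those textbook formulas; it is not the authors' implementation, and the example pixel values are invented for demonstration.

```python
import math

def softsign(x):
    # Softsign activation: x / (1 + |x|). It approaches its asymptotes
    # polynomially rather than exponentially, which saturates more slowly
    # than tanh and helps against vanishing gradients.
    return x / (1.0 + abs(x))

def psnr(ref, test, max_val=255.0):
    # Peak signal-to-noise ratio (in dB) between two equal-length
    # pixel sequences: 10 * log10(MAX^2 / MSE).
    mse = sum((r - t) ** 2 for r, t in zip(ref, test)) / len(ref)
    if mse == 0:
        return float("inf")
    return 10.0 * math.log10(max_val ** 2 / mse)

# Hypothetical 8x8 "images": a constant error of 10 gray levels gives MSE = 100.
ref = [0.0] * 64
test = [10.0] * 64
print(round(softsign(1.0), 2))    # 0.5
print(round(psnr(ref, test), 2))  # 28.13
```

Higher PSNR means less pixel-wise distortion; the 30.20 dB reported above thus corresponds to a reconstruction error well below this toy example's.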
Funder
China National Petroleum Corporation Limited—China University of Petroleum (Beijing) Strategic Cooperation Science and Technology Project
Subject
Management, Monitoring, Policy and Law; Renewable Energy, Sustainability and the Environment; Geography, Planning and Development; Building and Construction