Affiliation:
1. Department of Remote Sensing and GIS, Faculty of Geography, University of Tehran, Tehran 1417853933, Iran
2. Department of Exploration Technology, Helmholtz-Institute Freiberg for Resource Technology, 09599 Freiberg, Germany
Abstract
In contrast to the well-investigated field of Synthetic Aperture Radar (SAR)-to-Optical translation, this study explores the less-investigated domain of Optical-to-SAR translation, a challenging task because of its ill-posed nature: a single optical image can have multiple SAR representations depending on the SAR viewing geometry. To generate an SAR image with a specific viewing geometry, we propose a novel approach termed SAR Temporal Shifting. Our model takes as input an optical image from the target timestamp and an SAR image from a different temporal point but with the same viewing geometry as the expected SAR output. Both inputs are complemented with a change map derived from optical images acquired during the intervening period. The model then modifies the SAR data according to the changes observed in the optical data to generate the SAR data for the desired timestamp. Although similar strategies have been explored for the opposite SAR-to-Optical translation, our approach innovates by introducing new spatial evaluation metrics and cost functions. These metrics reveal that simply adding same-domain data as model input, without accounting for distribution changes in the dataset, can result in model overfitting, even if traditional metrics suggest positive outcomes. To address this issue, we introduce a change-weighted loss function that discourages the model from merely replicating the input data by assigning greater cost to errors in the changed areas of interest. Our approach surpasses traditional translation methods by eliminating the Generative Adversarial Network's (GAN's) hallucination phenomenon, learning to change the SAR data based on the optical data instead of relying solely on translation. Furthering the field, we also introduce a novel automated framework for building a despeckled multitemporal SAR–Optical dataset with consistent viewing geometry. We provide the code and the dataset used in our study.
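The change-weighted loss described in the abstract can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the paper's exact formulation: the function name, the binary change map, and the weight values are all hypothetical, chosen only to show how weighting changed pixels more heavily penalizes a model that simply copies its input SAR image.

```python
import numpy as np

def change_weighted_l1(pred_sar, true_sar, change_map, w_changed=10.0, w_static=1.0):
    """Weighted L1 loss (illustrative sketch, not the paper's exact form).

    Pixels flagged as changed in `change_map` receive a larger weight,
    so a model that merely replicates the input SAR image (leaving
    changed areas wrong) incurs a much higher cost than one that
    updates those areas.
    """
    # Per-pixel weights: w_changed where the optical change map fires, w_static elsewhere.
    weights = np.where(change_map > 0, w_changed, w_static)
    # Normalized weighted mean absolute error.
    return float(np.sum(weights * np.abs(pred_sar - true_sar)) / np.sum(weights))

# Toy 2x2 example: one pixel changed between the two timestamps.
true_sar = np.array([[1.0, 0.0], [0.0, 0.0]])   # target SAR at timestamp t2
input_sar = np.zeros((2, 2))                     # stale SAR from timestamp t1
change_map = np.array([[1, 0], [0, 0]])          # optical-derived change mask

perfect_loss = change_weighted_l1(true_sar, true_sar, change_map)  # 0.0
copy_loss = change_weighted_l1(input_sar, true_sar, change_map)    # 10/13, vs. 0.25 unweighted
```

In this toy example, copying the stale input yields a loss of 10/13 ≈ 0.77 instead of the unweighted mean error of 0.25, which is the intended effect: replication of the input becomes an expensive shortcut.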