Abstract
Ancient murals are important cultural heritage for the exploration of ancient civilizations and are of great research value. Due to long-term exposure to the environment, ancient murals often suffer from deterioration such as cracks, scratches, corrosion, paint loss, and even large regions falling off. Protecting and restoring these damaged ancient murals is an urgent task. Mural inpainting techniques virtually fill the deteriorated regions by reconstructing the structural and textural elements of the mural images. Most existing mural inpainting approaches fail to fill missing content that contains complex structures and diverse patterns because they neglect the importance of structure guidance. In this paper, we propose a structure-guided two-branch model based on the generative adversarial network (GAN) for ancient mural inpainting. In the proposed model, the inpainting process is divided into two stages: structure reconstruction and content restoration, carried out by a structure reconstruction network (SRN) and a content restoration network (CRN), respectively. In the structure reconstruction stage, the SRN employs gated convolution and Fast Fourier Convolution (FFC) residual blocks to reconstruct the missing structures of the damaged murals. In the content restoration stage, the CRN uses the structures generated by the SRN to guide the restoration of the missing mural content. We design a two-branch parallel encoder to improve the texture and color restoration quality in the missing regions of the murals. Moreover, we propose a cascade attention module that captures long-range dependencies in the deep features, which helps alleviate texture blur and color bias. We conduct experiments on both simulated and real damaged murals and compare our inpainting results with four other competitive approaches. Experimental results show that our proposed model outperforms the other approaches in terms of texture clarity, color consistency, and structural continuity of the restored mural images. In addition, the mural inpainting results of our model achieve comparatively high quantitative evaluation metrics.
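The two-stage pipeline described in the abstract can be pictured roughly as follows. This is a minimal PyTorch sketch written for illustration only: the class names, channel widths, and single gated-convolution blocks are assumptions standing in for the paper's actual SRN (with its FFC residual blocks) and CRN (with its two-branch parallel encoder and cascade attention module), which are not specified here.

```python
# Minimal sketch of a structure-guided, two-stage inpainting pipeline.
# All layer counts and channel sizes are illustrative assumptions.
import torch
import torch.nn as nn


class GatedConv2d(nn.Module):
    """Gated convolution: a feature branch modulated by a learned soft mask."""
    def __init__(self, in_ch, out_ch, kernel_size=3, stride=1, padding=1):
        super().__init__()
        self.feature = nn.Conv2d(in_ch, out_ch, kernel_size, stride, padding)
        self.gate = nn.Conv2d(in_ch, out_ch, kernel_size, stride, padding)

    def forward(self, x):
        return torch.relu(self.feature(x)) * torch.sigmoid(self.gate(x))


class StructureReconstructionNet(nn.Module):
    """Stage 1 (SRN, simplified): predicts a structure map for the missing regions."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            GatedConv2d(4, 32),            # damaged mural (3 ch) + mask (1 ch)
            GatedConv2d(32, 32),           # placeholder for FFC residual blocks
            nn.Conv2d(32, 1, 3, padding=1),
            nn.Sigmoid(),
        )

    def forward(self, damaged, mask):
        return self.net(torch.cat([damaged, mask], dim=1))


class ContentRestorationNet(nn.Module):
    """Stage 2 (CRN, simplified): restores texture/color guided by the structure map."""
    def __init__(self):
        super().__init__()
        # Two parallel encoder branches: one for the damaged image, one for structure.
        self.img_branch = GatedConv2d(4, 32)
        self.struct_branch = GatedConv2d(1, 32)
        self.decoder = nn.Sequential(
            nn.Conv2d(64, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, damaged, mask, structure):
        f_img = self.img_branch(torch.cat([damaged, mask], dim=1))
        f_struct = self.struct_branch(structure)
        return self.decoder(torch.cat([f_img, f_struct], dim=1))


# Usage: fill only the masked (damaged) region, keeping known pixels unchanged.
damaged = torch.rand(1, 3, 256, 256)
mask = (torch.rand(1, 1, 256, 256) > 0.8).float()          # 1 = missing pixel
structure = StructureReconstructionNet()(damaged * (1 - mask), mask)
restored = ContentRestorationNet()(damaged * (1 - mask), mask, structure)
output = damaged * (1 - mask) + restored * mask
```

The composite step at the end reflects the usual inpainting convention of keeping undamaged pixels from the input and taking the network output only inside the mask; the paper's adversarial training, losses, and attention mechanism are omitted from this sketch.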
Funder
Postgraduate Research and Innovation Foundation of Yunnan University
National Natural Science Foundation of China
Applied Basic Research Project of Yunnan Province
Publisher
Springer Science and Business Media LLC
Subject
Archeology, Conservation, Computer Science Applications, Materials Science (miscellaneous), Chemistry (miscellaneous), Spectroscopy
Cited by
12 articles.