Abstract
This paper addresses the task of colorizing a sketch image given a colored exemplar image. Conventional exemplar-based colorization methods transfer style from a reference image to a grayscale image by employing image analogy techniques or by establishing semantic correspondences. However, their practical utility is limited when semantic correspondences are elusive, as is the case with sketches: a sketch contains only the edge information of an object and typically carries considerable noise. To address this, we present a framework for exemplar-based sketch colorization that synthesizes a colored image from a sketch input and a reference input drawn from a distinct domain. Specifically, we propose a domain alignment network, in which dense semantic correspondence can be established, trained jointly with a simple but effective adversarial strategy that we term the structural and colorific conditions. Furthermore, we propose a self-attention mechanism for style transfer from exemplar to sketch, which facilitates the establishment of dense semantic correspondence; we term this the spatially corresponding semantic transfer module. We demonstrate the effectiveness of the proposed method on several sketch-related translation tasks through quantitative and qualitative evaluation.
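The core idea the abstract describes, establishing dense correspondence between sketch features and exemplar features via attention, and then warping exemplar colors onto the sketch, can be illustrated with a minimal sketch. This is not the paper's implementation: the function name, the use of cosine similarity, and the temperature value are illustrative assumptions, and a flattened feature-vector representation stands in for the networks' learned feature maps.

```python
import numpy as np

def attention_transfer(sketch_feat, ref_feat, ref_color, temperature=0.07):
    """Warp exemplar colors onto sketch positions via soft attention.

    sketch_feat: (N, C) features, one row per sketch position
    ref_feat:    (M, C) features, one row per exemplar position
    ref_color:   (M, 3) colors at the exemplar positions
    Returns (N, 3) colors, each a convex combination of exemplar colors.
    """
    # Normalize so the dot product is cosine similarity.
    s = sketch_feat / np.linalg.norm(sketch_feat, axis=1, keepdims=True)
    r = ref_feat / np.linalg.norm(ref_feat, axis=1, keepdims=True)
    # Pairwise similarity between every sketch and exemplar position.
    logits = (s @ r.T) / temperature  # temperature is an assumed value
    # Row-wise softmax gives the dense correspondence weights.
    w = np.exp(logits - logits.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    # Each sketch position receives a weighted blend of exemplar colors.
    return w @ ref_color
```

Because the softmax rows sum to one, every output color stays inside the convex hull of the exemplar colors; in the paper this transfer operates on learned feature maps inside the network rather than on raw color values.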
Funder
Opening Project of the Guangdong Province Key Laboratory of Computational Science at Sun Yat-Sen University
Subject
General Mathematics, Engineering (miscellaneous), Computer Science (miscellaneous)
Cited by 4 articles.