Affiliation:
1. School of Information Science and Technology, North China University of Technology, Beijing 100144, P. R. China
Abstract
To address the problem of clothing pattern style transfer, this paper proposes a digital simulation and re-editing method for clothing patterns based on deep learning and somatosensory interaction. First, the method encodes the black-and-white line drawing, generates random noise images through a diffusion process, introduces color information for synthesis, and reconstructs a colored image with a decoder. Next, an improved VGG19 model reconstructs content features and applies a linear color transformation to the style images; style transfer is then performed by constructing a Gram matrix, yielding colored clothing texture patterns. Finally, a Kinect V2 sensor is used for fabric simulation, overlaying the colored clothing texture patterns to achieve 3D virtual dressing. Experimental results show that, compared with existing algorithms, the proposed method improves the structural similarity index measure (SSIM) by 9–11% and the peak signal-to-noise ratio (PSNR) by 3–8%. The experiments demonstrate that the method effectively mitigates color overflow, delivers precise image coloring, and realistically restores clothing texture. Furthermore, it offers an improved garment fit that fulfills the user’s interaction requirements.
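The Gram matrix mentioned in the abstract is the standard style representation in neural style transfer: correlations between feature channels of a convolutional layer (e.g. from VGG19), matched between the generated and style images. The following is a minimal NumPy sketch of that idea only; the function names, the normalization constant, and the mean-squared style loss are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def gram_matrix(features):
    """Channel-correlation (Gram) matrix of a feature map.

    features: array of shape (C, H, W), e.g. activations from one
    VGG19 convolutional layer. Returns a (C, C) matrix whose entries
    capture texture/style statistics independent of spatial layout.
    """
    c, h, w = features.shape
    f = features.reshape(c, h * w)          # flatten spatial dims
    return f @ f.T / (c * h * w)            # normalized channel correlations

def style_loss(gen_features, style_features):
    """Mean-squared difference between the two Gram matrices."""
    g_gen = gram_matrix(gen_features)
    g_style = gram_matrix(style_features)
    return float(np.mean((g_gen - g_style) ** 2))
```

In a full style-transfer pipeline this loss would be summed over several VGG19 layers and combined with a content loss, then minimized with respect to the generated image.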
Funder
Beijing Social Science Foundation
Publisher
World Scientific Pub Co Pte Ltd
Subject
Artificial Intelligence, Computer Vision and Pattern Recognition, Software