Abstract
Mesh texture synthesis is a key component in the automatic generation of 3D content. Existing learning-based methods have drawbacks: they either disregard the shape manifold during texture generation or require a large number of different views to mitigate occlusion-related inconsistencies. In this paper, we present a novel surface-aware approach for mesh texture synthesis that overcomes these drawbacks by leveraging the pre-trained weights of 2D Convolutional Neural Networks (CNNs) with the same architecture, but with convolutions designed for 3D meshes. Our proposed network keeps track of the oriented patches surrounding each texel, enabling seamless texture synthesis and retaining local similarity to classical 2D convolutions with square kernels. Our approach allows us to synthesize textures that account for the geometric content of mesh surfaces, eliminating discontinuities and achieving quality comparable to 2D image synthesis algorithms. We compare our approach with state-of-the-art methods where, through qualitative and quantitative evaluations, we demonstrate that it is more effective for a variety of meshes and styles, while also producing visually appealing and consistent textures on meshes.
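The core mechanism the abstract describes, reusing pre-trained 2D convolution weights over oriented patches gathered around each texel on the surface, can be illustrated with a short sketch. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names, the `(T, 3, 3)` `neighbor_index` layout (a precomputed table mapping each texel to the texel ids occupying its oriented 3x3 patch, obtained by walking the surface in the tangent frame), and the random stand-in data are all hypothetical.

```python
# Hypothetical sketch of a surface-aware convolution: per-texel oriented
# 3x3 patches are gathered, then ordinary pre-trained 2D kernel weights
# are applied to each patch exactly as a square-kernel conv would.
import numpy as np

def gather_oriented_patch(features, neighbor_index, texel):
    """Collect the oriented 3x3 neighbourhood of one texel.

    features:       (T, C_in) per-texel feature vectors.
    neighbor_index: (T, 3, 3) precomputed texel ids around each texel
                    (assumed given; built by walking the tangent frame).
    Returns a (3, 3, C_in) patch.
    """
    return features[neighbor_index[texel]]

def surface_aware_conv(features, neighbor_index, weight, bias):
    """Apply a reused 2D conv kernel over oriented surface patches.

    weight: (C_out, C_in, 3, 3) kernel taken from a pre-trained 2D CNN.
    bias:   (C_out,)
    Returns (T, C_out) per-texel outputs.
    """
    T = features.shape[0]
    out = np.empty((T, weight.shape[0]), dtype=features.dtype)
    for t in range(T):
        patch = gather_oriented_patch(features, neighbor_index, t)
        # Sum over the kernel window (i, j) and input channels c.
        out[t] = np.einsum('ijc,ocij->o', patch, weight) + bias
    return out

# Tiny usage example with random data standing in for a real mesh atlas.
rng = np.random.default_rng(0)
T, C_in, C_out = 16, 3, 8
features = rng.standard_normal((T, C_in)).astype(np.float32)
neighbor_index = rng.integers(0, T, size=(T, 3, 3))
weight = rng.standard_normal((C_out, C_in, 3, 3)).astype(np.float32)
bias = np.zeros(C_out, dtype=np.float32)
print(surface_aware_conv(features, neighbor_index, weight, bias).shape)  # (16, 8)
```

Because each patch is indexed in a fixed tangent-frame orientation, the gathered neighbourhood is locally equivalent to a square 2D image patch, which is what makes reusing unmodified 2D CNN weights plausible.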