Affiliation
1. The Chinese University of Hong Kong, HK SAR, China
2. Hunan University, China
Abstract
This paper presents a new approach for 3D shape generation, inversion, and manipulation through direct generative modeling of a continuous implicit representation in the wavelet domain. Specifically, we propose a compact wavelet representation with a pair of coarse and detail coefficient volumes to implicitly represent 3D shapes via truncated signed distance functions and multi-scale biorthogonal wavelets. We then design a pair of neural networks: a diffusion-based generator to produce diverse shapes in the form of coarse coefficient volumes, and a detail predictor to produce compatible detail coefficient volumes that introduce fine structures and details. Further, we can jointly train an encoder network to learn a latent space for inverting shapes, enabling a rich variety of whole-shape and region-aware shape manipulations. Both quantitative and qualitative experimental results demonstrate the compelling shape generation, inversion, and manipulation capabilities of our approach over state-of-the-art methods.
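The representation step described above can be sketched with off-the-shelf tools. The snippet below is a minimal illustration, not the paper's implementation: it assumes the bior6.8 biorthogonal wavelet, a 64^3 toy TSDF grid, a truncation value of 0.1, and PyWavelets' wavedecn/waverecn for the multi-scale decomposition; the paper's actual filters, grid resolution, and coefficient handling may differ.

```python
# Sketch of the compact wavelet representation idea: decompose a
# truncated signed distance field (TSDF) volume into a coarse
# coefficient volume plus detail coefficient volumes with a
# biorthogonal wavelet. Wavelet family, resolution, and truncation
# value are illustrative assumptions, not the paper's exact settings.
import numpy as np
import pywt

def tsdf_to_wavelet_coeffs(tsdf, wavelet="bior6.8", levels=3):
    """Multi-scale 3D wavelet decomposition of a TSDF volume.

    Returns the coarse coefficient volume and a list of per-level
    detail coefficient dicts (keys like 'aad', 'ada', ..., 'ddd').
    """
    coeffs = pywt.wavedecn(tsdf, wavelet=wavelet, level=levels)
    return coeffs[0], coeffs[1:]

def wavelet_coeffs_to_tsdf(coarse, details, wavelet="bior6.8"):
    """Invert the decomposition back to a TSDF volume."""
    return pywt.waverecn([coarse] + list(details), wavelet=wavelet)

if __name__ == "__main__":
    # Toy TSDF of a sphere on a 64^3 grid, truncated to [-0.1, 0.1].
    res, trunc = 64, 0.1
    grid = np.linspace(-1.0, 1.0, res)
    x, y, z = np.meshgrid(grid, grid, grid, indexing="ij")
    sdf = np.sqrt(x**2 + y**2 + z**2) - 0.5
    tsdf = np.clip(sdf, -trunc, trunc)

    coarse, details = tsdf_to_wavelet_coeffs(tsdf)
    # Zeroing the detail volumes and reconstructing from the coarse
    # volume alone mimics the generator's output before the detail
    # predictor fills the detail coefficient volumes back in.
    zeroed = [{k: np.zeros_like(v) for k, v in d.items()} for d in details]
    coarse_only = wavelet_coeffs_to_tsdf(coarse, zeroed)
    print("coarse volume shape:", coarse.shape)
    print("max reconstruction error (coarse only):",
          np.abs(coarse_only - tsdf).max())
```

This mirrors the division of labor in the abstract: the diffusion-based generator operates only on the compact coarse volume, while the detail predictor restores the high-frequency coefficients.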
Funder
Research Grants Council of the Hong Kong Special Administrative Region
Publisher
Association for Computing Machinery (ACM)
Subject
Computer Graphics and Computer-Aided Design
Cited by
2 articles.