Affiliation:
1. University of Toronto, Canada
2. University of Toronto, Adobe Research, Canada
Abstract
Neural implicit representations, which encode a surface as the level set of a neural network applied to spatial coordinates, have proven to be remarkably effective for optimizing, compressing, and generating 3D geometry. Although these representations are easy to fit, it is not clear how to best evaluate geometric queries on the shape, such as intersecting against a ray or finding a closest point. The predominant approach is to encourage the network to have a signed distance property. However, this property typically holds only approximately, leading to robustness issues, and holds only at the conclusion of training, inhibiting the use of queries in loss functions. Instead, this work presents a new approach to perform queries directly on general neural implicit functions for a wide range of existing architectures. Our key tool is the application of range analysis to neural networks, using automatic arithmetic rules to bound the output of a network over a region; we conduct a study of range analysis on neural networks, and identify variants of affine arithmetic which are highly effective. We use the resulting bounds to develop geometric queries including ray casting, intersection testing, constructing spatial hierarchies, fast mesh extraction, closest-point evaluation, evaluating bulk properties, and more. Our queries can be efficiently evaluated on GPUs, and offer concrete accuracy guarantees even on randomly-initialized networks, enabling their use in training objectives and beyond. We also show a preliminary application to inverse rendering.
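The key tool described in the abstract can be illustrated with a minimal sketch of bound propagation through a small ReLU multilayer perceptron, given below in Python/NumPy. It uses plain interval arithmetic, a simpler relative of the affine-arithmetic variants the paper identifies as most effective; the layer sizes, weights, and function names are hypothetical placeholders, not the authors' implementation.

# Minimal sketch (not the paper's code) of interval-arithmetic range analysis
# for a small ReLU MLP implicit function f: R^3 -> R. Given an axis-aligned
# box of inputs, it returns guaranteed lower/upper bounds on the network
# output over that box. All sizes and weights here are hypothetical.
import numpy as np

def interval_affine(lo, hi, W, b):
    """Propagate the input box [lo, hi] through x -> W @ x + b."""
    center = (lo + hi) / 2.0
    radius = (hi - lo) / 2.0
    out_center = W @ center + b
    out_radius = np.abs(W) @ radius  # worst-case spread of the affine map
    return out_center - out_radius, out_center + out_radius

def interval_relu(lo, hi):
    """ReLU is monotone, so it maps interval endpoints to endpoints."""
    return np.maximum(lo, 0.0), np.maximum(hi, 0.0)

def bound_mlp(box_lo, box_hi, layers):
    """Bound an MLP given as a list of (W, b) pairs with ReLU in between."""
    lo, hi = box_lo, box_hi
    for i, (W, b) in enumerate(layers):
        lo, hi = interval_affine(lo, hi, W, b)
        if i < len(layers) - 1:  # no activation after the final layer
            lo, hi = interval_relu(lo, hi)
    return lo, hi

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical randomly-initialized 3 -> 32 -> 32 -> 1 network.
    dims = [3, 32, 32, 1]
    layers = [(rng.normal(scale=1.0 / np.sqrt(dims[i]),
                          size=(dims[i + 1], dims[i])),
               np.zeros(dims[i + 1]))
              for i in range(len(dims) - 1)]
    lo, hi = bound_mlp(np.array([-0.1, -0.1, -0.1]),
                       np.array([0.1, 0.1, 0.1]), layers)
    # If lo > 0 or hi < 0, the level set f = 0 provably does not cross the box,
    # which is the building block for ray casting, hierarchies, etc.
    print("output bounds over the box:", lo, hi)

Bounds of this kind hold even for randomly-initialized networks; the affine-arithmetic variants studied in the paper tighten them by tracking correlations between inputs through the layers, rather than treating each coordinate independently.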
Funder
Canada Research Chairs
NSERC
Ontario Early Researcher Award Program
New Frontiers in Research Fund
Fields Institute for Research in Mathematical Sciences
Publisher
Association for Computing Machinery (ACM)
Subject
Computer Graphics and Computer-Aided Design
Cited by
28 articles.