Abstract
We present Motion2Fusion, a state-of-the-art 360° performance capture system that enables *real-time* reconstruction of arbitrary non-rigid scenes. We provide three major contributions over prior work: 1) a new non-rigid fusion pipeline that allows far more faithful reconstruction of high-frequency geometric detail, avoiding the over-smoothing and visual artifacts observed previously; 2) a high-speed pipeline coupled with a machine learning technique for 3D correspondence field estimation, reducing the tracking errors and artifacts attributed to fast motion; and 3) a backward and forward non-rigid alignment strategy that deals more robustly with topology changes while remaining free of scene priors. Our performance capture system demonstrates real-time results with close to a 3x speed-up over previous state-of-the-art work on the same GPU hardware. Extensive quantitative and qualitative comparisons show more precise geometry and texturing, with fewer artifacts from fast motion or topology changes than prior art.
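The abstract only summarizes the pipeline; the sketch below is a minimal, conceptual per-frame loop implied by it (correspondence estimation, backward and forward non-rigid alignment, then fusion). It is an illustration under stated assumptions, not the paper's implementation: `estimate_correspondences` here is a nearest-neighbor stand-in for the learned 3D correspondence field, `nonrigid_align` is a single damped step rather than a full non-rigid solver, and the final pooling stands in for volumetric fusion.

```python
import numpy as np

def estimate_correspondences(source, target):
    """Hypothetical stand-in for the learned 3D correspondence field:
    match each source point to its nearest target point."""
    dists = np.linalg.norm(source[:, None, :] - target[None, :, :], axis=2)
    return target[np.argmin(dists, axis=1)]

def nonrigid_align(points, targets, step=0.5):
    """Hypothetical stand-in for one non-rigid alignment pass:
    a single damped step of each point toward its correspondence."""
    return points + step * (targets - points)

def process_frame(model_points, frame_points):
    # Backward pass: warp the fused model toward the incoming depth frame.
    warped_model = nonrigid_align(
        model_points, estimate_correspondences(model_points, frame_points))
    # Forward pass: warp the frame toward the warped model, giving a second,
    # independent alignment when topology changes make one direction unreliable.
    warped_frame = nonrigid_align(
        frame_points, estimate_correspondences(frame_points, warped_model))
    # Placeholder "fusion": pool the aligned points (the real system fuses
    # aligned depth observations into a volumetric model).
    return np.vstack([warped_model, warped_frame])

# Toy usage with random point clouds standing in for the model and a depth frame.
model = np.random.rand(200, 3)
frame = model + 0.01 * np.random.randn(200, 3)
fused = process_frame(model, frame)
```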
Publisher
Association for Computing Machinery (ACM)
Subject
Computer Graphics and Computer-Aided Design
Cited by
122 articles.