Abstract
Streaming of 360° content is gaining attention as an immersive way to experience live events remotely. However, live capture is presently limited to 2D content due to the prohibitive computational cost associated with multi-camera rigs. In this work we present Vortex, a system that directly captures streaming 3D virtual reality content. Our approach does not suffer from spatial or temporal seams and natively handles phenomena that are challenging for existing systems, including refraction, reflection, transparency, and specular highlights. Vortex captures natively in the omni-directional stereo (ODS) format, which is widely supported by VR displays and streaming pipelines. We identify an important source of distortion inherent to the ODS format and demonstrate a simple means of correcting it. We include a detailed analysis of the design space, including tradeoffs between noise, frame rate, resolution, and hardware complexity. Processing is minimal, enabling live transmission of immersive, 3D, 360° content. We construct a prototype, demonstrate capture of 360° scenes at up to 8192 × 4096 pixels at 5 fps, and establish the viability of operation up to 32 fps.
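The ODS format mentioned in the abstract assigns each panorama pixel a viewing ray whose origin lies on a small circle (radius equal to half the interpupillary distance) rather than at a single point. The sketch below is a minimal illustration of this standard ODS ray model, not the paper's capture pipeline; the function name, coordinate conventions, and the equirectangular pixel mapping are assumptions chosen for clarity.

```python
import numpy as np

def ods_ray(u, v, width, height, eye, ipd=0.064):
    """Map an ODS panorama pixel to its viewing ray (origin, direction).

    u, v   : pixel column and row in an equirectangular panorama
             covering 360° horizontally and 180° vertically
    eye    : +1 for the right eye, -1 for the left eye
    ipd    : interpupillary distance in meters (0.064 m is a common default)

    Illustrative convention: y is up, azimuth theta sweeps the horizon.
    """
    r = ipd / 2.0
    theta = (u / width) * 2.0 * np.pi - np.pi       # azimuth in [-pi, pi)
    phi = np.pi / 2.0 - (v / height) * np.pi        # elevation in [-pi/2, pi/2]
    # Unit viewing direction on the sphere.
    d = np.array([np.sin(theta) * np.cos(phi),
                  np.sin(phi),
                  np.cos(theta) * np.cos(phi)])
    # Ray origin on the viewing circle, tangent to the horizontal
    # component of the direction (this tangency is what produces
    # stereo disparity everywhere on the horizon).
    o = eye * r * np.array([np.cos(theta), 0.0, -np.sin(theta)])
    return o, d
```

Because every column uses a different origin on the circle, no single pinhole view can produce an ODS image exactly, which is one reason live ODS capture is challenging for conventional multi-camera rigs.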
Funder
National Science Foundation
Intel Corporation
Publisher
Association for Computing Machinery (ACM)
Subject
Computer Graphics and Computer-Aided Design
Cited by 27 articles.