Abstract
Head-mounted displays (HMDs) are becoming increasingly popular as a crucial component of virtual reality (VR). However, the constrained form factor of contemporary HMDs forces a simple optical structure, precluding the multi-element lens designs that typically reduce aberrations. As a result, HMDs introduce severe aberrations and imperfections into the optical imagery, causing visual fatigue and degrading the sense of presence in VR. To address this issue without modifying the hardware, we present what is, to the best of our knowledge, a novel software-driven approach that compensates for HMD aberrations in real time. Our approach pre-corrects the input image by deconvolution, minimizing the difference between its after-lens image and the ideal image. We characterize the specific wavefront aberration and point spread function (PSF) of the optical system using Zernike polynomials. To achieve higher computational efficiency, we improve the conventional deconvolution based on a hyper-Laplacian prior by adopting a regularization constraint term based on L2 optimization and the input-image gradient. Furthermore, we implement our solution entirely on a graphics processing unit (GPU) to ensure constant and scalable real-time performance for interactive VR. Our experiments demonstrate that the proposed solution reliably reduces the aberrations of after-lens images in real time.
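The pre-correction pipeline described above can be illustrated with a minimal sketch: build a PSF from a Zernike wavefront via Fourier optics, then pre-correct an image with a frequency-domain deconvolution. The defocus term, grid size, and regularization weight below are illustrative assumptions, and the L2-regularized (Wiener-like) inverse filter is a simplified stand-in for the paper's hyper-Laplacian-prior deconvolution with a gradient-based constraint.

```python
import numpy as np


def zernike_defocus(rho):
    """Zernike Z_2^0 (defocus) term, sqrt(3) * (2*rho^2 - 1)."""
    return np.sqrt(3.0) * (2.0 * rho ** 2 - 1.0)


def psf_from_wavefront(n=64, defocus_waves=0.5):
    """Build a PSF from a pupil wavefront via Fourier optics
    (illustrative: a single defocus term on a circular pupil)."""
    y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
    rho = np.hypot(x, y)
    pupil = (rho <= 1.0).astype(float)            # circular aperture
    w = defocus_waves * zernike_defocus(rho)      # wavefront error in waves
    field = pupil * np.exp(2j * np.pi * w)        # complex pupil function
    psf = np.abs(np.fft.fftshift(np.fft.fft2(field))) ** 2
    return psf / psf.sum()                        # normalize to unit energy


def blur(image, psf):
    """Simulate the after-lens image (circular convolution via FFT)."""
    H = np.fft.fft2(np.fft.ifftshift(psf), s=image.shape)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * H))


def precorrect(image, psf, reg=1e-2):
    """Pre-correct the input so its after-lens image approximates the
    ideal image; an L2-regularized inverse filter, simpler than the
    paper's hyper-Laplacian-prior solver."""
    H = np.fft.fft2(np.fft.ifftshift(psf), s=image.shape)
    G = np.conj(H) / (np.abs(H) ** 2 + reg)       # Wiener-like filter
    return np.real(np.fft.ifft2(np.fft.fft2(image) * G))
```

Displaying `precorrect(image, psf)` instead of `image` means the lens itself applies the blur, so the viewer sees something closer to the intended image; the regularizer trades residual blur for ringing and keeps the inverse filter stable where the optical transfer function is near zero.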
Funder
Ministry of Science and ICT, South Korea