Author:
Wang Sen, Wang Runxiao, Zuo Xinxin, Yu Weiwei
Abstract
In recent years, Time-of-Flight (ToF) sensors have had a significant impact on research and industrial fields because they capture depth easily. In dynamic scenes, however, the working principle of ToF sensors, which requires fusing phase images captured at different times, can produce significant artifacts. We therefore propose an efficient method that combines motion compensation with kernel-density-estimation-based multi-frequency phase unwrapping. First, the raw multi-phase images are captured and the optical flow between the frequencies is computed. Second, multiple depth hypotheses are generated and ranked by a spatial kernel density estimate over the wrapped phase images. Finally, an accurate depth map is obtained from the fused phase image. The algorithm is validated on a Kinect V2, with the pixel-wise stages optimized on the GPU. Experiments on real datasets show that the method achieves superior performance in real time.
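The hypothesis-and-ranking idea in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the function names, the per-pixel (rather than spatial, neighborhood-based) density, the Gaussian bandwidth, and the example frequencies are illustrative assumptions, and the motion-compensation step is omitted. For each modulation frequency, every integer wrap count yields a candidate depth; candidates from different frequencies that agree on the true depth cluster together, so a kernel density estimate over all candidates peaks at the correct unwrapped depth.

```python
import numpy as np

C = 299792458.0  # speed of light, m/s

def depth_hypotheses(phi, freq, max_wraps):
    """Candidate depths for wrapped phase phi at modulation frequency freq.
    Each integer wrap count n gives d = c * (phi + 2*pi*n) / (4*pi*freq)."""
    n = np.arange(max_wraps)
    return C * (phi + 2.0 * np.pi * n) / (4.0 * np.pi * freq)

def rank_hypotheses(phi_lo, phi_hi, f_lo, f_hi, max_wraps=8, bandwidth=0.05):
    """Rank joint depth hypotheses from two frequencies with a Gaussian kernel.
    Candidates whose depths from both frequencies coincide get high density.
    (Illustrative per-pixel version; a spatial KDE would pool neighbors too.)"""
    cands = np.concatenate([
        depth_hypotheses(phi_lo, f_lo, max_wraps),
        depth_hypotheses(phi_hi, f_hi, max_wraps),
    ])
    diffs = cands[:, None] - cands[None, :]
    density = np.exp(-0.5 * (diffs / bandwidth) ** 2).sum(axis=1)
    return cands[np.argmax(density)]

# Toy example: simulate wrapped phases for a true depth of 3.2 m
# at two assumed modulation frequencies (16 MHz and 80 MHz).
true_d = 3.2
phi_lo = (4.0 * np.pi * 16e6 * true_d / C) % (2.0 * np.pi)
phi_hi = (4.0 * np.pi * 80e6 * true_d / C) % (2.0 * np.pi)
print(rank_hypotheses(phi_lo, phi_hi, 16e6, 80e6))
```

The 80 MHz phase alone wraps every c/(2f) ≈ 1.87 m, so a depth of 3.2 m is ambiguous; combining it with the 16 MHz candidates via the density score recovers the unwrapped value.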