Affiliations:
1. Department of Electrical and Computer Engineering, Inha University, Incheon 22212, Republic of Korea
2. Department of Mechanical Engineering, Seoul National University, Seoul 01811, Republic of Korea
Abstract
We propose an online dehazing method with sparse depth priors using an incremental Gaussian process (iGP). Conventional approaches focus on single-image dehazing using multiple color channels. On many robotics platforms, however, range measurements are directly available, albeit in a sparse form. This paper exploits direct, possibly sparse depth data to achieve efficient and effective dehazing of both color and grayscale images. Because the proposed algorithm does not rely on channel information, it works equally well for color and grayscale images; however, it additionally requires efficient depth-map estimation from the sparse depth priors. This paper therefore focuses on exploiting a highly sparse depth prior for online dehazing. For efficiency, we adopt an iGP for incremental depth-map estimation and dehazing. The depth priors are selected incrementally in an information-theoretic way by evaluating mutual information (MI) and other information-based metrics: at each update, only the most informative depth prior is added, and a haze-free image is reconstructed from the atmospheric scattering model using the incrementally estimated depth. The proposed method was validated in multiple scenarios: color images under synthetic fog, and real color and grayscale hazy scenes indoors, outdoors, and underwater.
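The reconstruction step described in the abstract can be sketched with the standard atmospheric scattering model, I = J·t + A·(1 − t) with transmission t = exp(−β·d): given a depth map d and an airlight estimate A, the scene radiance J is recovered by inverting the model. This is a minimal illustrative sketch, not the paper's implementation (which estimates depth incrementally with an iGP); the function name, the brightest-pixel airlight heuristic, and the dense-depth input are assumptions made here for demonstration.

```python
import numpy as np

def dehaze_with_depth(image, depth, airlight=None, beta=1.0, t_min=0.1):
    """Invert the atmospheric scattering model I = J*t + A*(1 - t),
    with transmission t = exp(-beta * depth). Works for color (H, W, 3)
    and grayscale (H, W) images with pixel values in [0, 1]."""
    image = image.astype(np.float64)
    if airlight is None:
        # Crude airlight guess: mean of the brightest 0.1% of pixels.
        flat = image.reshape(-1, image.shape[-1]) if image.ndim == 3 else image.reshape(-1, 1)
        k = max(1, int(0.001 * flat.shape[0]))
        idx = np.argsort(flat.mean(axis=1))[-k:]
        airlight = flat[idx].mean(axis=0)
    # Transmission from depth; lower-bound it to avoid amplifying noise.
    t = np.clip(np.exp(-beta * depth), t_min, 1.0)
    if image.ndim == 3:
        t = t[..., None]            # broadcast over color channels
    radiance = (image - airlight) / t + airlight
    return np.clip(radiance, 0.0, 1.0)
```

Since the model is inverted per pixel, the same function serves color and grayscale inputs, mirroring the channel-independence claimed in the abstract.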
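The information-theoretic prior selection can likewise be sketched: for a Gaussian process, the entropy reduction from adding a measurement grows with its predictive variance, so greedily adding the highest-gain candidate approximates MI-based selection. The RBF kernel, the hyperparameters, and the variance-based gain below are illustrative assumptions; the paper's exact iGP update and MI criterion may differ.

```python
import numpy as np

def rbf_kernel(X1, X2, ell=1.0, sf2=1.0):
    """Squared-exponential kernel between two point sets of shape (n, d)."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return sf2 * np.exp(-0.5 * d2 / ell**2)

def greedy_select(X_cand, n_pick, noise=1e-2, ell=1.0):
    """Greedily pick candidate depth priors with the largest entropy
    reduction 0.5*log(1 + var/noise), a surrogate for MI gain."""
    picked = []
    prior_var = rbf_kernel(X_cand[:1], X_cand[:1], ell)[0, 0]
    for _ in range(n_pick):
        if picked:
            Xs = X_cand[picked]
            K = rbf_kernel(Xs, Xs, ell) + noise * np.eye(len(picked))
            kx = rbf_kernel(X_cand, Xs, ell)          # (N, m)
            sol = np.linalg.solve(K, kx.T)            # (m, N)
            # GP predictive variance: k(x,x) - k_x K^{-1} k_x^T (diagonal).
            var = prior_var - np.einsum('ij,ji->i', kx, sol)
        else:
            var = np.full(len(X_cand), prior_var)     # no data yet
        gain = 0.5 * np.log1p(np.maximum(var, 0.0) / noise)
        gain[picked] = -np.inf                        # never re-pick a point
        picked.append(int(np.argmax(gain)))
    return picked
```

On a 1-D grid of candidates, the first pick is arbitrary (all gains are equal) and the second lands far from the first, since predictive variance is highest where the GP is least informed; this is the behavior the abstract's "most informative depth prior" selection relies on.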
Funder
Inha University
Korea Institute of Marine Science and Technology Promotion
Ministry of Oceans and Fisheries
National Research Foundation of Korea (NRF) grant funded by the Korea government
Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government
National Research Council of Science & Technology under the R&D Program of Ministry of Science, ICT and Future Planning
Subject
Electrical and Electronic Engineering; Biochemistry; Instrumentation; Atomic and Molecular Physics, and Optics; Analytical Chemistry
References: 41 articles.
Cited by
2 articles.