ChromaFlash: Snapshot Hyperspectral Imaging Using Rolling Shutter Cameras

Authors:

Dhruv Verma¹, Ian Ruffolo¹, David B. Lindell¹, Kiriakos N. Kutulakos¹, Alex Mariakakis¹

Affiliation:

1. University of Toronto, Toronto, Canada

Abstract

Hyperspectral imaging captures scene information across narrow, contiguous bands of the electromagnetic spectrum. Despite its proven utility in industrial and biomedical applications, its widespread adoption has been limited by bulky form factors, slow capture times, and prohibitive costs. In this work, we propose a generalized approach to snapshot hyperspectral imaging that requires only a standard rolling shutter camera and wavelength-adjustable lighting. The crux of this approach is to use the rolling shutter as a spatiotemporal mask, varying the incoming light faster than the camera's frame rate so that the captured image contains rows of pixels illuminated at different wavelengths. An image reconstruction pipeline then converts this coded image into a complete hyperspectral image using sparse optimization. We demonstrate the feasibility of this approach by deploying a low-cost system called ChromaFlash, which uses a smartphone camera for image acquisition and a series of LEDs to change the scene's illumination. We evaluated ChromaFlash through simulations on two public hyperspectral datasets and assessed its spatial and spectral accuracy across various system parameters. We also tested the real-world performance of our prototype by capturing diverse scenes under varied ambient lighting conditions. In both experiments, ChromaFlash outperformed state-of-the-art techniques that use deep learning to convert RGB images into hyperspectral ones, achieving snapshot performance not demonstrated by prior attempts at accessible hyperspectral imaging.
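
To make the coding scheme concrete, below is a minimal, self-contained Python sketch of the forward model the abstract describes: the active illuminant changes faster than the frame rate, so consecutive rows of a single rolling-shutter frame are exposed under different LED spectra. All names (illum_spectra, row_to_illum, and so on), the band and LED counts, and the block-wise least-squares recovery are illustrative assumptions for this toy example, not details drawn from the paper; ChromaFlash itself reconstructs the full hyperspectral image with sparse optimization.

# Toy simulation of rolling-shutter spectral coding (illustrative sketch only).
import numpy as np

rng = np.random.default_rng(0)

H, W = 64, 64     # spatial resolution of the simulated scene
B = 4             # spectral bands in the hyperspectral cube
L = 4             # switchable illuminants (e.g., narrowband LEDs)

# Ground-truth reflectance cube, kept constant over each block of L rows so
# the toy per-block recovery below is well posed.
cube = np.repeat(rng.random((H // L, W, B)), L, axis=0)

# Emission spectrum of each illuminant across the B bands (L x B).
illum_spectra = rng.random((L, B))

# Rolling-shutter coding: the active LED changes from one row to the next,
# cycling through all L illuminants every L rows.
row_to_illum = np.tile(np.arange(L), H // L)

# Forward model for a single coded snapshot: each pixel integrates the scene
# reflectance weighted by the spectrum of the LED lit during its row exposure.
coded = np.einsum('hwb,hb->hw', cube, illum_spectra[row_to_illum])

# Toy recovery: within each block of L rows, the L coded measurements at a
# given column form a small linear system in the B unknown band values.
recovered = np.zeros_like(cube)
for i in range(0, H, L):
    A = illum_spectra[row_to_illum[i:i + L]]     # (L x B) mixing matrix
    for j in range(W):
        x, *_ = np.linalg.lstsq(A, coded[i:i + L, j], rcond=None)
        recovered[i:i + L, j, :] = x

print('relative error:', np.linalg.norm(recovered - cube) / np.linalg.norm(cube))

Holding the reflectance constant over each block of L rows makes the toy linear system square and solvable exactly; in the actual method, that simplifying assumption is replaced by sparsity priors applied over the whole image during reconstruction.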

Funder

Natural Sciences and Engineering Research Council of Canada

Ontario Research Fund

Canada Foundation for Innovation

Publisher

Association for Computing Machinery (ACM)
