Abstract
Fluorescence lifetime imaging (FLI) provides unique quantitative information in biomedical and molecular biology studies but relies on complex data-fitting techniques to derive the quantities of interest. Herein, we propose a fit-free approach to FLI image formation that is based on deep learning (DL) to quantify fluorescence decays simultaneously over a whole image and at fast speeds. We report on a deep neural network (DNN) architecture, named fluorescence lifetime imaging network (FLI-Net), that is designed and trained for different classes of experiments, including visible FLI and near-infrared (NIR) FLI microscopy (FLIM) and NIR gated macroscopy FLI (MFLI). FLI-Net quantitatively outputs the spatially resolved lifetime-based parameters that are typically employed in the field. We validate the utility of the FLI-Net framework by performing quantitative microscopic and preclinical lifetime-based studies across the visible and NIR spectra, as well as across the two main data acquisition technologies. These results demonstrate that FLI-Net is well suited to accurately quantify complex fluorescence lifetimes in cells and, in real time, in intact animals without any parameter settings. Hence, FLI-Net paves the way to reproducible and quantitative lifetime studies at unprecedented speeds, for improved dissemination and impact of FLI in many important biomedical applications ranging from fundamental discoveries in molecular and cellular biology to clinical translation.
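As a conceptual illustration only, and not the published FLI-Net code, the sketch below shows how a small convolutional network could map a stack of per-pixel time-resolved decay histograms to spatially resolved lifetime parameter maps (for example two lifetimes and a fractional amplitude). The class name LifetimeNetSketch, the layer sizes, the output parameterization, and the use of PyTorch are all illustrative assumptions.

# Minimal sketch of a fit-free, per-pixel lifetime estimator (illustrative only).
import torch
import torch.nn as nn

class LifetimeNetSketch(nn.Module):
    def __init__(self):
        super().__init__()
        # Temporal feature extraction: 3D convolutions whose kernels span only
        # the time axis, so each pixel's decay curve is processed independently.
        self.temporal = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=(9, 1, 1), stride=(2, 1, 1)),
            nn.ReLU(),
            nn.Conv3d(16, 16, kernel_size=(9, 1, 1), stride=(2, 1, 1)),
            nn.ReLU(),
        )
        # Per-pixel regression head: 1x1 2D convolutions produce three
        # parameter maps (e.g., tau_1, tau_2, fractional amplitude).
        self.head = nn.Sequential(
            nn.Conv2d(16, 32, kernel_size=1),
            nn.ReLU(),
            nn.Conv2d(32, 3, kernel_size=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, n_time_gates, height, width) time-resolved image stack
        f = self.temporal(x)   # (batch, 16, reduced_time, H, W)
        f = f.mean(dim=2)      # pool away the residual time axis -> (batch, 16, H, W)
        return self.head(f)    # (batch, 3, H, W) spatially resolved parameter maps

if __name__ == "__main__":
    model = LifetimeNetSketch()
    decays = torch.rand(1, 1, 256, 28, 28)  # synthetic 28x28 field of view, 256 time gates
    print(model(decays).shape)              # torch.Size([1, 3, 28, 28])

In a sketch of this kind, a single forward pass yields whole-image parameter maps, which is what allows lifetime quantification without per-pixel iterative curve fitting.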
Funder
HHS | NIH | National Institute of Biomedical Imaging and Bioengineering
HHS | NIH | National Cancer Institute
Publisher
Proceedings of the National Academy of Sciences
Cited by
122 articles.