Abstract
In this work, we introduce a novel estimator of the predictive risk for Poisson data when the loss function is the Kullback–Leibler divergence, in order to define a rule for choosing the regularization parameter in the expectation maximization (EM) algorithm. To this aim, we prove a Poisson counterpart of Stein's lemma for Gaussian variables, and from this result we derive the proposed estimator, showing its analogies with the well-known Stein's unbiased risk estimator valid for a quadratic loss. We prove that the proposed estimator is asymptotically unbiased as the number of measured counts increases, under certain mild conditions on the regularization method. We show that these conditions are satisfied by the EM algorithm under the hypothesis that the underlying matrix has positive entries, and we then apply the estimator to select the EM optimal reconstruction. We present some numerical tests in the case of image deconvolution, comparing the performance of the proposed estimator with that of other methods available in the literature, in both the inverse-crime and non-inverse-crime settings.
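As a rough illustration of the setting described above, the sketch below runs standard EM (Richardson–Lucy) iterations for Poisson data y ~ Poisson(Ax) with a matrix A having positive entries, and selects the iterate that minimizes a generic risk hook. The hook shown (a plain KL data-fit term) and all names are illustrative assumptions standing in for the paper's predictive-risk estimator, whose formula is not given in this abstract.

```python
import numpy as np

def em_poisson(A, y, n_iter=200, risk_estimate=None):
    """EM (Richardson-Lucy) iterations for Poisson data y ~ Poisson(A x).

    `risk_estimate(y, A @ x)` is a placeholder hook where a risk estimator
    (such as the one proposed in the paper) would be plugged in to pick the
    optimal iterate; without it, the last iterate is returned.
    """
    m, n = A.shape
    x = np.ones(n)                       # positive initial guess
    col_sums = A.sum(axis=0)             # A^T 1, positive since A has positive entries
    best_x, best_risk = x.copy(), np.inf
    for _ in range(n_iter):
        Ax = np.maximum(A @ x, 1e-12)
        x = x * (A.T @ (y / Ax)) / col_sums   # multiplicative EM update
        if risk_estimate is not None:
            r = risk_estimate(y, A @ x)
            if r < best_risk:
                best_risk, best_x = r, x.copy()
    return best_x if risk_estimate is not None else x

def kl_data_fit(y, mu):
    # Naive KL divergence between data and model mean; a stand-in only,
    # NOT the predictive-risk estimator introduced in the paper.
    mu = np.maximum(mu, 1e-12)
    return np.sum(mu - y + y * np.log(np.maximum(y, 1e-12) / mu))

# Toy example with a positive matrix and simulated Poisson counts.
rng = np.random.default_rng(0)
A = rng.uniform(0.1, 1.0, size=(50, 30))
x_true = rng.uniform(0.5, 2.0, size=30)
y = rng.poisson(A @ x_true).astype(float)
x_hat = em_poisson(A, y, risk_estimate=kl_data_fit)
```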
Subject
Applied Mathematics, Computer Science Applications, Mathematical Physics, Signal Processing, Theoretical Computer Science
Cited by
3 articles.