Abstract
Within exponential families, which may comprise multi-parameter and multivariate distributions, a variety of divergence measures, such as the Kullback–Leibler divergence, the Cressie–Read divergence, the Rényi divergence, and the Hellinger metric, can be expressed explicitly in terms of the respective cumulant function and mean value function. Moreover, the same applies to related entropy and affinity measures. We compile representations scattered in the literature and present a unified approach to their derivation within exponential families. As a statistical application, we highlight their use in the construction of confidence regions in a multi-sample setup.
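To make the abstract's claim concrete, here is a sketch in generic exponential-family notation (the symbols θ, T, ψ below are assumptions chosen for illustration, not necessarily the paper's own notation): for densities of the form p_θ(x) = h(x) exp{θ⊤T(x) − ψ(θ)}, with cumulant function ψ and mean value function ∇ψ, the Kullback–Leibler and Rényi divergences reduce to expressions in ψ and ∇ψ alone:

% Notation (theta, T, psi) is generic and assumed for illustration only.
% Kullback--Leibler divergence as the Bregman divergence generated by the
% cumulant function psi:
\[
D_{\mathrm{KL}}(p_{\theta_1} \,\|\, p_{\theta_2})
  = \psi(\theta_2) - \psi(\theta_1)
    - (\theta_2 - \theta_1)^{\top} \nabla\psi(\theta_1),
\]
% where \nabla\psi(\theta_1) = \mathrm{E}_{\theta_1}[T(X)] is the mean value
% function evaluated at theta_1.
%
% Renyi divergence of order alpha, valid whenever the mixed parameter
% \alpha\theta_1 + (1-\alpha)\theta_2 lies in the natural parameter space:
\[
D_{\alpha}(p_{\theta_1} \,\|\, p_{\theta_2})
  = \frac{\psi\bigl(\alpha\theta_1 + (1-\alpha)\theta_2\bigr)
          - \alpha\,\psi(\theta_1) - (1-\alpha)\,\psi(\theta_2)}{\alpha - 1}.
\]

The Hellinger case is recovered from the Rényi expression at α = 1/2, and the Cressie–Read power divergences arise as monotone transformations of the same cumulant expression.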
Subject
General Physics and Astronomy