Abstract
By calculating the Kullback–Leibler divergence between two probability measures belonging to different exponential families dominated by the same measure, we obtain a formula that generalizes the ordinary Fenchel–Young divergence. Inspired by this formula, we define the duo Fenchel–Young divergence and report a majorization condition on its pair of strictly convex generators, which guarantees that this divergence is always non-negative. The duo Fenchel–Young divergence is also equivalent to a duo Bregman divergence. We show how to use these duo divergences by calculating the Kullback–Leibler divergence between densities of truncated exponential families with nested supports, and report a formula for the Kullback–Leibler divergence between truncated normal distributions. Finally, we prove that the skewed Bhattacharyya distances between truncated exponential families amount to equivalent skewed duo Jensen divergences.
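The paper derives a closed-form Kullback–Leibler divergence between truncated normal distributions with nested supports. As an illustrative sketch only (not the paper's closed-form formula), that divergence can be approximated by direct numerical integration over the narrower support; the function names and the trapezoidal scheme below are our own choices:

```python
from math import erf, sqrt
import numpy as np

def truncnorm_pdf(x, mu, sigma, a, b):
    """Density of a normal(mu, sigma^2) truncated to the interval [a, b]."""
    Phi = lambda t: 0.5 * (1.0 + erf((t - mu) / (sigma * sqrt(2.0))))
    Z = Phi(b) - Phi(a)  # normalizing mass of the truncation interval
    phi = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    return np.where((x >= a) & (x <= b), phi / Z, 0.0)

def kl_truncnorm(mu1, s1, a1, b1, mu2, s2, a2, b2, n=20001):
    """Numerical KL(p || q) between two truncated normals.

    Requires nested supports [a1, b1] subset of [a2, b2], so that the
    integrand p * log(p / q) is finite on the integration grid.
    """
    assert a2 <= a1 and b1 <= b2, "supports must be nested"
    x = np.linspace(a1, b1, n)
    p = truncnorm_pdf(x, mu1, s1, a1, b1)
    q = truncnorm_pdf(x, mu2, s2, a2, b2)
    f = p * np.log(p / q)
    # trapezoidal rule over the narrower support [a1, b1]
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))
```

For identical parameters and supports the estimate is zero, and it stays non-negative for differing parameters, consistent with the non-negativity guaranteed by the majorization condition discussed in the abstract.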
Subject
General Physics and Astronomy
Cited by 12 articles.