Abstract
Two families of dependence measures between random variables are introduced. They are based on the Rényi divergence of order α and the relative α-entropy, respectively, and both dependence measures reduce to Shannon's mutual information when their order α is one. The first measure shares many properties with the mutual information, including the data-processing inequality, and can be related to the optimal error exponents in composite hypothesis testing. The second measure does not satisfy the data-processing inequality, but appears naturally in the context of distributed task encoding.
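To make the limiting behavior concrete, here is a minimal numerical sketch. It computes the Rényi divergence of order α for discrete distributions and checks that, as α approaches one, it recovers the Kullback-Leibler divergence between a joint distribution and the product of its marginals, i.e. Shannon's mutual information. Note that this illustrates only the α → 1 limit with a simple divergence between the joint and the product of marginals; the paper's actual dependence measures involve their own specific definitions, and the joint pmf below is an arbitrary example.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(p || q) of order alpha != 1
    for discrete distributions given as probability vectors."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q), the alpha -> 1 limit."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sum(p * np.log(p / q))

# Example joint pmf of (X, Y) -- arbitrary, for illustration only.
p_xy = np.array([[0.3, 0.1],
                 [0.2, 0.4]])
p_x = p_xy.sum(axis=1)                 # marginal of X
p_y = p_xy.sum(axis=0)                 # marginal of Y
prod = np.outer(p_x, p_y)              # product of marginals

# Shannon mutual information I(X;Y) = D(P_XY || P_X x P_Y).
mi = kl_divergence(p_xy.ravel(), prod.ravel())

# Rényi divergence with alpha close to 1 approaches the same value.
near_one = renyi_divergence(p_xy.ravel(), prod.ravel(), 1.0001)
print(mi, near_one)
```

For α close to one the two values agree to several decimal places, reflecting the stated reduction to Shannon's mutual information at α = 1.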
Subject
General Physics and Astronomy

Cited by
10 articles.