Abstract
Two Rényi-type generalizations of the Shannon cross-entropy, the Rényi cross-entropy and the Natural Rényi cross-entropy, were recently used as loss functions for the improved design of generative adversarial networks (GANs) in deep learning. In this work, we derive the Rényi and Natural Rényi differential cross-entropy measures in closed form for a wide class of common continuous distributions belonging to the exponential family, and we tabulate the results for ease of reference. We also summarise the Rényi-type cross-entropy rates between stationary Gaussian processes and between finite-alphabet time-invariant Markov sources.
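As a concrete illustration of the quantity the abstract refers to: the Rényi differential cross-entropy of order α ≠ 1 between densities p and q is commonly defined as H_α(p; q) = (1/(1 − α)) log ∫ p(x) q(x)^{α−1} dx, which recovers the Shannon differential cross-entropy as α → 1. The sketch below is an illustrative numerical check under that definition, not code from the paper (the function names are our own); it evaluates the integral for two univariate Gaussians on a grid and compares the near-α = 1 value against the well-known Shannon closed form.

```python
import numpy as np


def renyi_cross_entropy(alpha, mu1, s1, mu2, s2):
    """Numerically evaluate H_alpha(p; q) = log(int p(x) q(x)^(alpha-1) dx) / (1 - alpha)
    for p = N(mu1, s1^2) and q = N(mu2, s2^2), with alpha != 1.

    Uses a simple trapezoidal rule on a wide, fine grid; valid only when the
    integral converges (for Gaussians: 1/s1^2 + (alpha - 1)/s2^2 > 0).
    """
    x = np.linspace(-20.0, 20.0, 400001)
    dx = x[1] - x[0]
    p = np.exp(-(x - mu1) ** 2 / (2 * s1 ** 2)) / np.sqrt(2 * np.pi * s1 ** 2)
    q = np.exp(-(x - mu2) ** 2 / (2 * s2 ** 2)) / np.sqrt(2 * np.pi * s2 ** 2)
    vals = p * q ** (alpha - 1)
    # Trapezoidal rule written out explicitly (portable across NumPy versions).
    integral = dx * (vals.sum() - 0.5 * (vals[0] + vals[-1]))
    return np.log(integral) / (1 - alpha)


def shannon_cross_entropy(mu1, s1, mu2, s2):
    """Closed-form Shannon differential cross-entropy between two Gaussians (in nats):
    H(p; q) = 0.5 * ln(2*pi*s2^2) + (s1^2 + (mu1 - mu2)^2) / (2*s2^2)."""
    return 0.5 * np.log(2 * np.pi * s2 ** 2) + (s1 ** 2 + (mu1 - mu2) ** 2) / (2 * s2 ** 2)
```

For α close to 1 the numerical value should agree with the Shannon closed form, e.g. `renyi_cross_entropy(0.999, 0.0, 1.0, 1.0, 2.0)` versus `shannon_cross_entropy(0.0, 1.0, 1.0, 2.0)`.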
Funder
Natural Sciences and Engineering Research Council
Subject
General Physics and Astronomy
References (22 articles)
1. Rényi, A. On measures of entropy and information. Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, 1961.
2. Verdú, S. α-Mutual Information. Proceedings of the 2015 Information Theory and Applications Workshop (ITA), 2015.
3. RGAN: Rényi Generative Adversarial Network.
4. The Case for Shifting the Rényi Entropy.
5. Least kth-Order and Rényi Generative Adversarial Networks.
Cited by 3 articles.