Affiliation:
1. Department of Mathematics, University of Manchester, Manchester M13 9PL, UK
Abstract
Both variance and entropy are commonly used measures of uncertainty. There exist many cases in which the variance is infinite while the entropy remains finite. In this note, we derive an upper bound illustrating the relationship between the variance and the entropy of random variables belonging to a special class of distributions. We also derive, for a special class of distributions, an upper bound on the kth absolute central moment that is proportional to the entropy power.
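The contrast mentioned above can be checked numerically with two textbook facts (this sketch is illustrative only and not taken from the paper; the function names are our own): the standard Cauchy distribution has infinite variance but finite differential entropy h = log(4πγ) nats, while for a Gaussian the entropy power e^{2h}/(2πe) equals the variance exactly.

```python
import math

def cauchy_entropy(gamma: float = 1.0) -> float:
    """Differential entropy (nats) of a Cauchy distribution with scale gamma.
    Finite, even though the Cauchy distribution has infinite variance."""
    return math.log(4 * math.pi * gamma)

def gaussian_entropy(sigma: float = 1.0) -> float:
    """Differential entropy (nats) of N(0, sigma^2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

def entropy_power(h: float) -> float:
    """Entropy power e^(2h) / (2*pi*e); for a Gaussian it equals the variance."""
    return math.exp(2 * h) / (2 * math.pi * math.e)

# Cauchy: entropy is finite (log(4*pi) ~ 2.53 nats) while the variance diverges.
print(cauchy_entropy())
# Gaussian with sigma = 2: entropy power recovers the variance sigma^2 = 4.
print(entropy_power(gaussian_entropy(2.0)))
```

For a Gaussian the bound Var(X) ≥ e^{2h(X)}/(2πe) holds with equality, which is why entropy-power comparisons of the kind studied in this note are natural for distributions close to that class.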