Affiliation:
1. Mathematical Institute, Freie Universität Berlin, Berlin, Germany
Abstract
Jensen's inequality is ubiquitous in measure and probability theory, statistics, machine learning, information theory, and many other areas of mathematics and data science. It states that, for any convex function $f \colon K \to \mathbb{R}$ defined on a convex domain $K \subseteq \mathbb{R}^d$ and any random variable $X$ taking values in $K$, $\mathbb{E}[f(X)] \geq f(\mathbb{E}[X])$. In this paper, sharp upper and lower bounds on $\mathbb{E}[f(X)]$, termed 'graph convex hull bounds', are derived for arbitrary functions $f$ on arbitrary domains $K$, thereby extensively generalizing Jensen's inequality. The derivation of these bounds necessitates the investigation of the convex hull of the graph of $f$, which can be challenging for complicated functions. On the other hand, once these inequalities are established, they hold, just like Jensen's inequality, for any $K$-valued random variable $X$. These bounds are therefore of particular interest in cases where $f$ is relatively simple and $X$ is complicated or unknown. Both finite- and infinite-dimensional domains and codomains of $f$ are covered, as well as analogous bounds for conditional expectations and Markov operators.
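As a quick illustration of the inequality the abstract states (this is a numerical sanity check only, not part of the paper; the choice of $f = \exp$ and a Gaussian $X$ is an arbitrary assumption for the example), one can verify by Monte Carlo that $\mathbb{E}[f(X)] \geq f(\mathbb{E}[X])$ for a convex $f$:

```python
import random
import math

# Jensen's inequality: for convex f and a random variable X,
# E[f(X)] >= f(E[X]). Here f = exp (convex on R) and X ~ N(0, 1).
random.seed(0)
f = math.exp

samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]
mean_f = sum(f(x) for x in samples) / len(samples)  # Monte Carlo estimate of E[f(X)]
f_mean = f(sum(samples) / len(samples))             # f applied to the sample mean, i.e. f(E[X])

# For exp and a standard Gaussian, E[f(X)] = e^{1/2} ~ 1.65 while f(E[X]) ~ 1.
print(mean_f >= f_mean)  # prints True
```

The gap between the two sides is exactly what the paper's graph convex hull bounds control from both above and below, for functions $f$ that need not be convex.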
Funder
Deutsche Forschungsgemeinschaft