1. Ben-Tal, A., Ben-Israel, A., Teboulle, M.: Certainty equivalents and information measures: duality and extremal principles. J. Math. Anal. Appl. 157, 211–236 (1991)
2. Crooks, G.E.: On Measures of Entropy and Information, Tech. Note 009 v0.8. http://threeplusone.com/info (2021)
3. Csiszár, I.: Information-type measures of differences of probability distributions and indirect observations. Studia Sci. Math. Hung. 2, 299–318 (1967)
4. Csiszár, I., Körner, J.: Information Theory: Coding Theorems for Discrete Memoryless Systems. Academic Press, New York (1981)
5. Dragomir, S.S.: Upper and lower bounds for Csiszár $$f$$-divergence in terms of the Kullback-Leibler distance and applications. In: Dragomir, S.S. (ed.) Inequalities for the Csiszár $$f$$-divergence in Information Theory (2000). http://rgmia.vu.edu.au/monographs/csiszar.htm