Authors:
Bertlmann, Reinhold A.; Friis, Nicolai
Abstract
In this first chapter of Part III we discuss the concept of entropy in classical systems, starting with an exposition of entropy in thermodynamics following Clausius and Boltzmann, before examining entropy in the microcanonical and canonical ensembles of statistical physics and stating Jaynes' principle. This is contrasted with a discussion of Shannon's entropy in classical information theory, which quantifies the information content of a message. We then move on to more refined entropic quantities such as the relative entropy, also called the Kullback-Leibler divergence, as well as the joint entropy, conditional entropy, and mutual information. We explicitly prove various properties of, relations between, and bounds on these quantities, including Gibbs' inequality and the subadditivity of the joint entropy, and we illustrate the relations between these entropies in information diagrams. Finally, we consider the family of generalized entropic quantities called the Rényi entropies and Rényi divergences, and their respective hierarchies.
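The quantities named in the abstract admit compact numerical illustrations. The following sketch (not taken from the chapter; function names and the toy distributions are our own) computes the Shannon entropy, the relative entropy, the mutual information via H(X) + H(Y) - H(X,Y), and the Rényi entropy for finite probability distributions, using base-2 logarithms so that entropies are measured in bits:

```python
import math

def shannon_entropy(p):
    # H(p) = -sum_i p_i log2 p_i, with the convention 0 log 0 = 0
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def relative_entropy(p, q):
    # Kullback-Leibler divergence D(p||q) = sum_i p_i log2(p_i / q_i);
    # defined as +infinity if q_i = 0 while p_i > 0
    total = 0.0
    for pi, qi in zip(p, q):
        if pi > 0:
            if qi == 0:
                return math.inf
            total += pi * math.log2(pi / qi)
    return total

def mutual_information(pxy):
    # I(X:Y) = H(X) + H(Y) - H(X,Y), from a joint distribution
    # given as a matrix pxy[x][y]
    px = [sum(row) for row in pxy]
    py = [sum(col) for col in zip(*pxy)]
    joint = [p for row in pxy for p in row]
    return shannon_entropy(px) + shannon_entropy(py) - shannon_entropy(joint)

def renyi_entropy(p, alpha):
    # Rényi entropy H_alpha(p) = log2(sum_i p_i^alpha) / (1 - alpha),
    # for alpha >= 0, alpha != 1; it recovers H(p) as alpha -> 1
    return math.log2(sum(pi ** alpha for pi in p if pi > 0)) / (1 - alpha)

# A fair coin carries one bit of entropy, and a perfectly correlated
# pair of coins has one bit of mutual information:
print(shannon_entropy([0.5, 0.5]))                        # 1.0
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))       # 1.0
# Gibbs' inequality: D(p||q) >= 0, with equality iff p = q
print(relative_entropy([0.5, 0.5], [0.5, 0.5]))           # 0.0
```

The decomposition I(X:Y) = H(X) + H(Y) - H(X,Y) used above is exactly the relation visualized in the information diagrams mentioned in the abstract, and the non-negativity of the relative entropy printed last is Gibbs' inequality.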
Publisher
Oxford University Press, Oxford