Abstract
The Shannon entropy, one of the cornerstones of information theory, is widely used in physics, particularly in statistical mechanics. Yet its characterization and connection to physics remain vague, leaving ample room for misconceptions and misunderstanding. We will show that the Shannon entropy can be fully understood as measuring the variability of the elements within a given distribution: it characterizes how much variation can be found within a collection of objects. We will see that it is the only indicator that is continuous and linear, that it quantifies the number of yes/no questions (i.e. bits) that are needed to identify an element within the distribution, and we will see how applying this concept to statistical mechanics in different ways leads to the Boltzmann, Gibbs and von Neumann entropies.
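As an illustrative sketch (not part of the paper itself), the yes/no-question reading of the Shannon entropy can be checked numerically with the standard definition H(p) = -Σᵢ pᵢ log₂ pᵢ: a uniform distribution over 8 elements requires exactly log₂ 8 = 3 binary questions to pin down an element, while a less variable distribution requires fewer on average.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over 8 equally likely elements:
# identifying one element takes exactly 3 yes/no questions
# (halving the candidates each time), and H = log2(8) = 3 bits.
print(shannon_entropy([1/8] * 8))  # 3.0

# A skewed distribution exhibits less variability, so fewer
# questions are needed on average: here H = 1.75 bits.
print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
```

The second example works out by hand as 0.5·1 + 0.25·2 + 0.125·3 + 0.125·3 = 1.75 bits, matching the optimal average number of yes/no questions for that distribution.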
Funder
MCubed, University of Michigan
Subject
General Physics and Astronomy
Cited by
10 articles.