Author
Aczél J., Forte B., Ng C. T.
Abstract
The following properties of entropies, as measures of expected information, seem natural. The amount of information expected from an experiment does not change if we add outcomes of zero probability (expansibility). The expected information is symmetric in the (probabilities of the) outcomes. The information expected from a combination of two experiments is less than or equal to the sum of the information expected from the single experiments (subadditivity); equality holds here if the two experiments are independent (additivity). In this paper it is shown that linear combinations of the Shannon and Hartley entropies, and only these, have the above properties. The Shannon and Hartley entropies are also characterized individually.
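The properties named in the abstract are easy to check numerically. Below is a minimal sketch, not from the paper itself; the function names and test distribution are illustrative. It uses the standard definitions: Shannon entropy −Σ pₖ log₂ pₖ and, as the Hartley entropy of a distribution, log₂ of the number of outcomes with positive probability.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits; zero-probability outcomes contribute nothing."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def hartley_entropy(p):
    """Hartley entropy in bits: log2 of the number of positive-probability outcomes."""
    p = np.asarray(p, dtype=float).ravel()
    return float(np.log2(np.count_nonzero(p > 0)))

# Expansibility: appending an outcome of zero probability changes nothing.
assert shannon_entropy([0.5, 0.5]) == shannon_entropy([0.5, 0.5, 0.0])

# Joint distribution of two experiments as a matrix; marginals are row/column sums.
joint = np.array([[0.3, 0.1],
                  [0.2, 0.4]])
px, py = joint.sum(axis=1), joint.sum(axis=0)

# Subadditivity: H(joint) <= H(px) + H(py).
assert shannon_entropy(joint) <= shannon_entropy(px) + shannon_entropy(py)

# Additivity: equality when the experiments are independent
# (joint distribution = outer product of the marginals).
indep = np.outer(px, py)
assert np.isclose(shannon_entropy(indep),
                  shannon_entropy(px) + shannon_entropy(py))
```

Any linear combination a·Shannon + b·Hartley (a, b ≥ 0) inherits these properties; the paper's result is that such combinations are the only entropies possessing all of them.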
Publisher
Cambridge University Press (CUP)
Subject
Applied Mathematics, Statistics and Probability
Cited by
92 articles.