Abstract
Source coding maps elements from an information source to a sequence of alphabetic symbols so that the source symbols can be recovered exactly from the coded binary units. In this paper, we derive an approach that incorporates the information variation of a source into source coding, making it more realistic than the standard formulation. We employ the Shannon entropy for coding the sequences of a source, and the approach is also helpful for short sequences, where the central limit theorem does not apply. We rely on a quantifier of the information variation of a source: the second central moment of the random variable that measures the information content of a source symbol, together with its standard deviation. The approach also provides an interpretation of typical sequences. We illustrate it with a binary memoryless source, conduct Monte Carlo simulation studies to evaluate its performance, and apply it to two real datasets related to purity and wheat prices in Brazil.
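The information variation described in the abstract is the second central moment of the information content I(x) = -log2 p(x) of a source symbol, a quantity often called the varentropy. The following minimal Python sketch computes it alongside the Shannon entropy for a binary memoryless source; the symbol probability P(1) = 0.2 is an illustrative assumption, not a value taken from the paper.

    import math

    def entropy_and_varentropy(probs):
        """Shannon entropy H (bits/symbol) and information variation V,
        the second central moment of the information content
        I(x) = -log2 p(x): V = sum_x p(x) * (I(x) - H)**2."""
        pairs = [(p, -math.log2(p)) for p in probs if p > 0]
        H = sum(p * i for p, i in pairs)             # E[I(X)]
        V = sum(p * (i - H) ** 2 for p, i in pairs)  # Var[I(X)]
        return H, V

    # Binary memoryless source; P(1) = 0.2 is an illustrative choice.
    H, V = entropy_and_varentropy([0.2, 0.8])
    print(f"H = {H:.4f} bits, V = {V:.4f} bits^2, sd = {V ** 0.5:.4f} bits")

For this source, H is about 0.722 bits per symbol and the standard deviation of the information content is 0.8 bits, which is the kind of spread the approach accounts for when coding short sequences.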
Funder
National Council for Scientific and Technological Development
Subject
General Mathematics, Engineering (miscellaneous), Computer Science (miscellaneous)
Cited by
1 article.