Abstract
According to Shannon, a message x is a random event. Let p(x) be the probability of occurrence of the event x. If $$p(x)=0$$, the event does not occur; if $$p(x)=1$$, the event must occur. When $$p(x)=0$$ or $$p(x)=1$$, the information x can be called trivial information, or spam information. The real mathematical significance of information x therefore lies in its uncertainty, that is, $$0<p(x)<1$$. Quantitative research on the uncertainty of nontrivial information constitutes the starting point of Shannon's theory; this quantity is now called the information quantity or information entropy, or entropy for short. Shannon and his colleagues at Bell Laboratories took the "bit" as the basic quantitative unit of information. What is a "bit"? It can be understood simply as one digit in the binary system. According to Shannon, an n-digit binary string can express up to $$2^{n}$$ numbers, and from the point of view of probability and statistics, each of these $$2^{n}$$ numbers occurs with probability $$\frac{1}{2^{n}}$$. Therefore, one bit is the amount of information contained in an event x with probability $$\frac{1}{2}$$. Taking this as the starting point, Shannon defined the self-information I(x) contained in a message x as $$I(x)=-\log_{2}p(x)$$.
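To make the definition concrete, here is a minimal Python sketch (not part of the original text; the function name self_information is our own) that computes $$I(x)=-\log_{2}p(x)$$ and checks the two facts above: an event of probability $$\frac{1}{2}$$ carries exactly one bit, and each of the $$2^{n}$$ equally likely n-digit binary strings carries n bits.

```python
import math

def self_information(p: float) -> float:
    """Self-information -log2(p), in bits, of an event with probability p."""
    # Shannon's definition applies to nontrivial events: 0 < p < 1.
    if not 0.0 < p < 1.0:
        raise ValueError("self-information requires 0 < p < 1")
    return -math.log2(p)

# An event with probability 1/2 carries exactly one bit:
print(self_information(0.5))        # 1.0

# Each of the 2**n equally likely n-digit binary strings has
# probability 1/2**n, so observing one of them carries n bits:
n = 8
print(self_information(2.0 ** -n))  # 8.0
```

Note that I(x) grows as p(x) shrinks: rarer events carry more information, which matches the intuition that certainty (p = 0 or p = 1) conveys nothing.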