Abstract
Entropy is a concept that emerged in the 19th century, when it was associated with the heat harnessed by a thermal machine to perform work during the Industrial Revolution. The 20th century, however, brought an unprecedented scientific revolution through one of its most essential innovations, information theory, which also encompasses a concept of entropy. A question therefore arises naturally: what is the difference, if any, between the concepts of entropy in each field of knowledge? Misconceptions persist, as there have been multiple attempts to reconcile the entropy of thermodynamics with that of information theory. Entropy is most commonly described as “disorder”, but this is a poor analogy, since “order” is a subjective human notion and “disorder” cannot always be inferred from entropy. This paper therefore presents a historical background on the evolution of the term “entropy” and provides mathematical evidence and logical arguments regarding its interconnection across scientific areas, with the aim of offering a theoretical review and reference material for a broad audience.
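For orientation, a minimal sketch of the formal parallel the abstract alludes to, using the standard Gibbs and Shannon expressions rather than any derivation from the paper itself: the statistical-mechanical entropy over microstate probabilities $p_i$ and the information-theoretic entropy over message probabilities $p_i$ share the same functional form, differing only by the Boltzmann constant and the choice of logarithm base,
\[
  S = -k_{\mathrm{B}} \sum_{i} p_i \ln p_i,
  \qquad
  H = -\sum_{i} p_i \log_2 p_i,
  \qquad
  S = (k_{\mathrm{B}} \ln 2)\, H,
\]
where the last identity holds when the same probability distribution is used in both expressions.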
Funder
Fundação de Amparo à Pesquisa do Estado de São Paulo
Conselho Nacional de Desenvolvimento Científico e Tecnológico
Subject
General Physics and Astronomy