Entropy: from Thermodynamics to Information Theory
Affiliations: University of São Paulo; Unesp; Versatus HPC

Abstract:

Entropy is a concept that dates back to the 19th century, when it was associated with the heat used by a thermal machine to perform work, in the context of the Industrial Revolution. The 20th century saw an unprecedented scientific revolution, and one of the essential innovations of that time was Information Theory, which also has a concept of entropy. A natural question arises: “what is the difference, if any, between the entropies used in each field?” The concept is often misused, which has led to misconceptions about the theme. There have been attempts to reconcile the entropy of thermodynamics with that of information theory. The most common approach is to define entropy as “disorder”; however, this is not a good analogy, since “order” is a subjective human concept, and “disorder” is not a quantity that entropy always measures. Another approach relates complexity to entropy. Computer science and statistics have addressed the problem of complexity by means of Kolmogorov Complexity. Again, the abstraction level of the concept can lead researchers from other areas in which the study of complexity plays an essential role, such as biology and chemistry, to misunderstand it. In this paper, we present the historical background for the evolution of “entropy”, together with mathematical proofs and logical arguments for the interconnection of the concept across various areas of science.
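
For context, the two quantities the abstract contrasts have standard textbook forms; the formulas below are supplied as a reference sketch and are not quoted from the paper itself. The thermodynamic (Gibbs) entropy over microstates with probabilities $p_i$ is $S = -k_B \sum_{i} p_i \ln p_i$, where $k_B$ is the Boltzmann constant, while the Shannon entropy of information theory over symbols with probabilities $p_i$ is $H = -\sum_{i} p_i \log_2 p_i$. Up to the constant $k_B$ and the choice of logarithm base, the two expressions share the same mathematical form, which is the formal basis for the interconnection the paper examines.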

Keywords: Thermodynamics; Information Theory; Statistics; Complexity; Kolmogorov Complexity