Entropy: The Evolution of a Concept
University at Albany (SUNY), Albany, NY, USA

Abstract:

Entropy has been, and continues to be, one of the most misunderstood physical concepts. This is because entropy quantifies one's state of ignorance about a system rather than some property of the system itself. In this talk I will begin by looking back at the history of entropy, tracing the evolution of thought from Carnot's generalized heat engine and Lord Kelvin's temperature scale, through Clausius' entropy, to Boltzmann's counting of microstates. This evolution then took significant leaps with the introduction of Shannon's information and with Jaynes, who, recognizing that the problem was one of inductive inference, introduced the principle of maximum entropy. The concept of entropy continues to evolve, as demonstrated by the relation between entropy and the relevance of questions. As a result, the future holds great promise as information-theoretic and entropic methods are justifiably and confidently applied to new problems in domains far beyond thermodynamics, statistical mechanics, and communication theory. We will also see entropic techniques employed in new technologies, such as question-asking machines.

Keywords: entropy; information theory
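To make the abstract's central claims concrete, the following minimal Python sketch (not part of the talk itself) computes the Shannon entropy of a discrete distribution and illustrates the maximum-entropy idea attributed to Jaynes: with no constraint beyond normalization, the uniform distribution over n outcomes maximizes the entropy, attaining H = log n. The function name and example distributions are illustrative choices, not drawn from the source.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i), in nats."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 * log(0) = 0
    return -np.sum(p * np.log(p))

# With only the normalization constraint, the maximum-entropy
# distribution over n outcomes is uniform, giving H = log(n).
n = 6
uniform = np.full(n, 1.0 / n)
biased = np.array([0.5, 0.2, 0.1, 0.1, 0.05, 0.05])

print(shannon_entropy(uniform), np.log(n))  # both ~1.7918 nats
print(shannon_entropy(biased))              # strictly smaller
```

Here entropy measures ignorance about which outcome will occur: the less we presume to know (the flatter the distribution), the larger H becomes.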