The first dependable thermodynamic laws -- formulated in the early 19th century -- were those of Fourier and Navier-Stokes for heat conduction and viscous friction. They were purely phenomenological and did not require an understanding of the nature of heat, let alone the concepts of energy and entropy.
Those concepts emerged in the works of Mayer, Joule, Helmholtz, Clausius and Boltzmann in the latter half of the 19th century and -- apart from making energy conversion more efficient -- they had a deep impact on natural philosophy: It was recognized that nature is governed by a universal conflict between determinacy, which forces energy into a minimum, and stochasticity, by which entropy tends to a maximum. The outcome of the competition is determined by temperature, so that minimal energy prevails in cold systems and maximal entropy in hot systems. Thus phenomena like phase transitions, planetary atmospheres, osmosis, phase diagrams and chemical reactions were all understood in terms of the universal competition of energy and entropy. Mixtures, alloys and solutions were systematically incorporated into thermodynamics by Gibbs. And so the early 20th century saw the development of a second vast field of industrial application -- apart from energy conversion -- the rectification of naturally occurring mixtures like natural gas and mineral oil.
Through the work of Eckart the phenomenological laws of Fourier, Navier-Stokes and Fick -- the latter for diffusion -- received a systematic derivation in terms of the Gibbs equation for the entropy. Thus equilibrium thermodynamics and the thermodynamics of irreversible processes had come to a kind of conclusion by the mid-20th century. At the same time the limits of those fields were recognized: They require fairly dense matter, which means that they cannot describe rapidly changing processes or processes with steep gradients.
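The skeleton of that derivation may be sketched in a few lines; the notation (heat flux q_i, deviatoric viscous stress sigma_ij, conductivity kappa, shear viscosity mu; bulk viscosity omitted for brevity) is generic textbook notation, not taken from a specific source. For a single viscous, heat-conducting fluid the Gibbs equation, combined with the balances of mass, momentum and energy, furnishes an entropy balance whose production density reads
\[
\Sigma = q_i\,\partial_i\!\left(\frac{1}{T}\right) + \frac{1}{T}\,\sigma_{\langle ij\rangle}\,\partial_{\langle i}v_{j\rangle} \;\ge\; 0,
\]
and the laws of Fourier and Navier-Stokes,
\[
q_i = -\kappa\,\partial_i T, \qquad \sigma_{\langle ij\rangle} = 2\mu\,\partial_{\langle i}v_{j\rangle},
\]
are precisely the linear constitutive relations that turn Sigma into a sum of squares. Fick's law emerges in the same manner from the diffusive contribution to the entropy production of a mixture.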
Some wishful thinking occurred concerning stationary processes: the hypothesis of Onsager about the mean regression of fluctuations, and Prigogine's principle of minimal entropy production.
Boltzmann's kinetic theory of gases provides macroscopic balance laws in its hierarchy of moments of the distribution function. These have furnished the blueprint for extended thermodynamics, which is capable of describing rapid changes and steep gradients, if enough moments are taken into account. The theory assumes that all field equations are balance laws with local and instantaneous constitutive relations for fluxes and productions. There follows a system of quasi-linear field equations, and the entropy principle ensures that the system is symmetric hyperbolic, if written in the proper variables. That property guarantees
- existence and uniqueness of solutions,
- continuous dependence of solutions on the initial data and
- finite speeds of characteristic waves.
That is to say: The entropy principle ensures well-posedness of initial value problems.
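The structure behind these statements can be indicated briefly. The notation below is the generic one of the moment method (moment densities F, productions P, an entropy density h, a main field u'); it is a sketch, not tied to any particular paper. The moment hierarchy of the Boltzmann equation consists of balance laws
\[
\partial_t F_{i_1\cdots i_n} + \partial_k F_{i_1\cdots i_n k} = P_{i_1\cdots i_n},
\qquad n = 0,1,\dots,N,
\]
in which the flux in each equation becomes the density of the next one. Closing the hierarchy at some order N and exploiting the entropy principle, one obtains an entropy density h and a privileged set of variables, the main field u'_A, with dh = u'_A dF_A. Written in the main field, the closed system assumes the form
\[
A^0(u')\,\partial_t u' + A^k(u')\,\partial_k u' = P(u'),
\]
with symmetric matrices A^0, A^k and a definite A^0 (with the appropriate sign convention for h) -- the symmetric hyperbolic form from which the three listed properties follow.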
Extended thermodynamics permits the calculation of light scattering spectra down to very rarefied gases. Moreover, it offers a deep insight into the structure of shocks and, above all, it reveals the very narrow limitations of linear irreversible thermodynamics based on the laws of Fourier, Fick and Navier-Stokes. Thus, for instance, extended thermodynamics shows that a gas in the gap between two cylinders cannot rotate rigidly if heat conduction occurs between the inner and outer surface. Moreover, the temperature field in the gap does not obey the Fourier law, even if there is no rotation.
Where can I find the discussion about the gas in the coaxial cylinders which cannot rotate?
Andrea
Thank you for your kind words.
The impossibility of a rigid rotation of a heat-conducting gas between two cylinders is proved in a paper by
E. Barbera and I. Müller: Acta Mechanica 184 (2006), pp. 205-216.
There was a preliminary short version by
E. Barbera and I. Müller: In: Rational continua, classical and new. Springer, Milano (2002), pp. 1-10.
A corollary is rather surprising: A gas cannot be at rest between the cylinders in a rotating frame.
Best wishes, IM&WW
Thank you for your presentation. You made a comment on slide 4 about the Helmholtz function, A=E-TS, assigning energy minimization to deterministic evolution and entropy maximization to stochastic evolution. I have some questions regarding these assertions and how they relate to the doctrine of driving forces.
One can obtain A = E - TS as a Legendre transformation of the fundamental potential, E = TS - pV + sum_i F_i X_i. This follows the historical attempts to cast thermodynamic potentials in the form of general dynamic laws (Gibbs, Duhem, Massieu, etc.). In doing so, one seeks a potential in which the independent variables for variations around an equilibrium state become T and others, S now playing the role of a generalized force. Without prior conceptions about entropy from statistical physics, aren't we simply to interpret S and A in the language of potentials and forces? I don't see how the distinction between determinism and stochasticity enters the picture. If natural changes require a decrease of A, it is clear that there would be a combination of TS above which A_2/A_1 would be negative. You introduce the doctrine of driving forces and fluxes later (slide 8); could you comment on the relevance of this slide to the competition between energy and TS you discussed before?
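For concreteness, the transformation I have in mind is the textbook one (a sketch for a simple system with dE = T dS - p dV; the symbols are standard and not taken from your slides):
\[
A(T,V) = E - TS, \qquad dA = -S\,dT - p\,dV,
\]
so that S = -(\partial A/\partial T)_V appears formally as the quantity conjugate to T, just like a generalized force conjugate to a displacement.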
Thanks,
Ben Akih.
Our statement that "if there is no working, minimal energy and maximal entropy are conducive to equilibrium" is nothing else than an interpretation in words of the statement that E - T_0 S -> minimum in equilibrium. This is an immediate consequence of the first and second laws for the case when the heating is not zero.
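The step may be indicated in two lines (a sketch, where \dot{Q} denotes the heating of the body through its surface, which is kept at the temperature T_0): with no working, the first law gives dE/dt = \dot{Q}, and the second law gives dS/dt >= \dot{Q}/T_0. Hence
\[
\frac{d}{dt}\left(E - T_0 S\right) = \dot{Q} - T_0\,\frac{dS}{dt} \;\le\; 0,
\]
so that E - T_0 S can only decrease and assumes a minimum in equilibrium.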
Now why have we called the trend toward minimal energy deterministic and the trend toward maximal entropy stochastic? In answer to that question we say this, always referring to E - T_0 S -> minimum:
If T_0 -> 0, thermodynamics becomes mechanics, and in mechanics the forces attract the system toward minimal energy. E.g. gravitation attracts all air molecules to the surface of the earth; we call that drive "deterministic", which surely is a good word for it.
If T_0 is large, entropy kicks in. And what is entropy? Entropy is a measure for the number of (a priori equiprobable) realizations of a state, which -- "stochastically" -- tends to a maximum. E.g. the entropy (number of realizations) of the atmosphere is biggest when the molecules are evenly distributed throughout space.
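That statement can be made quantitative in one line (a standard sketch with generic symbols: N molecules distributed over Z equal cells of space, N_z of them in cell z). The number of realizations of the distribution {N_z} is
\[
W = \frac{N!}{\prod_{z=1}^{Z} N_z!}, \qquad S = k \ln W,
\]
and W -- hence the entropy S -- is largest when all occupation numbers N_z are equal, i.e. for the even distribution.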
All of this has nothing to do with thermodynamic forces and thermodynamic fluxes. Those come later, when our lecture proceeds to local irreversible phenomena rather than the determination of global equilibria of bodies.
Legendre transforms do not enter here, since neither E nor S are state functions, nor is T_0 -- the surface temperature -- a variable in E or S.
Best wishes, IM & WW