Pentcho Valev

2017-05-18 07:16:13 UTC

The thermodynamic (Clausius's) entropy has been defined for systems in equilibrium. Yet in ninety-nine percent of the cases it is applied to non-equilibrium and even ill-defined systems and processes (black holes, Big Bang events, etc.), which means that the respective interpretations make no sense (are not even wrong):

"Max Planck wrote that the phrase 'entropy of the universe' has no meaning because it admits of no accurate definition. More recently, Grandy writes: "It is rather presumptuous to speak of the entropy of a universe about which we still understand so little, and we wonder how one might define thermodynamic entropy for a universe and its major constituents that have never been in equilibrium in their entire existence." According to Tisza: "If an isolated system is not in equilibrium, we cannot associate an entropy with it." Buchdahl writes of "the entirely unjustifiable assumption that the universe can be treated as a closed thermodynamic system". According to Gallavotti: "... there is no universally accepted notion of entropy for systems out of equilibrium, even when in a stationary state." Discussing the question of entropy for non-equilibrium states in general, Lieb and Yngvason express their opinion as follows: "Despite the fact that most physicists believe in such a nonequilibrium entropy, it has so far proved impossible to define it in a clearly satisfactory way." https://en.wikipedia.org/wiki/Heat_death_of_the_universe

Unfortunately, even for equilibrium systems, statements involving the term "entropy" make no sense (are not even wrong). The reason is that entropy is not a state function. If you define the entropy S as a quantity obeying the equation dS = dQ_rev/T, you will find that, so defined, the entropy is a STATE FUNCTION FOR AN IDEAL GAS. Clausius was very impressed by this state-function property and set out to prove that the entropy (so defined) is a state function for ANY system. Thus "Entropy is a state function" became a fundamental theorem of thermodynamics. Clausius deduced it from the assumption that any reversible cycle can be decomposed into small Carnot cycles, and to this day that deduction remains the only justification of "Entropy is a state function":
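The ideal-gas case conceded above can at least be checked directly: for an ideal gas, the integral of dQ_rev/T between two states comes out the same along different reversible paths. A minimal numeric sketch (all parameter values are hypothetical; a monatomic ideal gas is assumed):

```python
import math

# Hypothetical parameters: 1 mol of a monatomic ideal gas
n, R = 1.0, 8.314            # mol, J/(mol*K)
Cv = 1.5 * R                 # molar heat capacity at constant volume
T1, V1 = 300.0, 0.010        # initial state (K, m^3)
T2, V2 = 600.0, 0.030        # final state

def dS_along(path, steps=100000):
    """Integrate dQ_rev/T = (n*Cv*dT + P*dV)/T with P = n*R*T/V (ideal gas).

    path(s) -> (T, V) for s in [0, 1]."""
    total = 0.0
    T_prev, V_prev = path(0.0)
    for i in range(1, steps + 1):
        T, V = path(i / steps)
        Tm, Vm = 0.5 * (T + T_prev), 0.5 * (V + V_prev)   # midpoint values
        dQ = n * Cv * (T - T_prev) + (n * R * Tm / Vm) * (V - V_prev)
        total += dQ / Tm
        T_prev, V_prev = T, V
    return total

# Path A: isothermal expansion at T1, then isochoric heating to T2
def path_a(s):
    if s <= 0.5:
        return T1, V1 + 2 * s * (V2 - V1)
    return T1 + (2 * s - 1) * (T2 - T1), V2

# Path B: straight line in the (T, V) plane
def path_b(s):
    return T1 + s * (T2 - T1), V1 + s * (V2 - V1)

# Closed form for the ideal gas: n*Cv*ln(T2/T1) + n*R*ln(V2/V1)
exact = n * Cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)
```

Both numeric integrals agree with the closed form up to discretization error, which is exactly the ideal-gas state-function property referred to above; the dispute in this post concerns extending that property beyond the ideal gas.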

http://mutuslab.cs.uwindsor.ca/schurko/introphyschem/lectures/240_l10.pdf

"Carnot Cycles: S is a State Function. Any reversible cycle can be thought of as a collection of Carnot cycles - this approximation becomes exact as cycles become infinitesimal. Entropy change around an individual cycle is zero. Sum of entropy changes over all cycles is zero."

http://ronispc.chem.mcgill.ca/ronis/chem213/hnd8.pdf

"Entropy Changes in Arbitrary Cycles. What if we have a process which occurs in a cycle other than the Carnot cycle, e.g., the cycle depicted in Fig. 3. If entropy is a state function, cyclic integral of dS = 0, no matter what the nature of the cycle. In order to see that this is true, break up the cycle into sub-cycles, each of which is a Carnot cycle, as shown in Fig. 3. If we apply Eq. (7) to each piece, and add the results, we get zero for the sum."

The assumption on which "Entropy is a state function" rests - that any cycle can be subdivided into small Carnot cycles - is obviously false. An isothermal cycle CANNOT be subdivided into small Carnot cycles: a Carnot cycle requires two isotherms at different temperatures, and an isothermal cycle offers only one. A cycle involving the action of conservative forces CANNOT be subdivided into small Carnot cycles either.

Conclusion: The belief that the entropy is a state function is totally unjustified. The part of thermodynamics based on the entropy concept is not even wrong.

https://en.wikipedia.org/wiki/History_of_entropy

"My greatest concern was what to call it. I thought of calling it 'information', but the word was overly used, so I decided to call it 'uncertainty'. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons: In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage."

https://www.amazon.com/Tragicomical-Thermodynamics-1822-1854-Mathematics-Physical/dp/1461394465

Clifford Truesdell, The Tragicomical History of Thermodynamics, 1822-1854, p. 6: "Finally, I confess to a heartfelt hope - very slender but tough - that even some thermodynamicists of the old tribe will study this book, master the contents, and so share in my discovery: Thermodynamics need never have been the Dismal Swamp of Obscurity that from the first it was and that today in common instruction it is; in consequence, it need not so remain." [...] p. 333: "Clausius' verbal statement of the "Second Law" makes no sense, for "some other change connected therewith" introduces two new and unexplained concepts: "other change" and "connection" of changes. Neither of these finds any place in Clausius' formal structure. All that remains is a Mosaic prohibition. A century of philosophers and journalists have acclaimed this commandment; a century of mathematicians have shuddered and averted their eyes from the unclean."

http://philsci-archive.pitt.edu/313/1/engtot.pdf

Jos Uffink, Bluff your way in the Second Law of Thermodynamics: "I therefore argue for the view that the second law has nothing to do with the arrow of time. [...] Before one can claim that acquaintance with the Second Law is as indispensable to a cultural education as Macbeth or Hamlet, it should obviously be clear what this law states. This question is surprisingly difficult. The Second Law made its appearance in physics around 1850, but a half century later it was already surrounded by so much confusion that the British Association for the Advancement of Science decided to appoint a special committee with the task of providing clarity about the meaning of this law. However, its final report (Bryan 1891) did not settle the issue. Half a century later, the physicist/philosopher Bridgman still complained that there are almost as many formulations of the second law as there have been discussions of it. And even today, the Second Law remains so obscure that it continues to attract new efforts at clarification."

Pentcho Valev

"Max Planck wrote that the phrase 'entropy of the universe' has no meaning because it admits of no accurate definition. More recently, Grandy writes: "It is rather presumptuous to speak of the entropy of a universe about which we still understand so little, and we wonder how one might define thermodynamic entropy for a universe and its major constituents that have never been in equilibrium in their entire existence." According to Tisza: "If an isolated system is not in equilibrium, we cannot associate an entropy with it." Buchdahl writes of "the entirely unjustifiable assumption that the universe can be treated as a closed thermodynamic system". According to Gallavotti: "... there is no universally accepted notion of entropy for systems out of equilibrium, even when in a stationary state." Discussing the question of entropy for non-equilibrium states in general, Lieb and Yngvason express their opinion as follows: "Despite the fact that most physicists believe in such a nonequilibrium entropy, it has so far proved impossible to define it in a clearly satisfactory way." https://en.wikipedia.org/wiki/Heat_death_of_the_universe

Unfortunately, even for equilibrium systems, statements involving the term "entropy" make no sense (are not even wrong). The reason is that the entropy is not a state function. If you define the entropy S as a quantity that obeys the equation dS=dQrev/T, you will find that, so defined, the entropy is a STATE FUNCTION FOR AN IDEAL GAS. Clausius was very impressed by this statefunctionness and decided to prove that the entropy (so defined) is a state function for ANY system. So "Entropy is a state function" became a fundamental theorem in thermodynamics. Clausius deduced it from the assumption that any cycle can be disintegrated into small Carnot cycles, and nowadays this deduction remains the only justification of "Entropy is a state function":

http://mutuslab.cs.uwindsor.ca/schurko/introphyschem/lectures/240_l10.pdf

"Carnot Cycles: S is a State Function. Any reversible cycle can be thought of as a collection of Carnot cycles - this approximation becomes exact as cycles become infinitesimal. Entropy change around an individual cycle is zero. Sum of entropy changes over all cycles is zero."

http://ronispc.chem.mcgill.ca/ronis/chem213/hnd8.pdf

"Entropy Changes in Arbitrary Cycles. What if we have a process which occurs in a cycle other than the Carnot cycle, e.g., the cycle depicted in Fig. 3. If entropy is a state function, cyclic integral of dS = 0, no matter what the nature of the cycle. In order to see that this is true, break up the cycle into sub-cycles, each of which is a Carnot cycle, as shown in Fig. 3. If we apply Eq. (7) to each piece, and add the results, we get zero for the sum."

The assumption on which "Entropy is a state function" is based - that any cycle can be subdivided into small Carnot cycles - is obviously false. An isothermal cycle CANNOT be subdivided into small Carnot cycles. A cycle involving the action of conservative forces CANNOT be subdivided into small Carnot cycles.

Conclusion: The belief that the entropy is a state function is totally unjustified. The part of thermodynamics based on the entropy concept is not even wrong.

https://en.wikipedia.org/wiki/History_of_entropy

"My greatest concern was what to call it. I thought of calling it 'information', but the word was overly used, so I decided to call it 'uncertainty'. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons: In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage."

https://www.amazon.com/Tragicomical-Thermodynamics-1822-1854-Mathematics-Physical/dp/1461394465

Clifford Truesdell, The Tragicomical History of Thermodynamics, 1822-1854, p. 6: "Finally, I confess to a heartfelt hope - very slender but tough - that even some thermodynamicists of the old tribe will study this book, master the contents, and so share in my discovery: Thermodynamics need never have been the Dismal Swamp of Obscurity that from the first it was and that today in common instruction it is; in consequence, it need not so remain." [...] p. 333: "Clausius' verbal statement of the "Second Law" makes no sense, for "some other change connected therewith" introduces two new and unexplained concepts: "other change" and "connection" of changes. Neither of these finds any place in Clausius' formal structure. All that remains is a Mosaic prohibition. A century of philosophers and journalists have acclaimed this commandment; a century of mathematicians have shuddered and averted their eyes from the unclean."

http://philsci-archive.pitt.edu/313/1/engtot.pdf

Jos Uffink, Bluff your way in the Second Law of Thermodynamics: "I therefore argue for the view that the second law has nothing to do with the arrow of time. [...] Before one can claim that acquaintance with the Second Law is as indispensable to a cultural education as Macbeth or Hamlet, it should obviously be clear what this law states. This question is surprisingly difficult. The Second Law made its appearance in physics around 1850, but a half century later it was already surrounded by so much confusion that the British Association for the Advancement of Science decided to appoint a special committee with the task of providing clarity about the meaning of this law. However, its final report (Bryan 1891) did not settle the issue. Half a century later, the physicist/philosopher Bridgman still complained that there are almost as many formulations of the second law as there have been discussions of it. And even today, the Second Law remains so obscure that it continues to attract new efforts at clarification."

Pentcho Valev