Entropy Is Nonsense in the Presence of Conservative Forces
Pentcho Valev
2017-08-10 12:13:31 UTC
Brian Cox explains the concept of entropy:

Brian Cox explains why time travels in one direction - Wonders of the Universe - BBC Two

Note the implicit analogy between the sand grains and ideal gas molecules - in both cases the introduction of the entropy concept requires that interactions between particles be neglected. This is suggestive. If you define the entropy S as a quantity that obeys the equation dS = dQrev/T, you will find that, so defined, the entropy is a state function FOR AN IDEAL GAS. Clausius was very impressed by this state-function property and set out to prove that the entropy (so defined) is a state function for ANY system. So "Entropy is a state function" became a fundamental theorem in thermodynamics. Clausius deduced it from the assumption that any cycle can be decomposed into small Carnot cycles, and nowadays this deduction remains the only justification of "Entropy is a state function":
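For the ideal-gas case at least, the state-function claim can be checked numerically: the sketch below (a minimal illustration; the gas, states and heat capacities are arbitrary assumptions of mine) integrates dQrev/T along two different reversible paths between the same two states and gets the same entropy change.

```python
import math

R = 8.314        # J/(mol K), molar gas constant
Cv = 1.5 * R     # molar heat capacity at constant V, monatomic ideal gas (assumption)
Cp = Cv + R      # molar heat capacity at constant P
n = 1.0          # moles

def integrate(f, a, b, steps=20000):
    """Midpoint-rule approximation of the integral of f from a to b."""
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

T1, V1 = 300.0, 0.010    # initial state (K, m^3) - arbitrary choice
T2, V2 = 600.0, 0.030    # final state

# Path A: isochoric heating at V1 (dQ = n*Cv*dT), then
#         isothermal expansion at T2 (dQ = dW = n*R*T*dV/V, so dQ/T = n*R*dV/V)
dS_A = integrate(lambda T: n * Cv / T, T1, T2) + \
       integrate(lambda V: n * R / V, V1, V2)

# Path B: isobaric expansion V1 -> V2 (T rises to T1*V2/V1, dQ = n*Cp*dT),
#         then isochoric cooling at V2 back down to T2
Tmid = T1 * V2 / V1
dS_B = integrate(lambda T: n * Cp / T, T1, Tmid) + \
       integrate(lambda T: n * Cv / T, Tmid, T2)

print(dS_A, dS_B)   # the two reversible paths give the same entropy change
```

Both integrals reproduce the closed-form result n*Cv*ln(T2/T1) + n*R*ln(V2/V1), which is exactly what "state function for an ideal gas" means here.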

"Carnot Cycles: S is a State Function. Any reversible cycle can be thought of as a collection of Carnot cycles - this approximation becomes exact as cycles become infinitesimal. Entropy change around an individual cycle is zero. Sum of entropy changes over all cycles is zero."

"Entropy Changes in Arbitrary Cycles. What if we have a process which occurs in a cycle other than the Carnot cycle, e.g., the cycle depicted in Fig. 3. If entropy is a state function, cyclic integral of dS = 0, no matter what the nature of the cycle. In order to see that this is true, break up the cycle into sub-cycles, each of which is a Carnot cycle, as shown in Fig. 3. If we apply Eq. (7) to each piece, and add the results, we get zero for the sum." http://ronispc.chem.mcgill.ca/ronis/chem213/hnd8.pdf
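The handout's claim that the entropy change around an individual Carnot cycle is zero can at least be verified for an ideal working gas; here is a minimal sketch (temperatures, volumes and the choice of a monatomic gas are assumptions of mine, not from the handout):

```python
import math

R = 8.314            # J/(mol K), molar gas constant
gamma = 5.0 / 3.0    # heat capacity ratio, monatomic ideal gas (assumption)
n = 1.0              # moles

Th, Tc = 500.0, 300.0        # hot and cold reservoir temperatures (K)
Va, Vb = 0.010, 0.025        # volumes at the ends of the hot isotherm (m^3)

# The adiabats obey T * V**(gamma - 1) = const, which fixes the
# remaining two corner volumes of the cycle:
Vc = Vb * (Th / Tc) ** (1.0 / (gamma - 1.0))   # end of adiabatic expansion
Vd = Va * (Th / Tc) ** (1.0 / (gamma - 1.0))   # end of isothermal compression

# Heat is exchanged only on the isotherms (adiabats contribute nothing to dQ/T):
dS_hot  = n * R * math.log(Vb / Va)   # Qh / Th on the hot isotherm
dS_cold = n * R * math.log(Vd / Vc)   # Qc / Tc on the cold isotherm (negative)

print(dS_hot + dS_cold)   # ~ 0: entropy change around the Carnot cycle vanishes
```

The cancellation follows because the adiabat relation forces Vd/Vc = Va/Vb, so the two logarithms are equal and opposite - which is the "entropy change around an individual cycle is zero" step in the quoted argument.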

The assumption on which "Entropy is a state function" is based - that any cycle can be subdivided into small Carnot cycles - is obviously false. A cycle involving the action of conservative forces CANNOT be subdivided into small Carnot cycles.

Conclusion: The belief that the entropy is a state function is totally unjustified. Any time scientists use the term "entropy", they don't know what they are talking about.

Claude Shannon (on how his uncertainty function got its name): "My greatest concern was what to call it. I thought of calling it 'information', but the word was overly used, so I decided to call it 'uncertainty'. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons: In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.'"

Jos Uffink, Bluff your way in the Second Law of Thermodynamics: "I therefore argue for the view that the second law has nothing to do with the arrow of time. [...] This summary leads to the question whether it is fruitful to see irreversibility or time-asymmetry as the essence of the second law. Is it not more straightforward, in view of the unargued statements of Kelvin, the bold claims of Clausius and the strained attempts of Planck, to give up this idea? I believe that Ehrenfest-Afanassjewa was right in her verdict that the discussion about the arrow of time as expressed in the second law of the thermodynamics is actually a RED HERRING." http://philsci-archive.pitt.edu/313/1/engtot.pdf

Pentcho Valev
Pentcho Valev
2017-08-10 16:02:43 UTC
Clausius' assumption that any cycle can be dissected into small Carnot cycles is so silly that many thermodynamicists ignore it and just DEFINE the entropy as a state function, much as Steve Carlip defines the speed of light to be constant:

Steve Carlip: "Is c, the speed of light in vacuum, constant? At the 1983 Conference Generale des Poids et Mesures, the following SI (Systeme International) definition of the metre was adopted: The metre is the length of the path travelled by light in vacuum during a time interval of 1/299 792 458 of a second. This defines the speed of light in vacuum to be exactly 299,792,458 m/s. This provides a very short answer to the question "Is c constant": Yes, c is constant by definition!"

Technically, both lies - "Entropy is a state function" and "The speed of light is constant" - are easy to disprove, but sociologically this is virtually impossible. A century (and more) of repetition has converted the lies into absolute truths - they are now inherent in the culture of our civilization.

Pentcho Valev
Pentcho Valev
2017-08-11 00:07:30 UTC
The version of the second law of thermodynamics stated as "Entropy always increases" (a version which, according to A. Eddington, holds "the supreme position among the laws of Nature") is in fact a theorem deduced by Clausius in 1865:

Jos Uffink, Bluff your Way in the Second Law of Thermodynamics, p. 37: "Hence we obtain: THE ENTROPY PRINCIPLE (Clausius' version) For every nicht umkehrbar [irreversible] process in an adiabatically isolated system which begins and ends in an equilibrium state, the entropy of the final state is greater than or equal to that of the initial state. For every umkehrbar [reversible] process in an adiabatical system, the entropy of the final state is equal to that of the initial state." http://philsci-archive.pitt.edu/archive/00000313/

Clausius' deduction was based on three postulates:

Postulate 1 (implicit): The entropy is a state function.

Postulate 2: Clausius' inequality (formula 10 on p. 33 in Uffink's paper) is correct.

Postulate 3: Any irreversible process can be closed by a reversible process to become a cycle.
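In modern notation, the way the three postulates are supposed to combine can be sketched as follows (whether this matches formula 10 on p. 33 of Uffink's paper exactly is an assumption on my part):

```latex
% Postulate 2 (Clausius' inequality): for any cycle,
\oint \frac{\delta Q}{T} \le 0 \qquad \text{(equality for reversible cycles)}

% Postulate 3: close an irreversible process $i \to f$ with a reversible
% return path $f \to i$ to form a cycle:
\int_i^f \frac{\delta Q}{T} \;+\; \int_f^i \frac{\delta Q_{\mathrm{rev}}}{T} \;\le\; 0

% Postulate 1: entropy is a state function with $dS = \delta Q_{\mathrm{rev}}/T$, so
\int_f^i \frac{\delta Q_{\mathrm{rev}}}{T} = S(i) - S(f)
\qquad\Longrightarrow\qquad
S(f) - S(i) \;\ge\; \int_i^f \frac{\delta Q}{T}

% For an adiabatically isolated system ($\delta Q = 0$) this yields
% Clausius' entropy principle:
S(f) \ge S(i)
```

Each step leans on exactly one of the three postulates, which is why the deduction collapses if any of them fails.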

All three postulates remain totally unjustified even today; Postulate 3 is almost obviously false:

Uffink, p.39: "A more important objection, it seems to me, is that Clausius bases his conclusion that the entropy increases in a nicht umkehrbar [irreversible] process on the assumption that such a process can be closed by an umkehrbar [reversible] process to become a cycle. This is essential for the definition of the entropy difference between the initial and final states. But the assumption is far from obvious for a system more complex than an ideal gas, or for states far from equilibrium, or for processes other than the simple exchange of heat and work. Thus, the generalisation to all transformations occurring in Nature is somewhat rash."

Note that, even if Clausius' theorem were correct (it is not), it only holds for "an adiabatically isolated system which begins and ends in an equilibrium state". This means that all applications of "Entropy always increases" to processes which do not begin and end in equilibrium would be unjustified even if the theorem were correct!

Thermodynamics is even more idiotic than Einstein's relativity but, being long dead, is less detrimental:

Clifford Truesdell, The Tragicomical History of Thermodynamics, 1822-1854, p. 6: "Finally, I confess to a heartfelt hope - very slender but tough - that even some thermodynamicists of the old tribe will study this book, master the contents, and so share in my discovery: Thermodynamics need never have been the Dismal Swamp of Obscurity that from the first it was and that today in common instruction it is; in consequence, it need not so remain." [...] p. 333: "Clausius' verbal statement of the "Second Law" makes no sense, for "some other change connected therewith" introduces two new and unexplained concepts: "other change" and "connection" of changes. Neither of these finds any place in Clausius' formal structure. All that remains is a Mosaic prohibition. A century of philosophers and journalists have acclaimed this commandment; a century of mathematicians have shuddered and averted their eyes from the unclean." https://www.amazon.com/Tragicomical-Thermodynamics-1822-1854-Mathematics-Physical/dp/1461394465

Pentcho Valev