The Concept of Entropy: Ubiquitous and ... Not Even Wrong
Pentcho Valev
2017-05-18 07:16:13 UTC
The thermodynamic (Clausius's) entropy has been defined for systems in equilibrium. Yet in the overwhelming majority of cases it is applied to non-equilibrium, and even ill-defined, systems and processes (black holes, the Big Bang, etc.), which means that the respective interpretations make no sense (are not even wrong):

"Max Planck wrote that the phrase 'entropy of the universe' has no meaning because it admits of no accurate definition. More recently, Grandy writes: "It is rather presumptuous to speak of the entropy of a universe about which we still understand so little, and we wonder how one might define thermodynamic entropy for a universe and its major constituents that have never been in equilibrium in their entire existence." According to Tisza: "If an isolated system is not in equilibrium, we cannot associate an entropy with it." Buchdahl writes of "the entirely unjustifiable assumption that the universe can be treated as a closed thermodynamic system". According to Gallavotti: "... there is no universally accepted notion of entropy for systems out of equilibrium, even when in a stationary state." Discussing the question of entropy for non-equilibrium states in general, Lieb and Yngvason express their opinion as follows: "Despite the fact that most physicists believe in such a nonequilibrium entropy, it has so far proved impossible to define it in a clearly satisfactory way." https://en.wikipedia.org/wiki/Heat_death_of_the_universe

Unfortunately, even for equilibrium systems, statements involving the term "entropy" make no sense (are not even wrong). The reason is that the entropy is not a state function. If you define the entropy S as a quantity that obeys the equation dS = dQrev/T, you will find that, so defined, the entropy is a STATE FUNCTION FOR AN IDEAL GAS. Clausius was so impressed by this state-function property that he decided to prove that the entropy (so defined) is a state function for ANY system. So "Entropy is a state function" became a fundamental theorem in thermodynamics. Clausius deduced it from the assumption that any cycle can be decomposed into small Carnot cycles, and to this day that deduction remains the only justification of "Entropy is a state function":
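The ideal-gas case conceded above can be checked directly. The sketch below is my own illustration, not part of the original argument; it assumes one mole of a monatomic ideal gas (Cv = 3R/2) and numerically integrates dS = dQrev/T = n*Cv*dT/T + n*R*dV/V along two different paths between the same two states, comparing both with the closed-form result ΔS = n*Cv*ln(T2/T1) + n*R*ln(V2/V1):

```python
import math

# Editor's sketch (assumption: 1 mol monatomic ideal gas, Cv = 3/2 R):
# check that the integral of dQ_rev/T between two states is path-independent.
R = 8.314
n, Cv = 1.0, 1.5 * R

def delta_S_path(points, steps=10000):
    """Integrate dS = n*Cv*dT/T + n*R*dV/V along straight segments
    in the (T, V) plane between the given points (midpoint rule)."""
    total = 0.0
    for (T1, V1), (T2, V2) in zip(points, points[1:]):
        for i in range(steps):
            f0, f1 = i / steps, (i + 1) / steps
            Ta, Va = T1 + f0 * (T2 - T1), V1 + f0 * (V2 - V1)
            Tb, Vb = T1 + f1 * (T2 - T1), V1 + f1 * (V2 - V1)
            Tm, Vm = 0.5 * (Ta + Tb), 0.5 * (Va + Vb)
            total += n * Cv * (Tb - Ta) / Tm + n * R * (Vb - Va) / Vm
    return total

A, B = (300.0, 0.01), (600.0, 0.03)
path1 = [A, (600.0, 0.01), B]   # heat at constant V, then expand at constant T
path2 = [A, (300.0, 0.03), B]   # expand at constant T, then heat at constant V
analytic = n * Cv * math.log(2) + n * R * math.log(3)
print(delta_S_path(path1), delta_S_path(path2), analytic)  # all ~ 17.78 J/K
```

Both routes give the same ΔS, which is all that "state function" means here; the dispute in this thread is whether the same holds beyond the ideal gas.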

"Carnot Cycles: S is a State Function. Any reversible cycle can be thought of as a collection of Carnot cycles - this approximation becomes exact as cycles become infinitesimal. Entropy change around an individual cycle is zero. Sum of entropy changes over all cycles is zero."

"Entropy Changes in Arbitrary Cycles. What if we have a process which occurs in a cycle other than the Carnot cycle, e.g., the cycle depicted in Fig. 3. If entropy is a state function, cyclic integral of dS = 0, no matter what the nature of the cycle. In order to see that this is true, break up the cycle into sub-cycles, each of which is a Carnot cycle, as shown in Fig. 3. If we apply Eq. (7) to each piece, and add the results, we get zero for the sum."
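The quoted claim that the entropy change around an individual Carnot cycle is zero can be verified for an ideal gas. A minimal sketch (my own, assuming a monatomic ideal gas with gamma = 5/3; the adiabatic legs obey T*V**(gamma-1) = const, which forces V3/V4 = V2/V1 and hence Qh/Th = Qc/Tc):

```python
import math

# Editor's sketch (assumption: 1 mol monatomic ideal gas, gamma = 5/3):
# entropy change around one ideal-gas Carnot cycle, Q_h/T_h - Q_c/T_c.
R, gamma = 8.314, 5.0 / 3.0
n = 1.0
Th, Tc = 500.0, 300.0
V1, V2 = 0.010, 0.025            # isothermal expansion at Th: V1 -> V2

# The adiabats T * V**(gamma-1) = const fix the cold-isotherm volumes.
k = (Th / Tc) ** (1.0 / (gamma - 1.0))
V3, V4 = V2 * k, V1 * k          # isothermal compression at Tc: V3 -> V4

Qh = n * R * Th * math.log(V2 / V1)   # heat absorbed at Th
Qc = n * R * Tc * math.log(V3 / V4)   # heat rejected at Tc
print(Qh / Th - Qc / Tc)              # ~ 0: cycle entropy change vanishes
```

This is the building block the quoted argument sums over; the objection raised below is about whether arbitrary cycles can in fact be built from such blocks.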

The assumption on which "Entropy is a state function" is based - that any cycle can be subdivided into small Carnot cycles - is obviously false. An isothermal cycle CANNOT be subdivided into small Carnot cycles. A cycle involving the action of conservative forces CANNOT be subdivided into small Carnot cycles.

Conclusion: The belief that the entropy is a state function is totally unjustified. The part of thermodynamics based on the entropy concept is not even wrong.

Claude Shannon: "My greatest concern was what to call it. I thought of calling it 'information', but the word was overly used, so I decided to call it 'uncertainty'. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons: In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.'"

Clifford Truesdell, The Tragicomical History of Thermodynamics, 1822-1854, p. 6: "Finally, I confess to a heartfelt hope - very slender but tough - that even some thermodynamicists of the old tribe will study this book, master the contents, and so share in my discovery: Thermodynamics need never have been the Dismal Swamp of Obscurity that from the first it was and that today in common instruction it is; in consequence, it need not so remain." [...] p. 333: "Clausius' verbal statement of the "Second Law" makes no sense, for "some other change connected therewith" introduces two new and unexplained concepts: "other change" and "connection" of changes. Neither of these finds any place in Clausius' formal structure. All that remains is a Mosaic prohibition. A century of philosophers and journalists have acclaimed this commandment; a century of mathematicians have shuddered and averted their eyes from the unclean."

Jos Uffink, Bluff your way in the Second Law of Thermodynamics: "I therefore argue for the view that the second law has nothing to do with the arrow of time. [...] Before one can claim that acquaintance with the Second Law is as indispensable to a cultural education as Macbeth or Hamlet, it should obviously be clear what this law states. This question is surprisingly difficult. The Second Law made its appearance in physics around 1850, but a half century later it was already surrounded by so much confusion that the British Association for the Advancement of Science decided to appoint a special committee with the task of providing clarity about the meaning of this law. However, its final report (Bryan 1891) did not settle the issue. Half a century later, the physicist/philosopher Bridgman still complained that there are almost as many formulations of the second law as there have been discussions of it. And even today, the Second Law remains so obscure that it continues to attract new efforts at clarification."

Pentcho Valev
Pentcho Valev
2017-05-18 22:08:58 UTC
Clausius's entropy is not even wrong - accordingly, the Bekenstein-Hawking entropy is not even not even wrong:

"The Bekenstein-Hawking entropy or black hole entropy is the amount of entropy that must be assigned to a black hole in order for it to comply with the laws of thermodynamics as they are interpreted by observers external to that black hole. [...] In ordinary thermodynamics the second law requires that the entropy of a closed system shall never decrease, and shall typically increase as a consequence of generic transformations. While this law may hold good for a system including a black hole, it is not informative in its original form. For example, if an ordinary system falls into a black hole, the ordinary entropy becomes invisible to an exterior observer, so from her viewpoint, saying that ordinary entropy increases does not provide any insight: the ordinary second law is transcended. Including the black hole entropy in the entropy ledger gives a more useful law, the generalized second law of thermodynamics (GSL) (Bekenstein 1972, 1973, 1974): the sum of ordinary entropy So outside black holes and the total black hole entropy never decreases and typically increases as a consequence of generic transformations of the black hole. In equations ΔSo + ΔSBH ≥ 0."
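For scale, a back-of-the-envelope sketch (my own, not from the quoted text; constants rounded) of the Bekenstein-Hawking entropy S_BH = k_B*c**3*A/(4*G*hbar) assigned to a solar-mass black hole, with horizon area A = 4*pi*(2GM/c**2)**2:

```python
import math

# Editor's sketch: order of magnitude of S_BH for one solar mass.
G    = 6.674e-11      # m^3 kg^-1 s^-2
c    = 2.998e8        # m/s
hbar = 1.055e-34      # J s
kB   = 1.381e-23      # J/K
M    = 1.989e30       # kg (one solar mass)

r_s  = 2 * G * M / c**2            # Schwarzschild radius, ~2950 m
A    = 4 * math.pi * r_s**2        # horizon area
S_BH = kB * c**3 * A / (4 * G * hbar)
print(S_BH)                        # ~1e54 J/K, i.e. ~1e77 in units of k_B
```

Whatever one makes of the GSL, this is the quantity ΔSBH in the inequality ΔSo + ΔSBH ≥ 0 quoted above.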

I am sure Stephen Hawking and Jacob Bekenstein never knew that the version of the second law of thermodynamics stated as "Entropy always increases" (which, according to A. Eddington, holds "the supreme position among the laws of Nature") is in fact a theorem deduced by Clausius in 1865:

Jos Uffink, Bluff your Way in the Second Law of Thermodynamics, p. 37: "Hence we obtain: THE ENTROPY PRINCIPLE (Clausius' version) For every nicht umkehrbar [irreversible] process in an adiabatically isolated system which begins and ends in an equilibrium state, the entropy of the final state is greater than or equal to that of the initial state. For every umkehrbar [reversible] process in an adiabatical system, the entropy of the final state is equal to that of the initial state."

Clausius' deduction was based on three postulates:

Postulate 1 (implicit): The entropy is a state function.

Postulate 2: Clausius' inequality (formula 10 on p. 33 in Uffink's paper) is correct.

Postulate 3: Any irreversible process can be closed by a reversible process to become a cycle.

All three postulates remain unproven to this day; Postulate 3 is almost obviously false:

Uffink, p.39: "A more important objection, it seems to me, is that Clausius bases his conclusion that the entropy increases in a nicht umkehrbar [irreversible] process on the assumption that such a process can be closed by an umkehrbar [reversible] process to become a cycle. This is essential for the definition of the entropy difference between the initial and final states. But the assumption is far from obvious for a system more complex than an ideal gas, or for states far from equilibrium, or for processes other than the simple exchange of heat and work. Thus, the generalisation to all transformations occurring in Nature is somewhat rash."
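Uffink's point can be made concrete in the one case where the closure is unproblematic. A minimal sketch (my own illustration, assuming an ideal gas): the free (Joule) expansion from V1 to V2 is irreversible, but it can be closed by a reversible isothermal compression back to V1, and it is that reversible leg, along which dQrev/T is well defined, that assigns the entropy difference to the irreversible step:

```python
import math

# Editor's sketch (assumption: 1 mol ideal gas at fixed T): Postulate 3
# in the simple case. The free expansion V1 -> V2 absorbs no heat; the
# closing reversible isothermal compression V2 -> V1 has
# Q_rev = n*R*T*ln(V1/V2) < 0, so the entropy assigned to the
# irreversible forward step is -Q_rev/T = n*R*ln(V2/V1).
R, n, T = 8.314, 1.0, 300.0
V1, V2 = 0.010, 0.030

Q_rev_closing = n * R * T * math.log(V1 / V2)   # heat along the closing leg
delta_S = -Q_rev_closing / T                     # = n*R*ln(V2/V1)
print(delta_S)                                   # ~ 9.13 J/K
```

Uffink's objection is precisely that nothing guarantees such a closing reversible process exists for systems more complex than this one.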

Note that, even if Clausius's theorem were correct (it is not), it only holds for "an adiabatically isolated system which begins and ends in an equilibrium state". Do processes Hawking and Bekenstein consider "begin and end in an equilibrium state"?

Pentcho Valev
Pentcho Valev
2017-05-19 15:35:13 UTC
The second law of thermodynamics is obviously false, but a century and a half of brainwashing has blinded scientists to the obvious. Systems violating the second law are commonplace. For instance, water placed in a strong electric field demonstrates perpetual motion able to produce an unlimited amount of work at the expense of heat absorbed from the surroundings (no other source of energy is conceivable):

Adam D. Wexler et al: "Two counter current flows are present in the bridge (figure 9b), one flowing over the other as they cross from one beaker to the other."

"The Formation of the Floating Water Bridge including electric breakdowns"

Scientists are fascinated by various features of the water-in-electric-field system but are totally unable to see the obvious fact that the work the flows can produce, e.g. by rotating a waterwheel and lifting weights, would be, in terms of its origin, converted ambient heat (no electric energy would be spent):

"Taking some more time to watch the bridge in action, one is stupefied by the complexity. The water movement is bidirectional, i.e., it simultaneously flows in both directions, and the shape and diameter of the bridge is constantly changing. The bridge also has a lot of strange optical properties like weak birefringence and mixed refractive index, which make it appear somewhat different than "ordinary" water, even to the naked eye! There is practically nothing ordinary about the water in an active floating bridge, and this is no esoteric experiment, as the strength and shape of the electric field we apply in the water bridge is nearly ubiquitous throughout nature. It turns out that if we examine the electric fields present in nature, such as those in living cells, around soil particles, or in clouds, we find that the field strengths are on the same order of magnitude - megavolts per meter. Which incidentally is the same in the water bridge, not to mention inside many electrochemical and biochemical fuel cells that are now being used to develop the next generation of resource recovery technologies. Megavolts per meter seems to be this kind of universal constant of field strength in aqueous systems. It's such an enticing observation that, during my defense, one of my opponents made a point to ask whether I thought this was just a coincidence or indicated some deeper truth that we are as yet unaware. Of course I had to answer the latter, as I am a firm believer that nature is quite deliberate in its construction and there are really no accidents." https://www.wetsus.nl/home/wetsus-news/more-than-just-a-party-trick-the-floating-water-bridge-holds-insight-into-nature-and-human-innovation/1

Pentcho Valev