Pentcho Valev

2017-02-10 08:59:36 UTC


I am stationary on Earth, you move past me at great speed and check your clocks against mine. What will you discover? Does time speed up or slow down for you?

According to Einstein's special relativity, time for you SPEEDS UP - you discover that my clocks are slow and your clocks are FAST. Yet Einsteinians teach the opposite:

Brian Greene: "If you're moving relative to somebody else, time for you slows down."

http://abcnews.go.com/Entertainment/back-future-30th-anniversary-neil-degrasse-tyson-talks/story?id=32191481

Neil deGrasse Tyson: "We have ways of moving into the future. That is to have time tick more slowly for you than others, who you return to later on. We've known that since 1905, Einstein's special theory of relativity, which gives the precise prescription for how time would slow down for you if you are set into motion."

https://cosmosmagazine.com/physical-sciences/five-ways-travel-through-time

"This is the easiest and most practical way to get to the far future - go really fast. According to Einstein's theory of special relativity, when you travel at speeds approaching the speed of light, time slows down for you relative to the outside world."

http://www.newscientist.com/article/mg13117878.000-a-special-theory-of-relativity.html

John Gribbin: "Einstein's special theory of relativity tells us how the Universe looks to an observer moving at a steady speed. Because the speed of light is the same for all such observers, moving clocks run slow..."

Brian Cox (2:25): "Moving clocks run slowly"

This is not an ordinary lie - in a sense, it is not a lie at all. It is an idiocy, and the sad thing is that the "truth" - time for you SPEEDS UP - is an idiocy as well. All consequences of Einstein's 1905 false constant-speed-of-light postulate, validly or invalidly deduced, are idiotic.
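For concreteness, the quantitative formula all the quoted statements rest on is the Lorentz factor, gamma = 1/sqrt(1 - v^2/c^2). Here is a minimal Python sketch of that textbook arithmetic; the speed 0.8c and the 10-second interval are illustrative assumptions, not taken from the sources:

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def lorentz_gamma(v):
    """Textbook Lorentz factor: gamma = 1/sqrt(1 - v^2/c^2)."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

# Illustrative numbers: a clock moving at 0.8c, watched for 10 s of the
# stationary observer's coordinate time.
gamma = lorentz_gamma(0.8 * C)          # equals 5/3 at v = 0.8c
elapsed_on_moving_clock = 10.0 / gamma  # what the formula assigns to the moving clock
print(gamma, elapsed_on_moving_clock)   # about 1.667 and 6.0 seconds
```

This is only the arithmetic the popularizers invoke, stated so the reader can see exactly which quantity is claimed to shrink.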

Einstein did not receive his Nobel prize for producing such idiocies. He received it for speculations involving the entropy concept. Were those speculations less idiotic?

Entropy is not a state function. This means that any statement involving the term "entropy" is not even wrong. If you define the entropy S as a quantity that obeys the equation dS = dQrev/T, you will find that, so defined, the entropy is a STATE FUNCTION FOR AN IDEAL GAS. Clausius was very impressed by this state-function property and decided to prove that the entropy (so defined) is a state function for ANY system. So "Entropy is a state function" became a fundamental theorem in thermodynamics. Clausius deduced it from the assumption that any cycle can be decomposed into small Carnot cycles, and nowadays this deduction remains the only justification of "Entropy is a state function":

http://mutuslab.cs.uwindsor.ca/schurko/introphyschem/lectures/240_l10.pdf

"Carnot Cycles: S is a State Function. Any reversible cycle can be thought of as a collection of Carnot cycles - this approximation becomes exact as cycles become infinitesimal. Entropy change around an individual cycle is zero. Sum of entropy changes over all cycles is zero."

http://ronispc.chem.mcgill.ca/ronis/chem213/hnd8.pdf

"Entropy Changes in Arbitrary Cycles. What if we have a process which occurs in a cycle other than the Carnot cycle, e.g., the cycle depicted in Fig. 3. If entropy is a state function, cyclic integral of dS = 0, no matter what the nature of the cycle. In order to see that this is true, break up the cycle into sub-cycles, each of which is a Carnot cycle, as shown in Fig. 3. If we apply Eq. (7) to each piece, and add the results, we get zero for the sum."
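The ideal-gas case is easy to check directly, without any appeal to Carnot cycles: with S defined through dS = dQrev/T, the ideal gas gives dS = nCv dT/T + nR dV/V, and integrating around any closed cycle yields zero. A minimal Python sketch; the monatomic gas and the rectangular cycle in the (T, V) plane are illustrative assumptions:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def delta_S_ideal(n, Cv, T1, V1, T2, V2):
    """Entropy change of n moles of ideal gas between states (T1, V1) and
    (T2, V2), integrating dS = dQrev/T = n*Cv*dT/T + n*R*dV/V."""
    return n * Cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)

# Rectangular cycle in the (T, V) plane: A -> B -> C -> D -> back to A
Cv = 1.5 * R  # monatomic ideal gas
states = [(300, 0.01), (400, 0.01), (400, 0.02), (300, 0.02), (300, 0.01)]
total = sum(delta_S_ideal(1.0, Cv, T1, V1, T2, V2)
            for (T1, V1), (T2, V2) in zip(states, states[1:]))
print(abs(total) < 1e-9)  # True: the entropy change around the closed cycle is zero
```

Note that this direct integration is exactly why the ideal-gas entropy comes out a state function; the question raised below is whether the Carnot-subdivision argument extends that result to arbitrary systems.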

The assumption on which "Entropy is a state function" is based - that any cycle can be subdivided into small Carnot cycles - is obviously false. An isothermal cycle CANNOT be subdivided into small Carnot cycles. A cycle involving the action of conservative forces CANNOT be subdivided into small Carnot cycles.

Conclusion: The belief that the entropy is a state function is totally unjustified. The part of thermodynamics based on the entropy concept is not even wrong.

https://en.wikipedia.org/wiki/History_of_entropy

"My greatest concern was what to call it. I thought of calling it 'information', but the word was overly used, so I decided to call it 'uncertainty'. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons: In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.'"

Pentcho Valev