Non-Deductive (Not Even Wrong) Models in Physics
Pentcho Valev
2017-06-02 10:42:15 UTC
"So what about the second criticism, that inflation is too flexible to be tested? It's true that while the idea behind inflation is simple, its parameters can be tweaked in seemingly endless ways... [...] In other words, the critics say, go out and measure almost anything and someone will say, "hey, that's evidence for inflation." Theories that can predict anything predict nothing. Inflation, they say, isn't science. [...] But supporters argue that this shows a fundamental misunderstanding of what inflation is. It's not a single model, they insist - it's a class of models, a sweeping principle, a paradigm from which individual models can be derived and then tested. The key is to figure out which model of inflation is right - if any - and not to prove or falsify all of them all in one fell swoop. "Each model makes specific predictions, and can be tested with precision by the traditional methods of empirical science," says Guth, now at MIT." http://nautil.us/issue/48/chaos/the-inflated-debate-over-cosmic-inflation

My comment in Nautilus:

Does Guth not realize that his counterargument is ridiculous and that he actually confirms the critics' thesis? Strange.

By the way, tweaking the parameters in endless ways until you get anything you want is only possible for non-deductive models - e.g. Einstein's general relativity. (Special relativity is deductive so tweaking is unthinkable - you cannot introduce changes not deducible from the postulates.) Non-deductive models in physics are essentially equivalent to the "empirical models" defined here:

http://collum.chem.cornell.edu/documents/Intro_Curve_Fitting.pdf
"The objective of curve fitting is to theoretically describe experimental data with a model (function or equation) and to find the parameters associated with this model. Models of primary importance to us are mechanistic models. Mechanistic models are specifically formulated to provide insight into a chemical, biological, or physical process that is thought to govern the phenomenon under study. Parameters derived from mechanistic models are quantitative estimates of real system properties (rate constants, dissociation constants, catalytic velocities etc.). It is important to distinguish mechanistic models from empirical models that are mathematical functions formulated to fit a particular curve but whose parameters do not necessarily correspond to a biological, chemical or physical property."

Pentcho Valev
Pentcho Valev
2017-06-02 13:10:38 UTC
The problem of non-deductiveness is known as the problem of unfalsifiability:

Sabine Hossenfelder: "Many of my colleagues believe this forest of theories will eventually be chopped down by data. But in the foundations of physics it has become extremely rare for any model to be ruled out. The accepted practice is instead to adjust the model so that it continues to agree with the lack of empirical support."

"But inflation's biggest crime was its flexibility: The authors argue that inflation contains so many hypotheses that you can essentially fit at least one of them around any new data that comes out. In short, inflation can never be disproved. People studying it, therefore, are Not Doing Science." https://www.wired.com/2017/05/physicists-cant-agree-science-even-means-anymore/

"This year, debates in physics circles took a worrying turn. Faced with difficulties in applying fundamental theories to the observed Universe, some researchers called for a change in how theoretical physics is done. They began to argue - explicitly - that if a theory is sufficiently elegant and explanatory, it need not be tested experimentally, breaking with centuries of philosophical tradition of defining scientific knowledge as empirical." http://www.nature.com/news/scientific-method-defend-the-integrity-of-physics-1.16535

"Do physicists need empirical evidence to confirm their theories? You may think that the answer is an obvious yes, experimental confirmation being the very heart of science. But a growing controversy at the frontiers of physics and cosmology suggests that the situation is not so simple. [...] ...a mounting concern in fundamental physics: Today, our most ambitious science can seem at odds with the empirical methodology that has historically given the field its credibility." http://www.nytimes.com/2015/06/07/opinion/a-crisis-at-the-edge-of-physics.html

"In recent years, however, many physicists have developed theories of great mathematical elegance, but which are beyond the reach of empirical falsification, even in principle. The uncomfortable question that arises is whether they can still be regarded as science. Some scientists are proposing that the definition of what is "scientific" be loosened, while others fear that to do so could open the door for pseudo-scientists or charlatans to mislead the public and claim equal space for their views." http://www.prospectmagazine.co.uk/features/what-happens-when-we-cant-test-scientific-theories

Sabine Hossenfelder: "The problem that nobody seems to want to talk about is that rather than trying to find a minimal model that explains the data and leave it at this, there are many hundreds of models for inflation all of which are almost certainly wrong because they contain too many details that aren't supported by data. As the philosophers have it, these models are severely underdetermined. Theoretical physicists produce these models literally because they can make money with it. They make money with it by getting them published and then using the publications to claim it's relevant research so it'll get funded and they can hire more postdocs to crunch out more papers. It's the same reason why theorists invent dark matter particles and extensions of the standard model. It's a way to make a living." http://www.math.columbia.edu/~woit/wordpress/?p=9349

Pentcho Valev
Pentcho Valev
2017-06-02 23:10:57 UTC
Lubos Motl: "Quantum mechanics is another example of deductive reasoning. [...] Only the implications "IF... THEN..." are guaranteed to hold according to the quantum mechanical laws of physics." http://motls.blogspot.bg/2017/05/quantum-mechanics-is-another-example-of.html

Motl avoids any mention of Einstein's relativity - perhaps he now knows that general relativity is not deductive. And the only alternative to a deductive theory is an empirical concoction (in the sense Einstein defines below) - a "theory" that is not even wrong and is therefore unfalsifiable. Einstein clearly explains the difference between the deductive and the empirical approach here:

https://www.marxists.org/reference/archive/einstein/works/1910s/relative/ap03.htm
Albert Einstein: "From a systematic theoretical point of view, we may imagine the process of evolution of an empirical science to be a continuous process of induction. Theories are evolved and are expressed in short compass as statements of a large number of individual observations in the form of empirical laws, from which the general laws can be ascertained by comparison. Regarded in this way, the development of a science bears some resemblance to the compilation of a classified catalogue. It is, as it were, a purely empirical enterprise. But this point of view by no means embraces the whole of the actual process; for it slurs over the important part played by intuition and deductive thought in the development of an exact science. As soon as a science has emerged from its initial stages, theoretical advances are no longer achieved merely by a process of arrangement. Guided by empirical data, the investigator rather develops a system of thought which, in general, is built up logically from a small number of fundamental assumptions, the so-called axioms."

Special relativity was indeed "built up logically from a small number of fundamental assumptions", that is, it was deductive (even though a false postulate and an invalid argument spoiled it from the very beginning), but general relativity was, to use Einstein's words, "a purely empirical enterprise". Einstein and his mathematical friends changed and fudged equations countless times until "a classified catalogue" was compiled in which results known in advance and pet assumptions (such as Mercury's perihelion precession, the equivalence principle, gravitational time dilation) coexisted in an apparently consistent manner. Being an empirical concoction, general relativity allows Einsteinians to introduce, change and withdraw fudge factors until the "theory" manages to predict anything they want. Then the prediction turns out to be confirmed by observations (surprise, surprise).

So, as far as the unfalsifiability problem in theoretical physics is concerned, THE ROOT OF THE MALIGNANCY IS GENERAL RELATIVITY. Here are the metastases:

"Almost 40 years after their inception, inflation and string theory are in worse shape than ever. The persistence of these unfalsifiable and hence unscientific theories is an embarrassment that risks damaging science's reputation at a time when science can ill afford it. Isn't it time to pull the plug?" https://blogs.scientificamerican.com/cross-check/is-a-popular-theory-of-cosmic-creation-pseudoscience/

"This year, debates in physics circles took a worrying turn. Faced with difficulties in applying fundamental theories to the observed Universe, some researchers called for a change in how theoretical physics is done. They began to argue - explicitly - that if a theory is sufficiently elegant and explanatory, it need not be tested experimentally, breaking with centuries of philosophical tradition of defining scientific knowledge as empirical." http://www.nature.com/news/scientific-method-defend-the-integrity-of-physics-1.16535

"Do physicists need empirical evidence to confirm their theories? You may think that the answer is an obvious yes, experimental confirmation being the very heart of science. But a growing controversy at the frontiers of physics and cosmology suggests that the situation is not so simple. [...] ...a mounting concern in fundamental physics: Today, our most ambitious science can seem at odds with the empirical methodology that has historically given the field its credibility." http://www.nytimes.com/2015/06/07/opinion/a-crisis-at-the-edge-of-physics.html

"In recent years, however, many physicists have developed theories of great mathematical elegance, but which are beyond the reach of empirical falsification, even in principle. The uncomfortable question that arises is whether they can still be regarded as science. Some scientists are proposing that the definition of what is "scientific" be loosened, while others fear that to do so could open the door for pseudo-scientists or charlatans to mislead the public and claim equal space for their views." http://www.prospectmagazine.co.uk/features/what-happens-when-we-cant-test-scientific-theories

The fudge-factor activity is inglorious and Einsteinians don't discuss it openly, but sometimes the truth comes out inadvertently. Thus conventional dark matter models based on general relativity "need four free parameters to be adjusted to explain the data" (how many fudge factors the LIGO conspirators needed in order to model the nonexistent gravitational waves is a deep mystery):

https://www.newscientist.com/article/2116446-first-test-of-rival-to-einsteins-gravity-kills-off-dark-matter/
"Verlinde's calculations fit the new study's observations without resorting to free parameters – essentially values that can be tweaked at will to make theory and observation match. By contrast, says Brouwer, conventional dark matter models need four free parameters to be adjusted to explain the data."

Being an empirical concoction, Einstein's general relativity has no postulates:

https://www.quora.com/What-are-the-postulates-of-General-Relativity
What are the postulates of General Relativity? Alexander Poltorak, Adjunct Professor of Physics at the CCNY: "In 2005 I started writing a paper, "The Four Cornerstones of General Relativity on which it doesn't Rest." Unfortunately, I never had a chance to finish it. The idea behind that unfinished article was this: there are four principles that are often described as "postulates" of General Relativity:

1. Principle of general relativity

2. Principle of general covariance

3. Equivalence principle

4. Mach principle

The truth is, however, that General Relativity is not really based on any of these "postulates" although, without a doubt, they played important heuristic roles in the development of the theory." [end of quotation]

Essentially, general relativity is equivalent to the "empirical models" defined here (that is, it is as much a theory as they are):

http://collum.chem.cornell.edu/documents/Intro_Curve_Fitting.pdf
"The objective of curve fitting is to theoretically describe experimental data with a model (function or equation) and to find the parameters associated with this model. Models of primary importance to us are mechanistic models. Mechanistic models are specifically formulated to provide insight into a chemical, biological, or physical process that is thought to govern the phenomenon under study. Parameters derived from mechanistic models are quantitative estimates of real system properties (rate constants, dissociation constants, catalytic velocities etc.). It is important to distinguish mechanistic models from empirical models that are mathematical functions formulated to fit a particular curve but whose parameters do not necessarily correspond to a biological, chemical or physical property."

Here Michel Janssen describes the anti-deductive approach of Einstein and his mathematical friends - endlessly adjusting the model until "excellent agreement with observation" is reached:

https://netfiles.umn.edu/users/janss011/home%20page/EBms.pdf

Pentcho Valev
Pentcho Valev
2017-06-05 14:58:49 UTC