Sunday 7 September 2025

Crisis of Modern Physics: Split Realism vs Formalism

The crisis of modern physics, witnessed by many, manifests itself as a split between academic departments:

  • Physics: Instrumentalism/formalism/epistemology (what we can say).
  • Philosophy: Realism (what is).
A realist philosopher is not welcome at a physics department, and what would an instrumentalist physicist do at a philosophy department?

A split between physics and philosophy of physics indicates that something is fundamentally wrong, and that comes out as a crisis. What is then fundamentally wrong?

Let us search for the root of the trouble in the formation of modern physics at the beginning of the 20th century, in the new fields of Einstein's Special Theory of Relativity SR and Quantum Mechanics QM.

Both SR and QM express instrumentalism and formalism by being focussed on measurement under an assumed formal structure (Lorentz invariance and Hilbert space structure), where the real nature of physics is left open because it is believed to be hidden from inspection. The focus is thus on epistemology, as what a physicist can measure and report (to motivate public funding). This is the physics performed at the Large Hadron Collider at CERN in Geneva: very expensive, with the real physics hidden in a blip on a screen.

But the question of what physics is as ontology of reality remains, and paradoxically that is what philosophers of physics outside physics departments focus on (Reichenbach, Bell, Maudlin, Brown).  

So is there any hope to get out of the crisis by joining departments of physics and philosophy of physics into one?

Can SR and QM be reformulated into theories about reality, which start from real physics instead of formalism? 

Any theory about reality must start from some fundamental reality expressed in Postulates of the theory. If the Postulates carry no physics, a theory based on the Postulates using logic cannot carry any physics. 

The Postulates of SR are 
  1. Physical laws are Lorentz invariant.
  2. The speed of light is to be measured by physicists according to the SI standard, to give exactly the value 299,792,458 metres per second.
We see that the Postulates of SR are like commands to be followed by physicists, but say nothing precise about any physics. Therefore SR does not say anything about physics, unless physics is somehow added to the Postulates. And that is what Einstein did by using a "thought experiment" to conclude that two light signals viewed by two observers are in fact the same, and so must connect by a Lorentz transformation. But this conclusion of sameness had no physical basis and so was picked out of thin air, suddenly adding physics to the Postulates, but then physics without reality.
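
For reference, the standard Lorentz transformation connecting the coordinates $(x,t)$ and $(x^\prime ,t^\prime )$ of two observers in relative motion with velocity $v$ reads $x^\prime =\gamma (x-vt)$ and $t^\prime =\gamma (t-\frac{vx}{c^2})$ with $\gamma =\frac{1}{\sqrt{1-v^2/c^2}}$, under which the light signals $x=ct$ and $x^\prime =ct^\prime$ coincide.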

The Postulates of QM were formalised by the mathematician von Neumann into a set of abstract axioms:

  1. State space: A system corresponds to a Hilbert space. States are rays (or density operators) in it.
  2. Observables: Physical quantities are self-adjoint operators on the Hilbert space.
  3. Measurements: Outcomes are eigenvalues; probabilities are given by the Born rule.
  4. Dynamics: Time evolution is unitary, governed by the Schrödinger equation.

We see that 1-3 are like commands to quantum physicists, without concern for real physics. Von Neumann did this in the heyday of Hilbert's formalism in the 1930s, a program which soon died because of Gödel.
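
As a minimal numerical illustration of axioms 1-4 (my own sketch in Python/NumPy, not part of von Neumann's presentation), consider a two-level system:

  import numpy as np
  from scipy.linalg import expm

  # Axiom 1: state as a unit vector (ray) in a Hilbert space, here C^2.
  psi = np.array([1.0, 1.0j]) / np.sqrt(2.0)

  # Axiom 2: an observable as a self-adjoint operator, here Pauli Z.
  Z = np.array([[1.0, 0.0], [0.0, -1.0]])

  # Axiom 3: outcomes are eigenvalues; the Born rule gives their probabilities.
  eigvals, eigvecs = np.linalg.eigh(Z)
  probs = np.abs(eigvecs.conj().T @ psi) ** 2
  print(dict(zip(eigvals, probs)))          # eigenvalues -1 and +1, each with probability 1/2

  # Axiom 4: unitary evolution psi(t) = exp(-iHt) psi(0), with hbar = 1.
  H = np.array([[0.0, 1.0], [1.0, 0.0]])    # a Hamiltonian, here Pauli X
  psi_t = expm(-1j * H * 0.5) @ psi
  print(np.linalg.norm(psi_t))              # norm stays 1 under unitary evolution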

Altogether, we see that SR and QM are not realist theories starting from what is as ontology, but have clear qualities of formalism/epistemology as what we can say. The trouble with formalism is that there is no reality to decide the issue, and so the discussion can continue forever, as in medieval scholastics.

My contributions to a realist restart are: 

MMR starts from a reality where different observers use different coordinate systems and seeks what agreement can be reached.

RealQM starts from a classical realist continuum model of systems of charge densities in shared 3d Euclidean space interacting by Coulomb potentials, formulated as a new type of Schrödinger equation.

Both MMR and RealQM represent realism as what is and so express unification of physics and philosophy of physics.  

Here are three steps to formalism away from realism:
  • Planck introduces a smallest quantum of energy $h\nu$ in 1900.
  • Einstein introduces the photon as a quantum of light of energy $h\nu$ in 1905.
  • Heisenberg introduces QM as matrix mechanics in 1925.
In 1927 Schrödinger left QM because realism or "Anschaulichkeit" (visualisability) was lacking. Schrödinger's equation for the Hydrogen atom is a realist model, but for atoms with more than one electron it is a formalist model without physics.
 

Saturday 6 September 2025

Atmosphere as Air Conditioner Keeping Earth Temperature Constant

My journey into climate science started in 2010 with this analysis of black body radiation, leading to an analysis of the atmosphere of the Earth as a form of air conditioner keeping the Earth surface mean temperature stable under varying mean heating from the Sun. My work was published as two chapters of the (ground-breaking) book Slaying the Sky Dragon - Death of the Greenhouse Gas Theory.

The basic idea is that incoming energy to the Earth surface at 288 K, of about 160 W/m2 from the Sun, is transported to the mid troposphere at an altitude of 5 km at 255 K by a combination of H2O thermodynamics with phase change (evaporation/condensation) and a minor contribution of radiation, for further radiation to outer space at 0 K. The variation of incoming energy to the surface can depend on varying cloud cover. This is the scenario in tropical zones receiving most of the energy, with sunny mornings followed by thunderstorms in the afternoon.
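
As a consistency check (my own arithmetic, not part of the original analysis): the temperature drop from 288 K at the surface to 255 K at 5 km corresponds to a lapse rate of $\frac{288-255}{5}=6.6$ K/km, close to the observed mean tropospheric lapse rate of about 6.5 K/km.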

An increase of incoming energy to the surface is counterbalanced by more intense H2O thermodynamics keeping temperatures constant. Radiation then takes a passive role, staying constant under constant temperature.

This is like an air conditioner keeping a stable room temperature of 15 C with a constant outside temperature of 0 C under variable interior heating of the room, e.g. depending on the number of people in the room.

It also connects to the boiling of water on a stove keeping a stable boiling temperature of 100 C under varying energy input from the stove, with more vigorous boiling with phase change responding to increased input.

The Sky Dragon analysis above from 2010 was written after a very quick introduction to the so-called Greenhouse Effect, but I think it captures aspects valid also today.

Tropical climate: Rising hot humid air in the morning releasing heat to the atmosphere by condensation, effectively transporting heat energy from surface to atmosphere as a cooling air conditioner.

Boiling water kept at 100 C under heating from stove by evaporation.

The simplest model consists of heat conduction through a wall of thickness 1 with heat conductivity $\kappa$ and temperature $T(x)$ varying linearly from $T_0=1$ at $x=0$ to $T_1=0$ at $x=1$, with heat flux $Q=-\kappa \frac{dT}{dx}=\kappa$. An increase of $Q$ is balanced by an increase of $\kappa$ without changing $T(x)$, the increase of $\kappa$ representing more vigorous thermodynamics or boiling.
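
As a minimal computational sketch of this model (my own illustration in Python; the numbers are arbitrary):

  import numpy as np

  T0, T1, d = 1.0, 0.0, 1.0          # wall boundary temperatures and thickness
  x = np.linspace(0.0, d, 5)
  T = T0 + (T1 - T0) * x / d         # steady profile is linear, independent of kappa
  for kappa in [1.0, 2.0, 4.0]:      # larger kappa = more vigorous thermodynamics
      Q = kappa * (T0 - T1) / d      # heat flux carried through the wall
      print(f"kappa={kappa}: Q={Q}, T(x)={T}")

Doubling the heat input $Q$ is thus absorbed by doubling $\kappa$ while the temperature profile $T(x)$ stays the same.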

Friday 5 September 2025

Understanding OLR and DLR vs Radiance Measurement by CERES and Pyrgeometer

Outgoing Longwave Radiation OLR from the Top of the Atmosphere ToA is measured by a CERES satellite looking down on ToA, equipped with a sensor forming one end of a thermocouple with its other end kept at a steady temperature, generating a voltage scaling with the temperature difference between its ends.

The CERES instrument is calibrated by determining a gain factor from sensor temperature to radiance, by letting the instrument look at a black body of known temperature $T_B$ with assumed Planck radiation $\sigma T_B^4$ while recording the sensor temperature. With the gain factor so determined, the instrument reports radiance from ToA from a reading of sensor temperature. This is the simplest form of calibration, assuming linearity. A very primitive technique, where the details of the instrument do not matter. It is like measuring the intensity of rainfall using your hands to collect water, calibrated against a faucet. The accuracy is at best 1 W/m2 or 0.4%, of the same size as the estimated Earth Energy Imbalance EEI from CO2.
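
A minimal sketch of this one-point linear calibration (my own illustration in Python; the raw signal values are made up):

  SIGMA = 5.670374419e-8               # Stefan-Boltzmann constant, W/m2/K4

  # Calibration: view a blackbody of known temperature T_B, record raw signal.
  T_B = 300.0                          # K, calibration blackbody
  signal_cal = 2.17                    # raw sensor reading (made-up units)
  gain = SIGMA * T_B**4 / signal_cal   # W/m2 per unit of raw signal

  # Measurement: point at ToA and convert the raw signal to reported radiance.
  signal_scene = 1.13                  # raw reading viewing the scene (made up)
  print(gain * signal_scene)           # reported OLR, here about 239 W/m2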

A pyrgeometer measuring Downwelling Longwave Radiation DLR from the atmosphere to the Earth surface also uses a sensor as one end of a thermocouple with the other end kept at a base temperature, and also measures a voltage scaling with the temperature difference. The calibration is here different, because the outgoing radiation from the sensor can no longer be included in the calibration process, but has to be supplied through a Planck formula $\epsilon\sigma T^4$ with $T$ sensor temperature and $\epsilon$ sensor emissivity. The accuracy is at best 5 W/m2, again too big to detect global warming if present.

OLR and DLR are thus measured in a similar way but with different forms of calibration, the difference being that OLR faces empty space at 0 K, while DLR faces the Earth surface. The accuracy is not enough to decide the size of any global warming, although it is claimed that trends can be detected.

In both cases Planck's Law in the form $\sigma T^4$ is used, which in the case of DLR is incorrect, because the correct form is $\sigma (T^4-T_E^4)$ with $T_E$ the Earth surface temperature, expressing that net DLR is in fact negative.
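
In numbers (my own illustrative temperatures, $T=260$ K for the atmosphere and $T_E=288$ K for the surface):

  SIGMA = 5.670374419e-8            # Stefan-Boltzmann constant, W/m2/K4
  T, T_E = 260.0, 288.0             # K: atmospheric and Earth surface temperature
  print(SIGMA * T**4)               # about 259 W/m2, the conventional DLR
  print(SIGMA * (T**4 - T_E**4))    # about -131 W/m2: the net transfer is upward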

Summary: Measurements of OLR and DLR are made to detect global warming. The accuracy of the instruments is not good enough to detect any warming if present. DLR measurements can be questioned since an incorrect Planck Law is used. OLR and DLR are radiances, thus process variables, which fluctuate and as such are difficult to measure.

EEI is a cornerstone of global warming alarmism, and so measurement of EEI has become a prime task for instrument technology, which does not seem to have delivered. The effect of EEI on surface temperature is unknown and impossible to measure, and DLR is a misconception based on an incorrect form of Planck's Law.

ChatGPT on objective of CERES: 

CERES connects to global warming because it:

  • Measures the planetary energy balance directly at TOA.

  • Detects changes in OLR and OSR (reflected shortwave) caused by greenhouse gases, aerosols, clouds, and ice.

  • Provides the evidence that Earth is currently taking in more energy than it loses — the physical basis of global warming.

ChatGPT on the objective of measuring DLR:

  • Provide a direct measure of the atmosphere’s infrared emission to the surface, essential for closing the surface energy budget, quantifying the greenhouse effect, tracking climate change, and validating models.

We read that the objective of CERES is to support global warming alarmism by measuring and reporting EEI attributed to CO2. But the objective is not reached, because (i) the accuracy of the measurement is not better than 1 W/m2, which is the expected size of EEI, and (ii) attribution to CO2 is not credible because it is swamped by changes of cloud cover. We read that the objective of measuring DLR by a pyrgeometer is to quantify the greenhouse effect. Both cases amount to "chasing after wind" using "ghost detectors".


Thursday 4 September 2025

Abstract vs Concrete vs Computational Physics

The science of physics has over time changed nature from concrete/real to abstract/non-real, with the pillars of modern physics, Quantum Mechanics QM and General Relativity GR, reaching breathtaking levels of abstraction during the first half of the 20th century, culminating today in string theory in 11 spacetime dimensions beyond any reality.

Today, with powerful computers available at virtually no cost, there is a reverse trend in the form of computation, opening new capabilities of using theories of physics for practical purposes. Computation is a concrete process, and computational physics starts with a concrete mathematical model and not with an abstraction.

Let us compare Newtonian mechanics in concrete and abstract formulation. 

The concrete form consists of Newton's Law $F=ma$ connecting force $F$ to mass $m$ and acceleration $a=\frac{dv}{dt}$ with $v$ velocity and $t$ time. The evolution over time of any mechanical system (without viscous forces) can be computationally simulated by time-stepping Newton's Law. Concrete and general.
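
As a minimal sketch of such time-stepping (my own toy example, not from the post: a unit mass on a linear spring with force $F=-x$, stepped with symplectic Euler in Python):

  import numpy as np

  m, dt = 1.0, 0.01
  x, v = 1.0, 0.0                       # initial position and velocity
  for n in range(int(2 * np.pi / dt)):  # step through roughly one period
      F = -x                            # spring force on the unit mass
      v += dt * F / m                   # Newton's Law: dv/dt = F/m
      x += dt * v                       # kinematics: dx/dt = v
  print(x, v)                           # returns close to the start (1, 0)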

The abstract form states that a mechanical system evolves from $t=0$ to $t=T$ so that:

  • The action $A(v)=\int_0^T(K-V)\,dt$ is stationary,

where $K=m\frac{v^2}{2}$ is kinetic energy and $V$ is potential energy. The condition for stationarity in differential form then reads $m\frac{dv}{dt}=F$ with $F=-\nabla V$ the negative gradient of $V$, which is Newton's Law.
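
The step from stationarity to this differential form is a standard variation (spelled out here for completeness): for a variation $\delta x$ vanishing at $t=0$ and $t=T$, integration by parts gives $\delta A=\int_0^T(mv\,\delta v-V^\prime (x)\,\delta x)\,dt=-\int_0^T(m\frac{dv}{dt}+V^\prime (x))\,\delta x\,dt$, and requiring $\delta A=0$ for all such $\delta x$ gives $m\frac{dv}{dt}=-V^\prime (x)=F$.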

The difference between abstract and concrete is the same as characterising a local minimum of a function $f(x)$ over variation of a real variable $x$ at $x=\bar x$ by $f^\prime (\bar x)=0$ with $f^\prime =\frac{df}{dx}$. Minimisation is abstract in the sense that no computational method is implied other than comparing the value $f(x)$ for all $x$, which can take infinite work. On the other hand, there are many methods for computing a root of the equation $f^\prime (x)=0$.
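
For instance, Newton's method computes such a root in a few steps (my own toy example with $f(x)=(x-2)^2+1$):

  def fp(x):  return 2.0 * (x - 2.0)   # f'(x) for f(x) = (x-2)**2 + 1
  def fpp(x): return 2.0               # f''(x)

  x = 0.0
  for _ in range(10):
      x = x - fp(x) / fpp(x)           # Newton step on the equation f'(x) = 0
  print(x)                             # converges to the minimiser x = 2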

We thus see that the concrete formulation directly opens to computational solution, while the abstract formulation does not. The pendulum may thus swing back from abstract to concrete in a 21st century filled with computation.

But we still live in the era of QM and GR, which are both abstract and uncomputable. QM is based on an abstract multi-dimensional Schrödinger equation without real physical meaning, which is uncomputable because the wave function for $N$ electrons depends on $3N$ spatial coordinates, making the computational work grow exponentially with $N$. GR is based on Einstein's equation, with a condensed abstract formulation which, when written out for computation, shows itself to be uncomputable.

RealQM is a new form of quantum mechanics based on a concrete computable model. RealQM + Newton offers a unified concrete continuum model covering all scales which is computable. 

Ontology of physics (what is) is concrete, while epistemology of physics (what we can say) can be abstract. Computation can open ontology of physics to inspection and so feed epistemology of physics. Epistemology without ontology is empty.

Wednesday 3 September 2025

Is Measuring Temperature at Distance Possible and Useful?

Climate alarmism of global warming claims to be supported by measurement of the energy balance of Earth+atmosphere by instruments like pyrgeometers, bolometers and radiometers, with an accuracy of at best 1-2 W/m2 compared to a total of around 240 W/m2, and a projected total imbalance of 4 W/m2 as "radiative forcing" from doubling of atmospheric CO2 corresponding to a warming of 1 K.

The case for global warming may seem weak from these measurements, but nevertheless they serve to foster alarmism. 

To properly evaluate the measurements it is necessary to understand how these instruments are designed and how they operate. For a pyrgeometer or bolometer using a thermocouple as sensor, there are two fundamentally different views:

  1. A thermocouple essentially measures incoming radiance from a source as a process variable. 
  2. A thermocouple essentially measures a source temperature as a state variable.  
It is natural to make a comparison in terms of a bank account:
  1. Difference between deposits and withdrawals as process variable.
  2. Total savings as state variable.
We understand that total savings may be fairly stable, while deposits minus withdrawals can fluctuate quite a bit. The same for temperature vs radiance imbalance. 

What does then a thermocouple as sensor in fact measure? Radiance or temperature? 

1. There is a widespread view that a thermocouple essentially measures radiance, and so can be used to reliably measure both incoming and outgoing radiance for Earth+atmosphere, and so determine the imbalance and detect global warming, even if the accuracy is not better than 1-2 W/m2. Radiance is then measured through a calibration process confronting the sensor with sources of known temperature $T$ with radiance according to an assumed Planck-Stefan-Boltzmann (PSB) Law of the form $\sigma T^4$.

2. There is also a different view that a thermocouple essentially measures source temperature, by allowing the sensor to take on the source temperature through radiative equilibrium established optically at distance. In practice the radiative equilibrium source-sensor is only partially established because of sensor cooling, but the principle of radiative equilibrium with equal temperature remains.

Case 2 builds on a clear physical principle of radiative equilibrium in stable measurement of a state variable.

Case 1 is based on instrument calibration vs sources/blackbodies of known temperature $T$ assumed to give a radiance input of $\sigma T^4$, while the true input is PSB in the form $\sigma (T^4-T_i^4)$, where $T_i$ is the instrument base temperature, which in general is not 0. Case 1 is thus based on a calibration process using an incorrect PSB Law inflating input radiance. Moreover, the measurement concerns a process variable prone to instability. There are cryogenic sensors with very small $T_i$ and better precision. A proof of the correct PSB Law in classical terms without statistics is presented here and in this talk.
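
A two-line illustration of this inflation (my own sketch, with illustrative temperatures):

  SIGMA = 5.670374419e-8                    # Stefan-Boltzmann constant, W/m2/K4
  T_cal, T_i = 300.0, 290.0                 # K: calibration blackbody and instrument base
  true_input = SIGMA * (T_cal**4 - T_i**4)  # net radiance actually received, ~58 W/m2
  assumed = SIGMA * T_cal**4                # radiance assumed in calibration, ~459 W/m2
  print(assumed / true_input)               # gain inflated by a factor of about 8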

Case 1 is the consensus and is used to support alarmism from a measured radiance imbalance of Earth+atmosphere as if this were a fact. But the measurement precision can barely capture any imbalance from doubled CO2. Unfortunately many climate skeptics embrace the idea that a pyrgeometer measures massive incoming radiance (Downwelling/Upwelling/Outgoing Longwave Radiation) and so go along with a basic alarmist argument: the measured energy imbalance is the result of more CO2.

A careful study shows that a thermocouple in fact measures source temperature as a stable output, while derived radiance can be misleading because the calibration uses an incorrect PSB Law and is prone to instability. This means that the measured energy imbalance can be questioned, along with alarmism.

But the discussion is pretty much closed on 1 as the truth. Hopefully a new discussion can take place around the questions: What does a thermocouple primarily measure, and on what physical grounds? How can a thermometer acting at distance be constructed? Is an IR-camera such a thing?