Thursday 4 September 2025

Abstract vs Concrete vs Computational Physics

The science of physics has over time changed character from concrete/real to abstract/non-real, with the pillars of modern physics, Quantum Mechanics (QM) and General Relativity (GR), reaching breathtaking levels of abstraction during the first half of the 20th century, culminating today in string theory in 11 dimensions beyond any reality.

Today, with powerful computers available at virtually no cost, there is a reverse trend: computation opens new possibilities of using theories of physics for practical purposes. Computation is a concrete process, and computational physics starts from a concrete mathematical model, not from an abstraction.

Let us compare Newtonian mechanics in concrete and abstract formulation. 

The concrete form consists of Newton's Law $F=ma$, connecting force $F$ to mass $m$ and acceleration $a=\frac{dv}{dt}$, where $v$ is velocity and $t$ is time. The evolution over time of any mechanical system (without viscous forces) can be simulated computationally by time-stepping Newton's Law. Concrete and general.
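
As a minimal sketch of such time-stepping (the mass-spring system, the force law $F=-kx$, the parameter values and the step size are illustrative assumptions, not taken from the text):

```python
# Minimal sketch: time-stepping Newton's Law m*dv/dt = F for a mass-spring system.
# The force law F = -k*x and all parameter values are illustrative assumptions.

m, k = 1.0, 4.0          # mass and spring constant
x, v = 1.0, 0.0          # initial position and velocity
dt, T_end = 0.001, 10.0  # time step and final time

t = 0.0
while t < T_end:
    F = -k * x           # force from the potential V(x) = k*x**2/2
    v += dt * F / m      # Newton's Law: dv/dt = F/m
    x += dt * v          # definition of velocity: dx/dt = v
    t += dt

print(x, v)              # state of the system at time T_end
```

Updating $v$ before $x$ in each step keeps the computed oscillation bounded over long times, so even this very simple scheme is both concrete and robust.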

The abstract form states that a mechanical system evolves from $t=0$ to $t=T$ so that:

  • The action $A(v)=\int_0^T(K-V)\,dt$ is stationary,

where $K=m\frac{v^2}{2}$ is kinetic energy (written $K$ to avoid confusion with the final time $T$) and $V$ is potential energy. The condition for stationarity in differential form then reads $m\frac{dv}{dt}=F$ with $F=-\nabla V$ the negative gradient of the potential, which is Newton's Law.
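
To see why stationarity gives Newton's Law, one can spell out the standard variation argument in one space dimension: write $x(t)$ for position with $v=\frac{dx}{dt}$, perturb the trajectory to $x+\epsilon w$ with $w(0)=w(T)=0$, and require the derivative of the action with respect to $\epsilon$ to vanish at $\epsilon =0$:

$$\frac{d}{d\epsilon}\Big\vert_{\epsilon =0}\int_0^T\Big(\frac{m}{2}\big(\frac{dx}{dt}+\epsilon\frac{dw}{dt}\big)^2-V(x+\epsilon w)\Big)dt=\int_0^T\Big(m\frac{dx}{dt}\frac{dw}{dt}-V^\prime (x)\,w\Big)dt=-\int_0^T\Big(m\frac{d^2x}{dt^2}+V^\prime (x)\Big)w\,dt=0,$$

after integration by parts using $w(0)=w(T)=0$. Since $w$ is arbitrary, this forces $m\frac{dv}{dt}=-V^\prime (x)=F$, which is Newton's Law.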

The difference between abstract and concrete is the same as that between characterising a local minimum of a function $f(x)$ of a real variable $x$ at $x=\bar x$ as the point giving the smallest value under variation of $x$, and characterising it by the equation $f^\prime (\bar x) =0$ with $f^\prime =\frac{df}{dx}$. Minimisation is abstract in the sense that no computational method is implied other than comparing the value $f(x)$ for all $x$, which can require infinite work. On the other hand, there are many methods for computing a root of the equation $f^\prime (x)=0$.
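
A small sketch of the contrast, with a quadratic $f$ and the grid resolution chosen as illustrative assumptions: brute-force comparison of $f(x)$ over a fine grid versus Newton's method applied to the root equation $f^\prime (x)=0$.

```python
# Contrast: brute-force minimisation of f versus root-finding for f'(x) = 0.
# The example function f(x) = (x - 2)**2 + 1 and the grid resolution are illustrative assumptions.

def f(x):   return (x - 2.0)**2 + 1.0
def df(x):  return 2.0*(x - 2.0)   # f'
def d2f(x): return 2.0             # f''

# Abstract route: compare f(x) over many candidate points; work grows with the resolution.
grid = [i * 1e-5 for i in range(-1_000_000, 1_000_000)]
x_min = min(grid, key=f)

# Concrete route: Newton's method on f'(x) = 0 converges in a few steps.
x = 0.0
for _ in range(10):
    x -= df(x) / d2f(x)

print(x_min, x)   # both close to 2.0, at very different computational cost
```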

We thus see that the concrete formulation opens directly to computational solution, while the abstract formulation does not. The pendulum may thus swing back from abstract to concrete in a 21st century filled with computation.

But we still live in the era of QM and GR, which are both abstract and uncomputable. QM is based on an abstract multi-dimensional Schrödinger equation without real physical meaning, which is uncomputable because of its many dimensions. GR is based on Einstein's equation in a condensed abstract formulation which, when written out for computation, turns out to be uncomputable.
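
To indicate the scale of the difficulty for the multi-dimensional Schrödinger equation, a rough count under illustrative assumptions (100 grid points per dimension, a handful of electrons): the wave function of $N$ electrons depends on $3N$ space coordinates, so a direct discretisation needs $100^{3N}$ values.

```python
# Rough illustration of why the multi-dimensional Schroedinger equation is uncomputable
# by direct discretisation: storage grows exponentially with the number of electrons.
# The grid resolution (100 points per dimension) and electron counts are illustrative assumptions.

points_per_dim = 100
for N in (1, 2, 10):                  # number of electrons
    dims = 3 * N                      # the wave function depends on 3N space coordinates
    values = points_per_dim ** dims   # grid values needed to store one wave function
    print(N, dims, f"{values:.1e}")   # N = 10 already gives 1.0e+60 values
```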

RealQM is a new form of quantum mechanics based on a concrete computable model. RealQM + Newton offers a unified concrete continuum model covering all scales which is computable. 

Ontology of physics (what is) is concrete, while epistemology of physics (what we can say) can be abstract. Computation can open ontology of physics to inspection and so feed epistemology of physics. Epistemology without ontology is empty.

Wednesday 3 September 2025

Is Measuring Temperature at Distance Possible and Useful?

Climate alarmism about global warming claims support from measurements of the energy balance of Earth+atmosphere by instruments like pyrgeometers, bolometers and radiometers, with an accuracy of at best 1-2 W/m2, compared to a total of around 240 W/m2 and a projected total imbalance of 4 W/m2 as "radiative forcing" from a doubling of atmospheric CO2, corresponding to a warming of about 1 K.
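
A back-of-the-envelope check of these numbers, under the illustrative assumption of a blackbody at an effective emission temperature of 255 K:

```python
# Back-of-the-envelope check of the quoted numbers, assuming a blackbody at an
# effective emission temperature of 255 K (an illustrative assumption).

sigma = 5.67e-8                        # Stefan-Boltzmann constant, W/m2/K^4
T_eff = 255.0                          # effective emission temperature, K

emitted = sigma * T_eff**4             # ~240 W/m2, the quoted total
sensitivity = 4 * sigma * T_eff**3     # ~3.8 W/m2 per K of warming
warming = 4.0 / sensitivity            # ~1 K for a 4 W/m2 imbalance

print(round(emitted), round(sensitivity, 2), round(warming, 2))
```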

The case for global warming may seem weak from these measurements, but nevertheless they serve to foster alarmism. 

To properly evaluate the measurements it is necessary to understand how these instruments are designed and how they operate. For a pyrgeometer or bolometer using a thermocouple as sensor, there are two fundamentally different views:

  1. A thermocouple essentially measures incoming radiance from a source as a process variable. 
  2. A thermocouple essentially measures a source temperature as a state variable.  

It is natural to make a comparison in terms of a bank account:

  1. Difference between deposits and withdrawals as process variable.
  2. Total savings as state variable.

We understand that total savings may be fairly stable, while deposits minus withdrawals can fluctuate quite a bit. The same holds for temperature vs radiance imbalance.

What, then, does a thermocouple as sensor in fact measure? Radiance or temperature?

1. There is a widespread view that a thermocouple essentially measures radiance, and so can be used to reliably measure both incoming and outgoing radiance for Earth+atmosphere and thus determine the imbalance, even if the accuracy is no better than 1-2 W/m2, and so detect global warming. Radiance is then measured through a calibration process confronting the sensor with sources of known temperature $T$, assumed to deliver radiance according to a Planck-Stefan-Boltzmann (PSB) Law of the form $\sigma T^4$.

2. There is also a different view that a thermocouple essentially measures source temperature, by allowing the sensor to take on the source temperature through radiative equilibrium established optically at a distance. In practice the source-sensor radiative equilibrium is only partially established, due to sensor cooling, but the principle of radiative equilibrium with equal temperature remains.

Case 2 builds on a clear physical principle of radiative equilibrium, giving stable measurement of a state variable.

Case 1 is based on instrument calibration against sources/blackbodies of known temperature $T$, assumed to give a radiance input of $\sigma T^4$, while the true input is PSB in the form $\sigma (T^4-T_i^4)$, where $T_i$ is the instrument base temperature, which in general is not 0. Case 1 is thus based on a calibration process using an incorrect PSB Law, inflating input radiance. Moreover, the measurement concerns a process variable prone to instability. There are cryogenic sensors with very small $T_i$ and better precision. A proof of the correct PSB Law in classical terms without statistics is presented here and in this talk.
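
The size of the difference between the two expressions can be illustrated by a small sketch, with source and instrument base temperatures chosen as illustrative assumptions:

```python
# Comparison of the two radiance expressions discussed above: the calibrated form
# sigma*T**4 versus the form sigma*(T**4 - T_i**4) with instrument base temperature T_i.
# The temperatures are illustrative assumptions.

sigma = 5.67e-8      # Stefan-Boltzmann constant, W/m2/K^4
T_source = 288.0     # source temperature, K (assumption)
T_i = 278.0          # instrument base temperature, K (assumption)

full = sigma * T_source**4               # ~390 W/m2 from the calibrated form
net  = sigma * (T_source**4 - T_i**4)    # ~51 W/m2 from the form with T_i

print(round(full, 1), round(net, 1))
```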

Case 1 represents the consensus and is used to support alarmism from a measured radiance imbalance of Earth+atmosphere, as if this were a fact. But the measurement precision can barely capture any imbalance from doubled CO2. Unfortunately many climate skeptics embrace the idea that a pyrgeometer measures massive incoming radiance (Downwelling/Upwelling/Outgoing Longwave Radiation) and so go along with a basic alarmist argument: the measured energy imbalance is the result of more CO2.

A careful study shows that a thermocouple in fact measures source temperature as a stable output, while derived radiance can be misleading because the calibration uses an incorrect PSB Law and is prone to instability. This means that the measured energy imbalance can be questioned, along with alarmism.

But the discussion is pretty much closed, with 1 taken as the truth. Hopefully a new discussion can take place around the questions: What does a thermocouple primarily measure, and on what physical grounds? How can a thermometer acting at a distance be constructed? Is an IR camera such a thing?