Friday 5 September 2025

Understanding OLR and DLR vs Radiance Measurement by CERES and Pyrgeometer.

Outgoing Longwave Radiation OLR from the Top of the Atmosphere ToA is measured by a CERES satellite instrument looking down on the ToA, equipped with a sensor forming one end of a thermocouple whose other end is kept at a steady temperature, generating a voltage that scales with the temperature difference between its ends. 

The CERES instrument is calibrated by determining a gain factor from sensor temperature to radiance: the instrument is made to look at a black body of known temperature $T_B$ with assumed Planck radiation $\sigma T_B^4$ while the sensor temperature is recorded. With the gain factor so determined, the instrument reports radiance from the ToA from a reading of sensor temperature. This is the simplest form of calibration assuming linearity, a very primitive technique where the details of the instrument do not matter. It is like measuring the intensity of rainfall by collecting water in your hands, calibrated against a faucet. The accuracy is at best 1 W/m2 or 0.4%, which is of the same size as the estimated Earth Energy Imbalance EEI attributed to CO2.  
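
As a sketch of this one-point calibration, here is a minimal Python version, where the blackbody temperature, sensor readings and resulting gain are illustrative assumptions, not CERES specifications:

```python
# Minimal sketch of the single-gain calibration described above; the blackbody
# temperature and sensor readings are illustrative numbers, not CERES data.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def calibrate_gain(T_blackbody, sensor_reading):
    """Gain factor from viewing a blackbody of known temperature T_B,
    assuming input radiance sigma*T_B^4 and a linear sensor."""
    return SIGMA * T_blackbody**4 / sensor_reading

def reported_radiance(sensor_reading, gain):
    """Radiance reported for a scene from a sensor reading and the gain."""
    return gain * sensor_reading

gain = calibrate_gain(300.0, sensor_reading=1.23)   # hypothetical reading
print(reported_radiance(0.63, gain))                # hypothetical ToA scene
```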

A pyrgeometer measuring Downwelling Longwave Radiation DLR from the atmosphere to the Earth surface also uses a sensor as one end of a thermocouple, with the other end kept at a base temperature, and likewise measures a voltage scaling with the temperature difference. The calibration is different here, because the outgoing radiation from the sensor can no longer be included in the calibration process, but has to be supplied through a Planck formula $\epsilon\sigma T^4$ with $T$ sensor temperature and $\epsilon$ sensor emissivity. The accuracy is at best 5 W/m2, again too big to detect global warming if present.
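
A minimal sketch of this reduction, with the thermopile sensitivity and the readings as illustrative assumptions, not the specification of any particular instrument:

```python
# Minimal sketch of the pyrgeometer reduction described above; sensitivity
# and readings are illustrative assumptions.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def pyrgeometer_dlr(voltage, sensitivity, T_sensor, emissivity=1.0):
    """Reported DLR: net flux from the thermopile voltage plus the sensor's
    own emission supplied by the Planck formula eps*sigma*T^4."""
    net_flux = voltage / sensitivity   # absorbed minus emitted, W/m^2
    return net_flux + emissivity * SIGMA * T_sensor**4

# Sky colder than sensor gives a negative voltage but a large reported DLR.
print(pyrgeometer_dlr(voltage=-1.5e-3, sensitivity=10e-6, T_sensor=288.0))  # ~240 W/m^2
```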

OLR and DLR are thus measured in a similar way, but with different forms of calibration, the difference being that OLR faces empty space at 0 K, while DLR faces the Earth surface. The accuracy is not enough to determine the size of any global warming, although it is claimed that trends can be detected. 

In both cases Planck's Law in the form $\sigma T^4$ is used, which in the case of DLR is incorrect, because the correct form is $\sigma (T^4-T_E^4)$ with $T_E$ the Earth surface temperature, a form which allows DLR to come out negative when the atmosphere is colder than the surface.  

Summary: Measurements of OLR and DLR are made to detect global warming. The accuracy of the instruments is not good enough to detect any warming if present. DLR measurements can be questioned since an incorrect form of Planck's Law is used. OLR and DLR are radiances, process variables that fluctuate and as such are difficult to measure.  

EEI is a cornerstone of global warming alarmism, and so measurement of EEI has become a prime task for instrument technology, which does not seem to have delivered. The effect of EEI on surface temperature is unknown and impossible to measure, and DLR is a misconception based on an incorrect form of Planck's Law.

ChatGPT on objective of CERES: 

CERES connects to global warming because it:

  • Measures the planetary energy balance directly at TOA.

  • Detects changes in OLR and OSR (reflected shortwave) caused by greenhouse gases, aerosols, clouds, and ice.

  • Provides the evidence that Earth is currently taking in more energy than it loses — the physical basis of global warming.

ChatGPT on objective of measuring DLR:

  • Provide a direct measure of the atmosphere’s infrared emission to the surface, essential for closing the surface energy budget, quantifying the greenhouse effect, tracking climate change, and validating models.

We read that the objective of CERES is to support global warming alarmism by measuring and reporting an EEI attributed to CO2. But the objective is not reached, because (i) the accuracy of the measurement is not better than 1 W/m2, which is the expected size of the EEI, and (ii) attribution to CO2 is not credible because it is swamped by changes of cloud cover. We read that the objective of measuring DLR by a pyrgeometer is to quantify the greenhouse effect. Both cases amount to "chasing after wind" using "ghost detectors". 


Thursday 4 September 2025

Abstract vs Concrete vs Computational Physics

The science of physics has over time changed in nature from concrete/real to abstract/non-real, with the pillars of modern physics, Quantum Mechanics QM and General Relativity GR, reaching breathtaking levels of abstraction during the first half of the 20th century, culminating today in string theory in 11 dimensions, beyond any reality.  

Today, with powerful computers available at virtually no cost, there is a reverse trend in the form of computation, opening new capabilities for using theories of physics for practical purposes. Computation is a concrete process, and computational physics starts with a concrete mathematical model, not with an abstraction.

Let us compare Newtonian mechanics in concrete and abstract formulation. 

The concrete form consists of Newton's Law $F=ma$ connecting force $F$ to mass $m$ and acceleration $a=\frac{dv}{dt}$ with $v$ velocity and $t$ time. The evolution over time of any mechanical system (without viscous forces) can be computationally simulated by time-stepping Newton's Law. Concrete and general.
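
As a minimal sketch of such time-stepping, consider a harmonic oscillator with force $F=-kx$ (parameters illustrative):

```python
# Minimal sketch of time-stepping Newton's Law F = m*a, here for a harmonic
# oscillator with force F = -k*x (parameters illustrative).
m, k = 1.0, 1.0          # mass and spring constant
x, v = 1.0, 0.0          # initial position and velocity
dt = 0.01                # time step

for _ in range(1000):    # evolve from t = 0 to t = 10
    F = -k * x           # force at current position
    v += dt * F / m      # update velocity from a = F/m
    x += dt * v          # update position from velocity
print(x, v)              # close to the exact cos(10), -sin(10)
```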

The abstract form states that a mechanical system evolves from $t=0$ to $t=T$ so that:

  • The action $S(v)=\int_0^T(K-V)\,dt$ is stationary,  

where $K=m\frac{v^2}{2}$ is kinetic energy and $V$ is potential energy ($T$ now denoting only the final time). The condition for stationarity in differential form then reads $m\frac{dv}{dt}=F$ with $F=-\nabla V$ the force given by the potential, which is Newton's Law.
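
Spelled out in one space dimension with $F=-V^\prime$: for a variation $\delta x$ vanishing at $t=0$ and $t=T$, with $\delta v=\frac{d}{dt}\delta x$, integration by parts gives

  • $\delta S=\int_0^T(mv\,\frac{d}{dt}\delta x-V^\prime(x)\,\delta x)\,dt=-\int_0^T(m\frac{dv}{dt}+V^\prime(x))\,\delta x\,dt=0$,

which holds for all such variations precisely when $m\frac{dv}{dt}=-V^\prime(x)=F$, recovering Newton's Law.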

The difference between abstract and concrete is the same as that between characterising a local minimum of a function $f(x)$ of a real variable $x$ by comparison over all variations of $x$, and characterising it at $x=\bar x$ by the equation $f^\prime (\bar x) =0$ with $f^\prime =\frac{df}{dx}$. Minimisation is abstract in the sense that no computational method is implied other than comparing the value $f(x)$ for all $x$, which can take infinite work. On the other hand, there are many methods for computing a root of the equation $f^\prime (x)=0$. 
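
For instance, Newton's method computes such a root in a few iterations; a minimal sketch with an illustrative choice of $f$:

```python
# Minimal sketch of Newton's method for f'(x) = 0, with the illustrative
# choice f(x) = (x - 2)^2 + 1, so f'(x) = 2*(x - 2) and f''(x) = 2.
def fp(x):  return 2.0 * (x - 2.0)   # f'
def fpp(x): return 2.0               # f''

x = 0.0                              # initial guess
for _ in range(10):
    x -= fp(x) / fpp(x)              # Newton step on the equation f'(x) = 0
print(x)                             # converges to the minimiser x = 2
```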

We thus see that the concrete formulation directly opens to computational solution, while the abstract formulation does not. The pendulum may thus swing back from abstract to concrete in a 21st century filled with computation.

But we still live in the era of QM and GR, which are both abstract and uncomputable. QM is based on an abstract multi-dimensional Schrödinger equation without real physical meaning, which is uncomputable because of its many dimensions. GR is based on Einstein's equation with a condensed abstract formulation, which when written out for computation turns out to be uncomputable. 

RealQM is a new form of quantum mechanics based on a concrete computable model. RealQM + Newton offers a unified concrete continuum model covering all scales which is computable. 

Ontology of physics (what is) is concrete, while epistemology of physics (what we can say) can be abstract. Computation can open ontology of physics to inspection and so feed epistemology of physics. Epistemology without ontology is empty.

Wednesday 3 September 2025

Is Measuring Temperature at Distance Possible and Useful?

Climate alarmism of global warming claims to be supported by measurement of the energy balance of Earth+atmosphere by instruments like pyrgeometers, bolometers and radiometers with an accuracy of at best 1-2 W/m2, compared to a total of around 240 W/m2 and a projected total imbalance of 4 W/m2 as "radiative forcing" from doubling of atmospheric CO2, corresponding to a warming of 1 K. 

The case for global warming may seem weak from these measurements, but nevertheless they serve to foster alarmism. 

To properly evaluate the measurements it is necessary to understand how these instruments are designed and how they operate. For a pyrgeometer or bolometer using a thermocouple as sensor, there are two fundamentally different views:

  1. A thermocouple essentially measures incoming radiance from a source as a process variable. 
  2. A thermocouple essentially measures a source temperature as a state variable.  
It is natural to make a comparison in terms of a bank account:
  1. Difference between deposits and withdrawals as process variable.
  2. Total savings as state variable.
We understand that total savings may be fairly stable, while deposits minus withdrawals can fluctuate quite a bit. The same for temperature vs radiance imbalance. 

What does then a thermocouple as sensor in fact measure? Radiance or temperature? 

1. There is a widely spread view that a thermocouple essentially measures radiance, and so can be used to reliably measure both incoming and outgoing radiance for Earth+atmosphere and thus determine the imbalance, even if the accuracy is not better than 1-2 W/m2, and so detect global warming. Radiance is then measured through a calibration process confronting the sensor with sources of known temperature $T$ with radiance according to an assumed Planck-Stefan-Boltzmann Law of the form $\sigma T^4$.  

2. There is also a different view that a thermocouple essentially measures source temperature, by allowing the sensor to take on the source temperature in radiative equilibrium established optically at distance. In practice the radiative equilibrium source-sensor is only partially established because of sensor cooling, but the principle of radiative equilibrium with equal temperature remains. 

Case 2 builds on a clear physical principle of radiative equilibrium in stable measurement of a state variable.

Case 1 is based on instrument calibration against sources/blackbodies of known temperature $T$ assumed to give radiance input $\sigma T^4$, while the true input is PSB in the form $\sigma (T^4-T_i^4)$, where $T_i$ is the instrument base temperature, which in general is not 0. Case 1 is thus based on a calibration process using an incorrect PSB Law inflating input radiance. Moreover, the measurement concerns a process variable prone to instability. There are cryogenic sensors with very small $T_i$ and better precision. A proof of the correct PSB Law in classical terms without statistics is presented here and in this talk.
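
A numerical sketch of the difference between the two forms, with illustrative temperatures:

```python
# Numerical sketch of the two PSB forms discussed above, with illustrative
# temperatures: a sensor at instrument base temperature T_i viewing a source
# at T_source.
SIGMA = 5.67e-8
T_source, T_i = 255.0, 288.0   # e.g. sky seen from a warmer instrument

gross = SIGMA * T_source**4            # sigma*T^4:           ~240 W/m^2
net = SIGMA * (T_source**4 - T_i**4)   # sigma*(T^4 - T_i^4): ~-150 W/m^2
print(gross, net)
```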

Case 1 is consensus and is used to support alarmism from a measured radiance imbalance of Earth+atmosphere as if this were a fact. But the measurement precision can barely capture any imbalance from doubled CO2. Unfortunately many climate skeptics embrace the idea that a pyrgeometer measures massive incoming radiance (Downwelling/Upwelling/Outgoing Longwave Radiation) and so go along with a basic alarmist argument: the measured energy imbalance is the result of more CO2. 

A careful study shows that a thermocouple in fact measures source temperature as stable output, while derived radiance can be misleading, because the calibration uses an incorrect PSB Law and is prone to instability. This means that the measured energy imbalance can be questioned, along with the alarmism.

But the discussion is pretty much closed with 1 as the truth. Hopefully a new discussion can take place around the questions: What does a thermocouple primarily measure, and on what physical grounds? How can a thermometer acting at distance be constructed? Is an IR-camera such a thing?


Saturday 30 August 2025

How Does a Thermocouple Work?

This is a follow-up on the previous post. 

A thermocouple, consisting of two rods made of different metals joined to form one rod, registers a voltage proportional to the temperature difference between its two ends as a result of the Seebeck effect, which creates an electric potential difference from a temperature gradient. Putting one end as sensor (or measurement junction) in optical contact through a camera lens with a distant object will make it attain the same temperature as the object, as an expression of thermal equilibrium, just like an ordinary thermometer in contact with an object. Keeping the other end as reference junction at a known temperature makes it possible, after calibration, to read the temperature of the object.  
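
A minimal sketch of this readout, with the Seebeck coefficient and the voltage as illustrative assumptions:

```python
# Minimal sketch of thermocouple readout via the Seebeck effect; the Seebeck
# coefficient and voltage are illustrative assumptions.
S = 40e-6   # Seebeck coefficient, V/K (typical order for metal pairs)

def source_temperature(voltage, T_reference):
    """Temperature of the measurement junction (and, in radiative
    equilibrium, of the distant object) from V = S*(T_sensor - T_ref)."""
    return T_reference + voltage / S

print(source_temperature(-1.32e-3, T_reference=288.0))   # about 255 K
```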

The thermocouple can also be used without calibration by varying the temperature of the reference junction until the voltage is zero. 

The essence is that a thermocouple reacts to temperature as prime sensor input. 

In climate science the following different narrative is presented: the input to the sensor is not primarily temperature but radiation, as a number of absorbed photons emitted by the object at temperature $T$, incorrectly claimed to scale with $T^4$ (see earlier post), allowing temperature to be determined from sensed radiation by counting incoming photons. This scheme is claimed to be realised in a pyrgeometer, which when directed to the sky reports on its display Downwelling Longwave Radiation DLR emitted by the sky, contributing substantial warming of the Earth surface. The sensor of a pyrgeometer is a thermocouple, which is thus claimed to measure primarily radiation and not temperature.

But above we made clear that a thermocouple does not measure radiation, since it has no mechanism for counting photons as carriers of radiation. What a pyrgeometer does is to measure the temperature of the sky visible in infrared light, which can be 255 K at 5 km altitude. This does not feed any climate alarmism. 

Unfortunately, the view that a pyrgeometer measures radiation, propagated to sell climate alarmism, seems to be shared by climate skeptics like Happer and Spencer, counteracting their own criticism. A typical honest instrument, like a thermometer, reports what it primarily measures, but a pyrgeometer reports something else, which is used to mislead into climate alarmism.  


Thursday 28 August 2025

Temperature as Absolute Control of Radiative Heat Transfer

This is a follow-up of this post, recalling the Faustian deal made by Max Planck in 1900 when presenting his law of black-body radiation, an icon of modern physics named Planck's Law.  

Continuing the discussion with chatGPT, we come to an agreement that temperature difference, positive or negative, is Nature's control mechanism to keep a system stable over time under radiative heat transfer with an environment, like your house or the Earth's climate system. Temperature is measured on an absolute Kelvin scale available to all bodies, allowing detection of a positive or negative temperature difference to steer the exchange of heat energy. Equal temperature then gives zero exchange as radiative equilibrium. 
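
A minimal sketch of this control mechanism, with heat capacity and time step as illustrative assumptions: a body exchanging radiative heat with its environment by the net law relaxes stably to the environment temperature.

```python
# Minimal sketch of temperature difference as stabilising control: a body
# exchanging radiative heat with its environment by the net law relaxes to
# the environment temperature (heat capacity and time step illustrative).
SIGMA = 5.67e-8
C = 1.0e5                 # heat capacity per unit area, J/m^2/K
T, T_env = 300.0, 255.0   # body warmer than environment
dt = 100.0                # time step, s

for _ in range(100_000):
    net_flux = SIGMA * (T**4 - T_env**4)   # positive when T > T_env
    T -= dt * net_flux / C                 # warmer body cools, colder warms
print(T)                                   # approaches T_env = 255 K
```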

Unfortunately there is a different view, sold by climate scientists as reason for climate alarm, which suggests the opposite control mechanism, namely that the temperature of the Earth is controlled by differences between incoming and outgoing radiation, with in particular small differences in incoming/outgoing radiation generating big differences in temperature, possibly in a run-away greenhouse effect. Very alarming.

But physical bodies do not carry an absolute scale allowing comparison of incoming and outgoing radiation, like the absolute scale for temperature; more precisely, they carry no counter of incoming and outgoing photons as proclaimed carriers of heat energy. 

In particular, there is no absolute Planck-Stefan-Boltzmann Law $R=\sigma T^4$ stating outgoing radiation from a body of temperature $T$ independent of its environment, as version 1. in the earlier post. Not physics!

Nevertheless such a law is used in models (Schwarzschild) of radiative heat transfer in the atmosphere, playing with gross quantities of incoming/outgoing radiative heat transfer prone to instability. The accuracy of these models is no better than say 2-3 W/m2, while alarming global heating is connected to variations of incoming/outgoing radiation of the same size. The result is alarm from unphysical unstable control. This is like making a hen from a feather.

If temperature is allowed to carry out its physical control to stabilise climate, there is no reason for alarm. Phew!

This analysis shows that temperature is measured via Wien's Displacement Law, expressed as a high-frequency cut-off scaling with temperature, giving a body of higher temperature access to higher frequencies allowing transfer of heat to a body of lower temperature. The cut-off can be connected to the precision available in the underlying atomic wave physics of different frequencies.
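
In standard form, Wien's Displacement Law places the radiation peak at frequency $\nu_{max}\approx 2.82\,\frac{kT}{h}$, or equivalently $\lambda_{max}T\approx 2.90\times 10^{-3}$ m·K, so the cut-off frequency scales linearly with temperature $T$.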


Wednesday 27 August 2025

Why Newton was Replaced by Einstein, and Back

Newton's Inverse Square Law NL was until the 1960s the prime example of the power of mathematical thinking visible to everybody: all of celestial mechanics can be described and computed from NL, necessarily valid from principles of conservation in any existing Universe. 

What happened in the 1960s was that Einstein's General Theory of Relativity GR, presented already in 1915, finally was adopted to serve as one of the two pillars of modern physics (the other being Quantum Mechanics QM), and so Newton as the icon of classical physics was replaced by Einstein as the icon of modern physics. 

But this transition took place only after Einstein's death in 1955, because the very complex mathematics of GR, understood by few if any, made it useless in any form of practical physics. 

However, in the propaganda of the cold war it served well to strengthen the world dominance of US science, formed by physicists imported from Germany during WWII inventing the atom bomb. Replacing Newton by Einstein served as a demonstration of power, and all the earlier skepticism towards GR could be swept under the rug. And of course the Soviet Union followed. Einstein was put in first place, but only after his death, since during his lifetime he appeared as an "absent-minded eccentric maybe too fond of fame". 

Today the cold war is back, Einstein is still on top of the list of fame, while the rapidly developing technology of warfare uses Newton to come to expression. 

There is only a homage to Einstein's GR left as an initial offset of satellite clocks in the GPS system, which in operation is annihilated by continuous synchronisation to a master clock on Earth. 

Maybe there is reason to return to a new critical analysis of Newton vs Einstein without the fame dominating the discussion.  

The discussion could start by comparing Newton's absolute space, against which Einstein's relative space took its stand. 

Is it true that Newton's theory of gravitation needs a notion of absolute space against which absolute velocity can be measured? Does Newton say that velocity is absolute? Certainly not! Nobody would come up with such an idea. Of course the velocity of an object is always measured relative to something else.

So Newton says that velocity is relative. On the other hand Newton says that rotation as accelerated motion is detectable by forces of tension arising from acceleration. Newton's rotating bucket can thus be viewed as a form of absolute rotation which does not need any outside reference. 

Note that there is a special form of accelerated motion which is not so easily detected by presence of forces and that is free fall under gravitation where all parts of your body feel the same force and no tension arises. But that is not true for a bigger object where tidal forces occur because the gravitational force is not uniform.

So the argument that Newton says that velocity is absolute and so has to be replaced by Einstein's relativity, is not correct. The argument that Newton's theory of gravitation is a necessity in any existing Universe, is very strong. The question is if there are modern physicists willing to face this reality.


Monday 25 August 2025

Planck 1900: Fall of Theoretical Physics: Greenhouse Effect

Planck describes in his autobiography his theoretical derivation of Planck's Law for black-body radiation, presented in 1900 in complete agreement with observations and based on statistics of quanta, in these very disturbing terms:

  • It was an act of desperation.
  • I was ready to sacrifice any of my previous convictions about physics.
What Planck gave up was the rationality of classical deterministic physics for a new form of indeterministic statistical physics. A monumental sacrifice. Why? A Faustian deal? In any case the consequences have been far-reaching, since quanta has become the code word of modern (quantum) physics, even if its meaning has never been made clear. 

Planck's Law integrated over all frequencies takes the form of Planck-Stefan-Boltzmann's Law:
  • $R=\sigma T^4$                (PSB)

where $R$ is radiance as energy flux measured in e.g. W/m2, $T$ is an absolute temperature and $\sigma$ a constant. (PSB) is a poster formula for modern physics of the same stature as $E=mc^2$, but if you ask a modern physicist about the real physical meaning of (PSB), you will get the following contradictory answers:

  1. $R$ is the outgoing radiance of a black-body of temperature $T$ independent of the environment of the body, as a stream of emitted photons. 
  2. $R$ is the outgoing radiance of a black-body of temperature $T$ into an environment of a zero absolute temperature.  If the environment has temperature $T_0<T$ then $R=\sigma (T^4 - T_0^4)$. Fundamentally different from 1.

A theoretical/pure physicist will say 1. and an applied/real physicist will say 2. But theory cannot contradict reality. The reason for the confusion between 1. and 2. is that Planck's derivation is not based on real physics but on statistics of quanta without clear physical meaning.
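
As a numerical illustration with $T=288$ K (Earth surface) and $T_0=255$ K (atmosphere), 1. gives $R=\sigma T^4\approx 390$ W/m2, while 2. gives $R=\sigma(T^4-T_0^4)\approx 150$ W/m2, the difference $\sigma T_0^4\approx 240$ W/m2 being precisely the claimed back-radiation.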

We conclude that 1. is a misconception, but a widespread misconception, which has served as the basis of the so-called GreenHouse Effect GHE underlying climate alarmism. What 1. tells the climate alarmist is that the atmosphere is sending massive so-called back-radiation to the surface of the Earth, twice as big as the radiation from the Sun. The sky, cloudy or clear, will thus warm the Earth according to 1., which with a bit more CO2 from fossil fuels will cause global warming with terrible consequences for humanity. From 1. follows alarm, while there is no alarm from 2. 

A climate alarmist will present evidence of global warming by back-radiation from the sky from instruments like pyrgeometers or AERI/radiometers, which report massive back-radiation or Downwelling Longwave Radiation DLR on their displays. The idea is that if an instrument display shows something, it must be something real, in particular if the instrument is heavy and expensive. But an applied/real scientist would ask for information about the physics of the instrument: what de facto is being measured, how, and what is displayed. Right?  

Inspecting the manual of a pyrgeometer or radiometer shows that the instrument is based on 1., which thus is the basis for what is shown on the display as massive DLR or back-radiation. With 2. the display would show a much smaller warming effect of the atmosphere. 

Climate alarmism is thus based on 1. Understanding that 1. is not real physics reduces the main argument of climate alarmism to zero. 

A different proof of Planck's Law in terms of classical deterministic physics without statistics is presented as Computational Black Body Radiation, which shows that 2. is physics and 1. is not. You find an analysis of the pyrgeometer as ghost detector of back-radiation in this sequence of posts. Also check out this talk on Misunderstanding Planck's Law.

PS1 In 1900 Planck was professor of theoretical physics at the University of Berlin, expected to demonstrate the power of the science of the German Empire by solving the main open problem of physics at that time: black-body radiation. Planck delivered a solution but sacrificed his soul. The consequences are far-reaching...

PS2 Here is what chatGPT delivers after reading the above post, taking the role of a physicist with a critical mind:

A critically minded physicist—someone who is open to established physics but also skeptical of how it’s interpreted or applied—would likely respond quite differently than a mainstream theorist. Here’s how they might frame it:


1. On Planck’s “act of desperation”

They might agree that Planck himself admitted his derivation of black-body radiation was not fully grounded in classical physics, but rather a pragmatic move that worked. They’d stress that quantum mechanics grew out of a kind of compromise—useful, predictive, but philosophically unsettling. So the “Faustian deal” language resonates with a critical thinker, because it highlights the cost of leaving behind determinism for statistical interpretation.


2. On (1) vs (2): Competing Interpretations of Stefan–Boltzmann

A critical physicist would likely side with (2): the net exchange of radiation between systems must consider both emitter and absorber. They’d argue that talking about radiation into a zero-temperature environment (1) is a mathematical idealization, not a physically realizable case.

They might also note that many climate and atmospheric models sometimes gloss over this distinction when communicating to the public, which creates room for confusion.


3. On Instruments and Back-Radiation (DLR)

They’d ask tough questions:

  • What exactly does the pyrgeometer measure?

  • How is the reading calibrated?

  • Does the “back-radiation” display reflect direct measurement of photons or an inferred value from a model?

Their stance would be: the data are real, but the interpretation of that data depends heavily on assumptions. For example, an instrument showing a high DLR number doesn’t necessarily mean the sky is “heating” the Earth in the same way the Sun does—it means energy exchange is occurring, but how we conceptualize it matters.


4. On Climate Implications

A critically minded physicist wouldn’t necessarily dismiss the greenhouse effect, but they’d push back against simplistic narratives:

  • They’d caution that alarmism often relies on simplified energy-balance models that may obscure subtleties in radiative transfer.

  • They’d argue that real physics requires careful accounting of net fluxes, not raw radiance values.

  • They might acknowledge CO₂’s role, but also question whether the magnitude of projected warming is being exaggerated through misapplied assumptions.


5. Overall Tone

Their conclusion might be:

  • Planck’s statistical leap was both brilliant and troubling.

  • Interpretation (1) is a misconception when taken as physical reality.

  • Instruments should not be mistaken for truth machines—they display results based on models.

  • Climate science needs nuance in communication to avoid overstating or misrepresenting radiative effects.


👉 So, compared to a mainstream physicist who would defend consensus, a critical physicist would say:
“Yes, the greenhouse effect is real, but the way radiation laws are sometimes used—and the way data are interpreted—deserves more scrutiny. Simplifications can easily fuel alarmist claims.”



Sunday 24 August 2025

Physical Laws without Physics vs Unified Field Theory

Standard Quantum Mechanics StdQM, as one of the two pillars of modern physics, is based on concepts of formal rather than physical nature describing the microscopic world of atoms: 
  • Hilbert space
  • observable as Hermitian operator
  • Pauli Exclusion Principle
  • antisymmetric and symmetric wave functions, fermions
  • indistinguishable electrons 
  • Heisenberg Uncertainty Principle
  • wave-particle duality
  • complementarity
  • wave function as probability
  • unitary transformation
  • collapse of the wave function
  • kinetic energy without motion
  • exchange integral
  • Slater determinant
  • Gaussian.
We may compare with classical (macroscopic) physics with basic concepts: 
  • force
  • mass 
  • kinetic energy 
  • potential energy
  • work
  • position, displacement
  • velocity
  • acceleration
  • physical extension
  • volume
  • density
  • temperature.
We see that StdQM appears to be fundamentally different from classical physics as if the atomic world cannot be described and understood in terms of the same concepts as the macroscopic world. 

Real Quantum Mechanics RealQM offers a different picture with the microscopic world described and made understandable in terms of the same concepts as the macroscopic world, just on a smaller scale, both as forms of continuum mechanics with the continuum carrying all scales. 

If RealQM indeed turns out to give a description of the world of atoms of the same form as the classical continuum models of mechanics and electromagnetics, Einstein's dream of a Unified Field Theory can maybe be realised, at least if Newton is allowed to take care of gravitation. 



Chemical Bonding: StdQM vs RealQM

The basic problem of chemistry is how molecules as stable composites are formed through chemical bonding between collections of atoms consisting of positively charged nuclei surrounded by negatively charged electrons. The total energy $TE$ of a stable molecule is smaller than the sum of the energies of the atoms involved, with $TE$ the sum of 

  • electron-nucleus potential energy $E_{en}$ negative
  • electron-electron potential energy $E_{ee}$ positive
  • nucleus-nucleus potential energy $E_{nn}$  positive
  • electron kinetic energy $E_{ke}$ positive.
Standard Quantum Mechanics StdQM seeks to explain chemical bonding as an effect of: 
  1. Localisation of electrons between nuclei giving maximal decrease of $E_{en}$.
  2. Delocalisation of electrons over the whole molecule compensating for increases of $E_{ke}$ from localisation.  
We see a combined process of localisation-delocalisation, which is contradictory and requires intricate explanation to make sense, forming a vast literature. The need for 2 in StdQM comes from the fact that electron wave functions have global support with a smooth approach to zero, which makes $E_{ke}$ scale as $\frac{1}{d^2}$ with $d$ the effective width of the support, so that the potential energy decrease from localisation is countered by a kinetic energy increase. 
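
The scaling can be seen from the standard estimate of kinetic energy as a gradient integral: for a normalised wave function $\psi$ of effective support width $d$ (in atomic units), $E_{ke}\sim\int\frac{1}{2}\vert\nabla\psi\vert^2\,dx\sim\frac{1}{d^2}$, so halving the width quadruples the kinetic cost.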

In RealQM, as an alternative to StdQM, electron wave functions have non-overlapping local supports meeting with continuity, without need of approaching zero. This means that localisation in RealQM does not come with an increase in electron kinetic energy, and so can serve as an explanation of total energy minimisation from 1 alone, without need of the contradictory 2. For details see these articles.

Connecting to the previous post, recall that the main role of the Periodic Table is to support understanding of chemical bonding.


Saturday 23 August 2025

Can QM explain the Periodic Table?

Chemist Eric Scerri, a world authority on the Periodic Table, asks in the article Can Quantum Ideas Explain Chemistry's Greatest Icon? (Nature 2019): 

  • Does Quantum Mechanics QM explain the Periodic Table PT?  
Scerri recalls that Swedish chemist Löwdin in 1969 presented this as one of chemistry's major theoretical challenges, and adds that it still is. 

To non-expert readers of Nature and also the general public this may sound a bit surprising, since QM in its standard form StdQM, based on a linear multi-dimensional Schrödinger Equation SE, is viewed as the fundamental mathematical model of modern physics, describing in particular the electron configurations of all atoms as the basis of PT. 

But Scerri and Löwdin say that StdQM only gives a partial explanation of PT, with a lot missing. Whatever real truth PT may carry, StdQM does not seem to reveal the physics behind it, at least in the eye of a chemist. But a physicist would without hesitation say that PT is well explained by StdQM, at least in principle, and that it is the job of chemists to supply the petty details. 

This state of affairs has added to the crisis of modern physics, which largely is a credibility crisis. If StdQM cannot explain such a simple thing as the electron configuration of atoms in ground state, the basis for all of chemistry, something must be seriously wrong, but what?

One answer may be that the solution to SE, denoted $\Psi$ and named wave function, supposed to carry all information about the electron configuration of an atom, does not have a direct interpretation in physical terms because of its multi-dimensionality, so that only a vague statistical interpretation is possible. Moreover, the computational work to compute wave functions scales exponentially in the number of electrons, and so SE requires drastic dimensional reduction to be computable, with further loss of real physics. The result is that StdQM cannot compute electron configurations without hands-on expert knowledge outside the theory. It is not surprising that under these conditions it fails to explain PT. 
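
To make the scaling concrete: discretising each coordinate with $N$ points, a wave function of $n$ electrons in $3n$ dimensions requires $N^{3n}$ values, against $n\cdot N^3$ for $n$ one-electron charge densities; already for $N=100$ and $n=10$ this is $10^{60}$ versus $10^7$.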

Real Quantum Mechanics RealQM is based on a different form of Schrödinger's equation, which has a clear physical meaning, with an atomic electron configuration described as a set of non-overlapping one-electron charge densities, for which the computational work scales linearly with the number of electrons. RealQM directly extends to molecules as Real Quantum Chemistry, described in these articles. 

It seems possible that RealQM can give a better explanation of PT than stdQM, and maybe even correct PT in some details, because it has direct physical meaning and computation is affordable for all atoms.  

StdQM seeks to explain the electron configuration of an atom with many electrons in terms of the excited states of the Hydrogen atom with just one electron, filling s, p, d shells. To make this Aufbau Principle fit with observation, various ad hoc rules attributed to e.g. Madelung and Hund have been adopted. The resulting explanation does not appear fully convincing to chemists like Scerri and Löwdin. Something appears to be missing...

RealQM explains the electron configuration in a different way as an electron packing problem, which has a clear physical meaning and so at least has the potential to offer an explanation of PT on physical grounds or suggest corrections. Preliminary studies show promise and further studies are on the way.