fredag 26 april 2024

Primordial Gravitational and Electric/Magnetic Potentials

Dialog between the Two Greatest World Systems with primordial potentials vs densities.  

This is a further remark to previous posts on New Newtonian Cosmology with a gravitational potential $\phi_m (x,t)$ and an electric potential $\phi_c(x,t)$, with $x$ a Euclidean space coordinate and $t$ a time coordinate, viewed as primordial, and with mass density $\rho_m (x,t)$ and electric charge density $\rho_c(x,t)$ given by 

  • $\rho_m=\Delta\phi_m$      (1)
  • $\rho_c=\Delta\phi_c$      (2)
Here $\rho_m \ge 0$ while $\rho_c$ can be both positive and negative, and $\Delta$ is the second order Laplacian differential operator. 

The corresponding gravitational force $f_m\sim -\nabla\phi_m$ is attractive between positive mass densities and the corresponding Coulomb force $f_c\sim \nabla\phi_c$ is attractive between charge densities of opposite sign and repulsive for charge densities of the same sign. 
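
As a simple illustration of (1)-(2) and the force laws, here is a minimal numerical sketch (1d periodic grid, unit constants, an illustrative potential with no sign constraint imposed) computing a density from a potential by a discrete Laplacian and the force as the negative gradient:

import numpy as np

# 1d periodic sketch of (1)-(2) and the force laws: density from potential by a
# discrete Laplacian, force from potential by a gradient. Unit constants; the
# chosen potential is purely illustrative.
N = 200
L = 2 * np.pi
x = np.linspace(0.0, L, N, endpoint=False)
h = x[1] - x[0]

phi_m = np.cos(x)                                                      # a given potential

rho_m = (np.roll(phi_m, -1) - 2 * phi_m + np.roll(phi_m, 1)) / h**2    # rho_m = Laplace(phi_m)
f_m = -(np.roll(phi_m, -1) - np.roll(phi_m, 1)) / (2 * h)              # f_m ~ -grad(phi_m)

# For phi_m = cos(x): Laplace(phi_m) = -cos(x) and -grad(phi_m) = sin(x).
print(np.max(np.abs(rho_m + np.cos(x))))    # small discretization error
print(np.max(np.abs(f_m - np.sin(x))))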

In principle $\rho_m<0$ is possible in (1), with repulsion between mass densities of different sign, which would separate large scales into Universes with positive and negative mass, where we happen to live in one with positive mass. It is conceivable that the presence of negative mass density shows up as dark energy. It is conceivable that a very smooth $\Delta\phi_m$ corresponds to dark matter.  

The gravitational force $f_m$ acts on large masses at large distances. The electric Coulomb force $f_c$ acts on small charges at small distances, which requires physics preventing charges of different sign from coming too close, represented by the presence of the Laplacian in Schrödinger's equation. 

Including also a magnetic potential, connected to the electric potential by Maxwell's equations, together with Newton's 2nd Law for mass motion subject to force, gives a model comprising Newton's mechanics, electromagnetics and gravitation, with potentials as primordial quantities from which mass and charge densities and forces are derived. Here Real Quantum Mechanics naturally fits in as a classical 3d continuum mechanics model. 

An important aspect of (1) and (2) is that $\rho_m$ and $\rho_c$ are derived by differentiation, an operation acting locally in space which can be perceived to act instantly in time, thus avoiding the hard-to-explain instant action at distance coming with the standard view of mass and charge densities as primordial. 

The absence of magnetic monopoles corresponding to point charges makes magnetics different from electrics in the formation of electromagnetics.  

 

torsdag 25 april 2024

Temperature as Quality Measure of Energy.

In ideal gas dynamics temperature appears as an intensive variable $T$ connected to internal energy $e$ and density $\rho$ by 

  • $T=\frac{e}{\rho}$                          
with a corresponding pressure law 
  • $p=\gamma e$
where $\gamma$ is a gas constant. Internal energy is viewed as small scale kinetic energy from small scale molecular motion. Internal energy can be transformed into mechanical work in expansion, which without external forcing (or gravitation) is an irreversible process.  
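
A minimal sketch of the two relations, with $\gamma=0.4$ as a typical value when $e$ is internal energy per unit volume and numbers roughly corresponding to air at room conditions (all values illustrative):

def temperature(e, rho):
    return e / rho              # T = e/rho, temperature on an energy scale

def pressure(e, gamma=0.4):
    return gamma * e            # p = gamma*e

e, rho = 2.5e5, 1.2             # internal energy per unit volume [J/m3] and density [kg/m3]
print(temperature(e, rho))      # about 2.1e5 on the energy scale
print(pressure(e))              # about 1.0e5, close to atmospheric pressure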

For a solid body viewed as a vibrating atomic lattice temperature scales with total internal energy as the sum of small scale kinetic energy and potential energy, which can be transferred by radiation and conduction to a body of lower temperature.   

In both cases temperature appears as a quality measure of internal energy as an intensive variable. 

The maximal efficiency of a Carnot heat engine transforming heat energy into work operating between two temperatures $T_{hot}>T_{cold}$ is equal to $1-\frac{T_{cold}}{T_{hot}}$. 
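
A small check of this formula with illustrative temperatures:

def carnot_efficiency(T_hot, T_cold):
    return 1.0 - T_cold / T_hot       # maximal efficiency between two temperatures in Kelvin

print(carnot_efficiency(600.0, 300.0))   # 0.5
print(carnot_efficiency(300.0, 290.0))   # about 0.03: small temperature difference, low quality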

Radiative heat transfer from a hot body of temperature $T_{hot}$ to a cold body of temperature $T_{cold}$ scales with $(T_{hot}^4-T_{cold}^4)$ according to Stefan-Boltzmann-Planck. 

Conductive heat transfer scales with $(T_{hot}-T_{cold})$ according to Fourier.

In both cases the heat transfer from hot to cold can be seen as transformation from high quality energy into low quality energy in an irreversible process in conformity with the 2nd Law of Thermodynamics. 

The Nobel Prize in Physics in 2006 was awarded for the experimental detection of Cosmic Microwave Background CMB radiation with perfect Planck spectrum as an after-glow of a Big Bang, with temperature of 2.725 K and corresponding very low quality energy.  

With radiation scaling with $T^4$, the ratio between 3 K as deep space CMB and 300 K as global temperature comes out as a factor of $10^{-8}$. The contribution to global warming from CMB thus appears to be very small. 
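
The arithmetic behind the factor:

T_cmb, T_earth = 3.0, 300.0
print((T_cmb / T_earth) ** 4)    # 1e-08: radiation scales with T^4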

We see from $e=\rho T$ that low density and low temperature both connect to low energy quality making both wind and solar energy inefficient compared to fossil and nuclear energy.    


Cosmic Microwave Background Radiation vs Cosmic Inflation

Cosmic Microwave Background Radiation CMB is supposed to be an afterglow of a Big Bang, which started with Cosmic Inflation, a theory proposed by theoretical physicist Alan Guth of an extremely rapid expansion from a Universe of the size of a proton to the size of a pea, taking place from $10^{-36}$ to $10^{-32}$ seconds after zero time with an expansion factor of $10^{13}$.  

A common view is that Alan Guth's theory solves all three main problems of cosmology: the horizon problem, the flatness problem and the magnetic monopole problem. 

The Nobel Prize in Physics 2023 was awarded for experimental methods that generate attosecond pulses of light for the study of electron dynamics in matter, with an attosecond = $10^{-18}$ second. 

Visible light has a time scale of $10^{-15}$ seconds, x-rays $10^{-18}$ and $\gamma$-rays $10^{-22}$ seconds, the highest frequency real physics presently known. Frequency is connected to energy through Planck's Law, which allows determining the frequency of $\gamma$-rays by measuring the energy of $\gamma$-radiation.  
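
A small worked example of this connection, assuming the Planck relation $E=h\nu$ and an illustrative 1 MeV $\gamma$-quantum:

h = 6.626e-34               # Planck's constant, J s
E = 1.0e6 * 1.602e-19       # 1 MeV in Joule
nu = E / h
print(nu, 1.0 / nu)         # about 2.4e20 Hz, i.e. a time scale of a few times 1e-21 s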

Cosmic Inflation is described as follows in popular form:

  • According to the theory, for less than a millionth of a trillionth of a trillionth of a second after the universe's birth, an exotic form of matter exerted a counterintuitive force: gravitational repulsion. 
  • Guth says the existence of this material was reasonably likely.
  • Guth says that we don’t necessarily expect to answer those questions next year, but anything that makes small steps towards understanding the answers is thrilling.
If you feel that you need more information to be able to judge if Cosmic Inflation is a hoax, you can consult the following book by Guth:  The Inflationary Universe: The Quest For A New Theory Of Cosmic Origins.

The present inflation in Sweden of 10% appears to be pretty small when compared to cosmic inflation.


onsdag 24 april 2024

How to Measure Temperature

Measuring temperature accurately is a delicate procedure.

This is a comment to the discussion in recent posts of the proclaimed perfect blackbody spectrum of Cosmic Microwave Background CMB radiation with temperature 2.725 K.  

You can measure your body temperature by body contact with a quicksilver thermometer or at distance by an infrared thermometer. Both work on a principle of thermal equilibrium between source and thermometer sensor as a stable state over time. Your body is assigned the temperature recorded by the thermometer. 

Temperature can be seen as a measure of energy in the form of heat energy or vibrational energy of a vibrating system like an atomic lattice as the generator of radiation as radiative heat transfer.

Computational Blackbody Radiation offers a new analysis of radiative heat transfer using classical wave mechanics as a deterministic form of Planck's analysis based on statistics of quanta. The basic element of the analysis is a radiation spectrum from a vibrating atomic lattice: 

  • $E(\nu ,T)=\gamma T\nu^2$ for $\nu \le \frac{T}{h}$        (1a)
  • $E(\nu ,T)= 0$ for $\nu >\frac{T}{h}$                               (1b)
where $\nu$ is frequency on an absolute time scale, $T$ is temperature on a lattice specific energy scale, $\gamma$ and $h$ are lattice specific parameters and $\frac{T}{h}$ is a corresponding high-frequency cut-off setting an upper limit to the frequencies being radiated. Here a common temperature $T$ for all frequencies expresses thermal equilibrium between frequencies. 

It is natural to define a blackbody BB to have radiation spectrum of the form (1) with maximal $\gamma$ and high-frequency cut-off and to use this as a universal thermometer measuring the temperature of different bodies by thermal equilibrium. 

Consider then a vibrating atomic lattice A with spectrum according to (1), with different parameters $\bar\gamma <\gamma$ and $\bar h >h$ and a different temperature scale $\bar T$, to be in equilibrium with the universal thermometer. The radiation law (1) then implies, assuming that A is perfectly reflecting for frequencies above its own cut-off:
  • $\bar\gamma \bar T = \gamma T$                                         (2)
to serve as the connection between the temperature scales of BB and A. This gives (1) a form of universality with a universal $\gamma$ reflecting the use of a BB as a universal thermometer.
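
A minimal sketch of the spectrum (1a)-(1b) together with the calibration (2); all parameters are illustrative and unit-free, and the function name is chosen for illustration only:

def E(nu, T, gamma, h):
    return gamma * T * nu**2 if nu <= T / h else 0.0    # (1a)-(1b)

gamma_BB, h_BB = 1.0, 1.0            # reference blackbody BB: maximal gamma
gamma_A, h_A = 0.5, 2.0              # a lattice A with smaller gamma-bar and larger h-bar

T_BB = 2.0                           # blackbody temperature on its own scale
T_A = gamma_BB * T_BB / gamma_A      # temperature of A assigned from (2): gamma_A*T_A = gamma_BB*T_BB

for nu in [0.5, 1.0, 1.5, 2.5]:
    print(nu, E(nu, T_BB, gamma_BB, h_BB), E(nu, T_A, gamma_A, h_A))   # equal below the common cut-off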

In reality the abrupt cut-off at the radiation maximum is replaced by a gradual decrease to zero over some frequency range as a case-specific post-max part of the spectrum. A further case-specific element is non-perfect reflectivity above cut-off. Thermal equilibrium according to (2) is thus an ideal case.  

In particular, different bodies at the same distance to the Sun can take on different temperatures in thermal equilibrium with the Sun. Here the high-frequency part of the spectrum comes in as well as the route from non-equilibrium to equilibrium. 

Why CMB can have a perfect blackbody spectrum is hidden in the intricacies of the sensing. It may well reflect man-made universality. 

måndag 22 april 2024

Man-Made Universality of Blackbody Radiation 2

Man-made Universality of Shape

This is a clarification of the previous post on the perfect Planck blackbody spectrum of the Cosmic Microwave Background Radiation CMB as a 14 Billion years afterglow of Big Bang as the leading narrative of cosmology physics today. See also this recent post and this older illuminating post.

The Planck spectrum as the spectrum of an ideal blackbody, takes the form 
  • $E(\nu ,T) =\gamma T\nu^2\times C(\nu ,T)$                                         (1)
where $E (\nu ,T)$ is radiation intensity depending on frequency $\nu $ and temperature $T$, $\gamma$ a universal constant, and $C(\nu ,T)$ is a universal high frequency cut-off function of the specific form 
  • $C(\nu ,T)=\frac{x}{\exp(x)-1}$ with $x = \frac{\nu}{T}\times\alpha$       (2)
where $\alpha =\frac{h}{k}$ with $h$ Planck's constant and $k$ Boltzmann's constant as another universal constant, with the property that 
  • $C(\nu ,T)\approx 1$ for $x<<1$ and $C(\nu ,T)\approx 0$ for $x>>1$.  
We see that radiation intensity, proportional to $T$, increases quadratically with $\nu$ in accordance with deterministic wave mechanics, and reaches a maximum shortly before a cut-off scaling with $T$ in accordance with statistics of energy quanta, which kicked off the idea of atom physics as quantum mechanics also based on statistics.    
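
A minimal sketch of (1)-(2), with $\gamma$ and $\alpha$ set to 1 for illustration, showing the behaviour of the cut-off factor:

import math

def C(nu, T, alpha=1.0):
    x = alpha * nu / T
    return x / math.expm1(x)                  # cut-off factor x/(exp(x)-1)

def E(nu, T, gamma=1.0, alpha=1.0):
    return gamma * T * nu**2 * C(nu, T, alpha)

T = 1.0
for nu in [0.01, 0.1, 1.0, 3.0, 10.0, 30.0]:
    print(nu, round(C(nu, T), 6), E(nu, T))   # C close to 1 for x << 1 and close to 0 for x >> 1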

Computational Blackbody Radiation offers a different version of high frequency cut-off motivated by finite precision physics/computation instead of statistics of quanta opening to a deterministic form of atom physics as real quantum mechanics. The underlying physics model in both cases is that of an atomic lattice capable of generating a continuous spectrum of vibrational frequencies.

The basic assumptions behind a Planck spectrum as an ideal are:
  1. Model: Atomic lattice.
  2. Equilibrium: All frequencies take on the same temperature.
  3. High-frequency universal cut-off: Statistics of energy quanta.  
Observations show that most real blackbody spectra deviate substantially from the Planck spectrum and so have their own signature reflecting a specific atomic lattice, non-equilibrium and a specific high frequency cut-off lower than the ideal. Graphite is just about the only substance showing a Planck spectrum. 

This was not welcomed by physicists in search of universality, and so the idea was born of determining the spectrum of a given material/body by putting it inside an empty box with graphite walls and measuring the resulting radiation peeping out from a little hole in the box, which not surprisingly turned out to be a graphite Planck blackbody spectrum. 

Universality of radiation was then established in the same way as universality of shape can be attained by cutting everything into cubical shape, as was done by the brave men cutting paving stones out of the granite rocks of the West Coast of Sweden, which is nothing but man-made universality.  

The line spectrum of a gas is even further away from a blackbody spectrum. The idea of CMB as an afterglow of a young Universe gas cloud with a perfect Planck blackbody spectrum, as measured by the FIRAS instrument on the COBE satellite, serves as a cornerstone of current Big Bang + Inflation cosmology. 

It is not far-fetched to suspect that also the COBE spectrum is man-made, and then also Big Bang + Inflation.

lördag 20 april 2024

Can Cosmic Microwave Background Radiation be Measured, Really?

The Cosmic Microwave Background radiation CMB is supposed to be a 14 billion year after-glow, with perfect Planck blackbody spectrum at temperature $T=2.725$ Kelvin (K), of a Universe at $T=3000$ K dating back to 380,000 years after Big Bang. The apparent 1000-fold temperature drop from 3000 to 3 K is supposed to be the result of an expansion and not cooling.  

To get an idea of the magnitude of CMB let us recall that a Planck spectrum at temperature $T$ stretches over frequencies $\nu\sim T$ and  reaches maximum radiation intensity $E\sim T^3$ near the end with a high frequency cut-off over an interval $\frac{\nu}{T}\sim 1$ (notice exponential scale):



 

The $10^3$-fold temperature drop thus corresponds to a $10^9$ decrease of maximum intensity and $10^3$ decrease in spectrum width. Intensity over width decreases with a factor $10^6$ as a measure of precision in peak frequency. 
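
The scaling arithmetic in compact form:

drop = 1000.0
print(drop**3)          # 1e9 decrease of maximal intensity (E ~ T^3 at the peak)
print(drop)             # 1e3 decrease of spectrum width (nu ~ T)
print(drop**3 / drop)   # 1e6 decrease of intensity over width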

We understand that to draw conclusions concerning a 3000 K spectrum from a measured 3 K spectrum requires very high precision, on the level of microKelvin or 0.000001 K. Is this really possible? Is it possible to reach the precision of 2.725 K from the intensity maximum? 

Why is modern physics focussed on measuring quantities which cannot be measured, like ghosts?

CMB was first detected as noise, maybe from birds visiting antennas, but the noise persisted even after the antennas were cleaned, and so the conclusion was drawn that CMB must be a left-over from Big Bang 14 billion years ago and not from any birds of today. Big Bang is physics, while birds are ornithology. 

fredag 19 april 2024

The Ultra-Violet Catastrophe vs 2nd Law of Thermodynamics


Classical physics peaked in the late 19th century with Maxwell's equations aiming to describe all of electromagnetics as a form of continuum wave mechanics, but crumbled when confronted with the Ultra-Violet Catastrophe UVC of heat radiation from a body of temperature $T$ scaling like $T\nu^2$ with frequency $\nu$, threatening to turn everything into flames without an upper bound for frequencies, because wave mechanics did not seem to offer any escape from UVC.  

Planck took on the role of saving physics from the looming catastrophe, but could not find a resolution within deterministic wave mechanics and so finally gave up and resorted to statistical mechanics with high frequencies less likely, in the spirit of Boltzmann's thermodynamics and 2nd Law with order less likely than disorder. 

There is thus a close connection between UVC and 2nd Law. Boltzmann would say that the reason we do not experience UVC is that high frequencies are not likely, but the physics of why is missing. Explaining that UVC is not likely would not explain why there is no observation of UVC whatsoever. 

I have followed a different route replacing statistics by finite precision physics for UVC (and similarly for 2nd Law), where high frequencies with short wave length cannot be radiated because finite precision sets a limit on the frequencies an atomic lattice can carry as coordinated synchronised motion. In this setting UVC can never occur.

A basic mission for a 2nd Law is thus to prevent UVC. This gives the 2nd Law a deeper meaning as a necessary mechanism preventing too fine structures/high frequencies from appearing and causing havoc. The 2nd Law is thus not a failure to maintain order over time, but a necessary mechanism to avoid catastrophe from too much order. 

Similarly, viscosity and friction appear as necessary mechanisms destroying fine structure/order in order to let the World continue, and not only as defects of an ideal physics without viscosity and friction. This is the role of turbulence as described in Computational Turbulent Incompressible Flow and Computational Thermodynamics.

We can compare with the role of interest rate in an economy, with the zero interest rate of an ideal economy leading to catastrophe over time. If there is no cost of getting access to capital, any crazy mega project can get funding and catastrophe will follow. This was the idea 2008-2023 preceding the collapse predicted for 2025. Too little friction makes the wheels turn too fast. Too much idealism leads to ruin.

torsdag 18 april 2024

The Secret of Radiative Heat Transfer vs CMB and Big Bang

A main challenge to physicists at the turn to modernity 1900 was to explain radiative heat transfer as the process of emission, transfer and absorption of heat energy by electromagnetic waves described by Maxwell's equations. The challenge was to explain why real physics avoids an ultra-violet catastrophe with radiation intensity going to infinity with increasing frequency beyond the visible spectrum. 

More precisely, the challenge was to uncover the physics of a blackbody spectrum with radiation intensity scaling with $T\nu^2$ with $T$ temperature and frequency $\nu\le\nu_{max}$ with $\nu_{max}$ a cut-off frequency scaling with $T$, and intensity quickly falling to zero above cut-off. 

Planck, as a leading physicist of the German Empire, took on the challenge and after much struggle came up with an explanation based on statistics of energy quanta taking the above form as Planck's Law, which has served into our time as a cover-up of a failure to explain a basic phenomenon in physical terms. 

Computational Blackbody Radiation offers an explanation in terms of finite precision physics setting a cut-off (scaling with temperature) on the frequency of emission from coordinated oscillations of an atomic lattice, with uncoordinated atomic motion stored as heat energy.

In this analysis heat is transferred from a body of higher temperature to a body of lower temperature through a resonance phenomenon analogous to the resonance between two tuning forks. The essence can be described in terms of a forced, acoustically weakly damped harmonic oscillator:

  • $\dot v(t)+\nu^2u(t)+\gamma v(t)=f(t)=\sin(\bar\nu t)$ for $t>0$                    (1)
where $u(t)$ is displacement at time $t$, $v(t)=\dot u(t)$ is velocity, the dot represents derivative with respect to time $t$, $\nu$ is the frequency of the harmonic oscillator and $\bar\nu\approx\nu$ that of the forcing. For radiation the damping term takes the form $\gamma\ddot v(t)$. 

Mathematical analysis shows, assuming small damping with $\gamma \ll 1$, near resonance with $\nu\approx\bar\nu$, and integration over a period:
  • $Output = \gamma \int v^2(t)dt \approx \int f^2(t)dt = Input$         (2)
  • Velocity $v(t)$ out-of-phase with $f(t)$.                                                                (3)
Even if it looks innocent, (2) represents the essence of Planck's Law, with (3) expressing basic physics: out-of-phase means that the interaction between forcing and oscillator corresponds to a "pumping motion" with the forcing balanced mainly by the harmonic oscillator itself and not the damping. In the acoustic case $T=\int v^2(t)dt$ and thus $Output =\gamma T$, which in the case of radiation takes the form $Output = \gamma T\nu^2$ or Planck's Law. 
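
Before summing up, here is a minimal time-stepping sketch of (1) with illustrative parameters; it checks the energy balance between forcing input and damped output over a long window together with the out-of-phase property (3), while the sharper identity (2) belongs to the near-resonance analysis referred to above and is not claimed to follow from this single run:

import math

nu, nubar, gamma = 10.0, 10.5, 0.1      # oscillator frequency, forcing frequency, small damping
dt, t_end = 1.0e-3, 500.0
u = v = 0.0
work_in = output = vv = ff = 0.0
t = 0.0
while t < t_end:
    f = math.sin(nubar * t)
    v += dt * (-nu**2 * u - gamma * v + f)   # semi-implicit Euler step for dv/dt
    u += dt * v                              # and for du/dt = v
    work_in += dt * f * v                    # Input: work done by the forcing
    output  += dt * gamma * v**2             # Output: energy removed by the damping term
    vv += dt * v**2
    ff += dt * f**2
    t += dt

print("input work    :", work_in)
print("damped output :", output)             # nearly balances the input work
print("f-v correlation (near 0 = out of phase):", work_in / math.sqrt(vv * ff))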

Sum up:
  • Radiative balance between two bodies of equal temperature is expressed by (2).
  • Heating of a body B1 of lower temperature by a body B2 of higher temperature comes from frequencies above the cut-off for B1.  
  • High frequency cut-off effect of finite precision physics and not statistics.
  • Blackbody spectrum is continuous (all frequencies) and requires atomic lattice. 
  • A gas has a line spectrum with selected frequencies, which is not a blackbody spectrum.
  • Cosmic Microwave Background radiation as a perfect blackbody spectrum of an after-glow of Big Bang without atomic lattice appears as very speculative, with Big Bang itself as even more speculative beyond experimental confirmation.  

    tisdag 16 april 2024

    Does a Photon have Temperature?

    The idea about the Cosmic Microwave Background CMB radiation is conveyed to the public by authoritative sources as follows starting at the creation of the Universe with a Big Bang:

    • After about 380,000 years when the Universe had cooled to around 3000 Kelvin,  photons were able to move unhindered through the Universe: it became transparent.
    • Over the intervening 14 billion years, the Universe has expanded and cooled greatly. Due to the expansion of space, the wavelengths of the photons have grown (they have been ‘redshifted’) to roughly 1 millimetre and thus their effective temperature has decreased to just 2.7 Kelvin. 
    • These photons fill the Universe today (there are roughly 400 in every cubic centimetre of space) and create a background glow that can be detected by far-infrared and radio telescopes.
    We meet the idea that photons are moving through space like some form of particles with effective temperature of 2.7 K filling the Universe as an after-glow of Big Bang. 

    But the concept of photon lacks real physics. Light does not consist of a stream of light particles named photons, but is an electromagnetic wave phenomenon and as such can have a frequency and an amplitude/intensity. An emitter of light like the Sun has a temperature, while the light emitted is characterised by its spectrum as intensity vs frequency. A spectrum can give information about the temperature of the emitter, with the Planck spectrum the spectrum of an ideal blackbody at a certain temperature, with in particular a high-frequency cut-off scaling linearly with temperature. 

    Emitted light can be received by an antenna through resonance recording the frequency. It is also possible to record the temperature of an emitter by connecting the antenna to a form of radiation thermometer reading temperature from radiative equilibrium, in the same way as a common thermometer reads the temperature of a source by direct contact/equilibrium.  

    But it is more difficult to read a spectrum, since properties of emissivity, transmissivity and absorptivity as well as view angles enter. In the absence of information a Planck spectrum is often assumed, but most emitters do not have blackbody spectra.

    A Big Bang emitter at 3000 K is thus postulated with an after-glow received as a blackbody spectrum of 3 K with frequency reduced and wave length increased by a factor of 1000 into far-infrared. 

    What is effectively measured is a combination of temperature and intensity, which shows up as a perfect blackbody spectrum. The message is that this is an after-glow of Big Bang, thus giving evidence to Big Bang: if there is an after-glow there must have been some glow to start with = Big Bang. More precisely, what is measured, letting the antenna sweep the sky, are variations which then have to be given a physical meaning as some variability of the Early Universe. 

    The basic idea is thus that photons have been traveling through empty space for 14 billion years under a stretching of a factor 1000 but no other influence, and that collecting these photons gives a picture of the Early Universe. This appears as a lofty speculation cleverly designed to prevent inspection, because both theory and instrumentation are hidden in mist. Here is the main picture from The Music of the Big Bang by Amedeo Balbi: 


    The source has thus been gone for 14 billion years, while the after-glow still surrounds us and can be measured. This is mind boggling. 

    Let us compare with the picture presented as Computational Blackbody Radiation, where emitter and receiver establish contact by resonance of electromagnetic waves and so take on the same temperature by reaching radiative equilibrium, in the same way as two distant tuning forks can find an equilibrium.

    What about the time delay between emitter and receiver from finite speed of light? If a light source is switched on, it will take some time before it reaches a receiver. Is it the same when a light source is switched off? Do you feel being warmed even a while after the fire is dead? What about a solar eclipse? Does it take 8 minutes before we feel the cold? 

    In any case, the connection between Big Bang, gone since 14 billion years, and a proclaimed after-glow, which we can enjoy today from the presence of about 400 photons in every cubic centimetre of space at 3 K, appears as science fiction to me at least. 

    Radiation as electromagnetic waves needs a source to sustain over time. If the Big Bang source of CMB disappeared 14 billion years ago, the electromagnetic waves would, so to speak, have a life of their own over a very long time, like a tsunami wave sweeping the Pacific long after the earthquake source has disappeared. Here the ocean acts as a physical medium carrying the energy, while a corresponding medium for electromagnetic waves as an aether has no physical presence. The energy is thus carried by the source, some of which is transmitted to the receiver in resonance. 


    måndag 15 april 2024

    Modern Physics vs Homeopathy

    Modern physics appears as a form of homeopathy in reverse. A main idea of homeopathy is to obtain a major health effect from a very very diluted form of some substance, the smaller the better. The characteristic of modern physics, as rather the opposite, is identification of a very small effect from a very very large cause as in the following key examples  with increasing presence in later years (with year of Nobel Prize in Physics):

    • Theoretical and experimental discovery of very small deviation from Newton's mechanics in Einstein's mechanics. (no Prize)
    • Theoretical discovery of the Pauli Exclusion Principle impossible to verify experimentally. (1945)
    • Theoretical discovery of statistical interpretation of wave function impossible to verify experimentally. (1954)
    • Experimental discovery of Microwave Background. (1978)
    • Experimental discovery of very small fluctuation in temperature of Cosmic Microwave Background Radiation from Big Bang. (2006)
    • Theoretical discovery of broken symmetry predicting quarks impossible to verify experimentally. (2008)
    • Experimental discovery of accelerating expansion of the Universe from very weak data. (2011)
    • Experimental discovery of Higgs particle as origin of mass from very weak data. (2013)
    • Experimental discovery of very weak gravitational waves from colliding black holes. (2017)
    • Theoretical discovery that black hole is a prediction of general relativity. (2020)
    • Theoretical discovery that global warming is a prediction of very little atmospheric CO2. (2021)
    • Theoretical discovery of string theory on very very small scales impossible to verify experimentally. (no Prize)
    It may seem that all notable effects have already been discovered and so only very very small ones remain to be discovered. The difficulty of connecting a very small effect to a very large cause (or vice versa) is that a very precise theory is needed in order to build a very precise instrument for experimental verification. Without theory, experiment has no objective. Finding a needle in a haystack may be simpler. In addition to experimental discoveries of some vanishingly small effect, we also see Prizes to discoveries of theories beyond experimental verification because effects are so small.  

    When the Large Hadron Collider turns out to be too small to find anything new of significance and public money for an even larger Super Collider cannot be harvested, physicists turn to using the whole Universe as a test bench to find ever smaller effects. There are many things yet to be discovered on scales allowing detection, but this draws little interest from leading physicists focussed on what is infinitely large or infinitesimally small.  

    We may compare with the evaluation by Napoleon of the work in his administration of the mathematician Laplace as expert on Infinitesimal Calculus: 
    • He wanted to bring the spirit of infinitesimals into administration.



    söndag 14 april 2024

    Cosmic Microwave Background Radiation vs Big Bang?

    This is a continuation of a previous post on the same topic. The European Space Agency ESA sends this message to the people of Europe and the World:

    • The Cosmic Microwave Background (CMB) is the cooled remnant of the first light that could ever travel freely throughout the Universe.
    • Scientists consider it as an echo or 'shockwave' of the Big Bang. Over time, this primeval light has cooled and weakened considerably; nowadays we detect it in the microwave domain.
    More precisely, CMB is reported to be measured by the Far Infrared Absolute Spectrophotometer (FIRAS) instrument on the COBE satellite as a very small temperature variation (18 $\mu K$) over a uniform background of a perfect blackbody spectrum at 2.725 $K$. The main difficulty is to isolate a very weak signal from very far away from more nearby signals, including signals from the Earth atmosphere and oceans.  

    To understand the technology of the measurement, which is not easy, we take a look at the FIRAS instrument to see what it contains:


     What we see is in particular the following:
    • Sky Horn collecting input from the Sky.
    • Xcal reference blackbody used for calibration of Sky Horn input.
    • Ical reference blackbody for internal calibration.
    • Ical is equipped with two germanium resistance thermometers (GRT).
    • Xcal is monitored by three GRTs.
    • FIRAS = Far Infrared Absolute Spectrophotometer.
    The output of FIRAS consists of:
    • A very small temperature variation of size 0.00002 K over a background of 2.725 K.
    • The measured background spectrum is a perfect Planck blackbody spectrum. 
    The CMB spectrum is thus reported as a perfect Planck blackbody spectrum. But low frequencies in the far infrared part of the spectrum are missing! 

    We see warning signs: 
    • Very high precision is reported!
    • Perfect Planck blackbody spectrum is reported. But far infrared is missing. 
    • Calibration to nearly perfect real blackbodies is made. 
    • Temperature of 3 K from very far reported.  
    • Spectrum as radiative flux is reported (spectrophotometer).
    More understanding comes from plotting the spectrum in terms of frequency:


    We here see that COBE-FIRAS (blue) measures intensity at a maximum around 200 GHz and a bit beyond for higher frequencies in the cut-off region, while the more essential part of the spectrum in the far infrared is missing. The intensity maximum around 200 GHz according to Planck's law corresponds to a temperature of about 3 K, which however, since the essential part of the spectrum is missing, may as well correspond to a much higher temperature at much lower emissivity.
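
    A small check of this correspondence, assuming the standard Wien displacement relation in frequency form (constants rounded):

h = 6.626e-34      # Planck's constant, J s
k = 1.381e-23      # Boltzmann's constant, J/K
a = 2.821          # Wien displacement constant in frequency form: nu_peak = a*k*T/h

def T_from_peak(nu_peak):
    return h * nu_peak / (a * k)

print(T_from_peak(200e9))   # a maximum near 200 GHz corresponds to about 3.4 K
print(T_from_peak(160e9))   # about 2.7 K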

    In previous posts we have recalled that measurement of temperature is possible by establishing radiative equilibrium between source and instrument/thermometer, but it requires disturbances between source and instrument to be small, which poses a challenge to directly measuring the temperature of CMB from very far. 

    The alternative in this case is to report temperature from spectrum. But directly measuring radiative flux/spectrum can be even more challenging. Typically this is done (using bolometers and pyrometers) by measuring temperature, and then computing radiative flux/spectrum using Planck's law under assumptions hard to verify. This makes assessing CMB a very daunting task from a mix of measurement and computation of temperature and radiative flux.

    The scenario is thus:
    • If a correct full spectrum is measured, a temperature can be determined from the frequency of maximal intensity. 
    • If only temperature is given, determining spectrum as radiative flux intensity, requires post processing. 
    • A measured/computed temperature of 3 K attributed to a very far away source may be misleading.
    • Robitaille suggests that the true origin of the 3 K CMB is the oceans of the Earth at 300 K.  
    To sum up, we have on the table: 
    1. Very speculative Big Bang BB.
    2. CMB with questionable credibility, maybe noise from the Earth's oceans.  
    The argument by mainstream physicists/cosmologists is now that since the main role of CMB is to serve as main evidence of Big Bang, and CMB serves this role in such an excellent way, CMB gains credibility by being connected to something very big. BB thus supports CMB, which gives support to BB. 

    One possibility is then that both BB and CMB are real phenomena. The other possibility is that both are free speculations by scientists in search of a mission. What is your impression? 

    PS Has COBE-FIRAS detected the same thing as WMAP and PLANCK further away from the Earth:


    Which picture is most credible? The more details, the more credible? What happens with small details over time according to the 2nd Law? 



    fredag 12 april 2024

    Computing with Real Numbers

    This is a continuation of the previous post How to avoid collapse of modern mathematics.

    Let me see if the constructive computational approach to mathematics adopted in the BodySoul program can meet the criticism expressed by Norman Wildberger as concerns the foundations of the large areas of mathematics relying on the concept of real number.  In particular Wildberger asks about the elementary process of adding two real numbers such as $\sqrt{2}$ and $\pi$: 

    • $\sqrt{2}+ \pi = ?$ 
    Let us then use the least cryptic definition of a real number as an infinite decimal expansion. But asking for the infinite decimal expansion of $\sqrt{2}$ is asking too much, and so we have to limit the specification to a finite number of decimals, and the same with $\pi$. We can then add these numbers using well specified rules for computing with rational numbers, and so arrive at a finite decimal expansion as an approximation of $\sqrt{2}+ \pi$. We can choose the number of decimals to meet a given precision requirement. Fair enough. 

    But how do we know the decimal expansions of $\sqrt{2}$ and $\pi$? Before the computer they would have to be picked up from a printed precomputed mathematical table, but only up to finitely many decimals, and the table would swell beyond all limits by asking for more and more decimals. Today with the computer, you can press a button and let $\sqrt{2}$ be computed from scratch using Newton's method, but even if this algorithm is very efficient, the required work/time would increase beyond limit by asking for more and more decimals. 

    The computer would compute the sum $\sqrt{2}+ \pi$ in an iterative computational process involving:
    1. Compute $\sqrt{2}$ with say $5$ decimals.
    2. Compute $\pi$ with say $5$ decimals.
    3. Add these decimal expansions using the addition algorithm for finite decimal expansions.
    4. Check if a desired precision is met, and if not go back to 1. and increase the number of decimals.  
    This would reduce the foundation of mathematics to computational processes, and this is the approach of BodySoul: All mathematical objects are constructed by specified finitary computational processes as finite precision solutions to specified equations. 
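
    As a concrete illustration of steps 1-4, here is a minimal sketch using Python's standard decimal module, with the pi series taken from the recipes in the Python documentation; the function names and the stopping criterion are chosen for illustration only:

from decimal import Decimal, getcontext

def pi_decimal():
    # Compute pi at the current Decimal precision (series recipe from the Python decimal docs).
    getcontext().prec += 2
    three = Decimal(3)
    lasts, t, s, n, na, d, da = 0, three, 3, 1, 0, 0, 24
    while s != lasts:
        lasts = s
        n, na = n + na, na + 8
        d, da = d + da, da + 32
        t = (t * n) / d
        s += t
    getcontext().prec -= 2
    return +s

def sqrt2_plus_pi(decimals):
    # Steps 1-4: compute sqrt(2) and pi to a working precision, add, and increase the
    # precision until two successive results agree to the requested number of decimals.
    prev, prec = None, decimals + 5
    while True:
        getcontext().prec = prec
        value = Decimal(2).sqrt() + pi_decimal()
        if prev is not None and abs(value - prev) < Decimal(10) ** (-decimals):
            return +value
        prev, prec = value, prec + 5

print(sqrt2_plus_pi(5))     # about 4.5558
print(sqrt2_plus_pi(30))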

    For example, the value of the exponential function $\exp(t)$ for any value $t>0$ is computed by solving the differential equation $x^\prime (s)=x(s)$ for $0<s\le t$ with $x(0)=1$ by time stepping, where $x^\prime $ is the derivative of $x$, and setting $\exp(t)=x(t)$. No values of $\exp(t)$ are stored. New computation from scratch for each value of $t$. This is the only way to avoid storing real numbers as infinite decimal expansions, which is impossible in a finite Universe. 
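
    A corresponding sketch of the time-stepping computation of $\exp(t)$ (forward Euler, with the number of steps as the precision parameter):

import math

def my_exp(t, n=1000000):
    ds = t / n
    x = 1.0
    for _ in range(n):
        x += ds * x          # x(s + ds) = x(s) + ds*x(s)
    return x

print(my_exp(1.0), math.exp(1.0))   # time-stepped value vs library value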

    Is Wildberger happy with such a response to his criticism? And what about you?

    In any case, pure mathematicians will not welcome a foundation based on non-pure computational mathematics, even if it would solve unresolved foundational questions concerning real numbers and elementary functions of real numbers as solutions to differential equations. 

    There was a tough fight at the turn to modernity in the beginning of the 20th century concerning the foundations of mathematics between logicism (Russell), formalism (Hilbert) and constructivism/intuitionism (Brouwer), which was won by Hilbert in the 1930s thus setting the scene for 21st century mathematics. But with the computer, constructivism is now taking over by offering a concrete foundation without lofty speculation about infinities.  

    A formalist can introduce $\mathcal{R}$ as the set of equivalence classes of all Cauchy sequences of rational numbers, thus as a set defined by a certain property. Russell showed the danger of defining sets this lofty way by his famous example of a set defined by the property of not containing itself, leading to a contradiction. Gödel turned Russell's example into more precise form, which should have killed both the logicist and formalist schools, but did not, since the reaction was to kick out constructivists compatible with Gödel from mathematics departments to form separate departments of computer science. Mathematics departments/education are still controlled by formalists, which means that Wildberger's criticism is not welcome.    

    torsdag 11 april 2024

    Is Cosmic Microwave Background Radiation Measurable?

    Temperature fluctuations of CMB measured by COBE satellite. 


    Pierre-Marie Robitaille, who led the development of the 8 Tesla Ultra High Field human MRI (Magnetic Resonance Imaging) scanner, used his deep knowledge of electromagnetic resonance to question the measurement of the Cosmic Microwave Background Radiation (CMB) by NASA's COBE satellite, awarded the Nobel Prize in 2006. 

    This was not well received by the physics community and Robitaille was effectively cancelled academically (as far as I understand), but his very informative youtube channel Sky Scholar (with 50k subscribers and 145 videos) has survived. Take a look and compare with previous post on Man Made Universality of Black Body Radiation.

    CMB is supposed to be the "cooled remnant of the first light that could ever travel freely throughout the Universe" at the very low temperature of 2.726 K above ultimate 0 K. Very cold indeed. More precisely, it is claimed that measured CMB spectrum is very close to a blackbody spectrum at 2.726 K. 

    In previous posts I have posed the question whether the spectrometer involved in measuring CMB is effectively measuring temperature or radiative flux, with the answer that temperature can be measured at distance by establishing radiative equilibrium, in the same way a thermometer in contact measures temperature and not heat flux. This is supported by the fact that it is a measured temperature of 2.726 K which is the main characteristic of the postulated CMB, not its unknown radiative heat emission as a (small) possible contribution to global warming. Recalling previous posts and Robitaille, we know that the blackbody spectrum is a fiction met only by graphite, and so one may ask why CMB should behave the same. 

    In the view presented on Computational Blackbody Radiation the temperature measurement by NASA's COBE satellite as main evidence of the existence of CMB, is based on resonance between apparatus and cosmic background, which has to be singled out from all other resonances. Robitaille here presents the Oceans of the Earth as a possible source overwhelming CMB, thus questioning the existence of CMB.  

    When your brain registers a sound arising from resonance between a sound source and eardrum, the direction to the source can be decided because you have two ears, but the distance to the source and so the origin of the sound is more difficult to determine in the presence of other possibly stronger sources.  Robitaille questions the possibility to single out CMB from the radiation from the Oceans.  Do you?

    How to Avoid Collapse of Modern Mathematics

    Pythagoras struggling in vain to avoid collapse surrounded by a worried Society.

    This is a continuation of a previous post about Norman Wildberger's mathematics education program Insights into Mathematics, noting connections to the Leibniz World of Mathematics and the BodySoul program. 

    A common concern is the concept of real number and the set of real numbers $\mathcal{R}$ as the playground for most of modern mathematics. Wildberger takes a critical look on how these concepts are introduced in standard texts noting that basic difficulties are swept under the rug. View in particular this episode: Real numbers as Cauchy sequences does not work.

    BodySoul takes a constructive approach viewing the natural numbers 1, 2, 3,..., to be constructed by repetition of the operation +1, the integers as solutions to equations $x+n=m$ with $n$ and $m$ natural numbers, the rational numbers as solutions to equations $q*x=p$ denoted $x=\frac{p}{q}$ with $p$ and $q\neq 0$ integers, while the real number $\sqrt{2}$ is defined as the positive solution to the equation $x^2=2$ or  $x*x=2$.

    Recall that the Pythagorean society based on the concepts of natural and rational number, collapsed when it became public that $\sqrt{2}$ is not a rational number. Modern mathematics is based on the concept of  $\mathcal{R}$ as the set of all real numbers. Wildberger concludes that all attempts to bring rigour into the foundations of mathematics as the virtue of modern mathematics including Dedekind cuts, equivalence classes of Cauchy sequences and infinite sequences of decimal expansions, have failed. The trouble with all these attempts is the resort to infinities in different form. What will be the fate of the society of modern mathematics when this fact becomes public?

    In the constructive approach of BodySoul there is no need to introduce infinities: in particular it is sufficient to work with rational numbers as finitely periodic decimal expansions, or even more restrictively as finite decimal expansions, which makes perfect sense to anybody. But it requires making the notion of solution of an equation like $x*x=2$ precise, that is, making precise the meaning of the equality sign $=$. 

    We then have to make the distinction between exact equality or more precisely logical identity denoted $\equiv$ and numerical equality denoted by the usual equality $=$ as something different to be defined. We thus have $A\equiv A$ while writing $A=B$ would mean that $B$ is not identical to $A$ but equal in some restricted meaning to be defined. 

    We then understand that $x\equiv\frac{1}{3}$ is the exact solution to the equation $3*x=1$, while $x=0.333333333$ is a solution in a restricted meaning. We meet the same situation as concerns the solution to the equation $x*x=2$, with $x=1.414$ and $x=1.41421356$ as solutions in a restricted sense, or approximate solutions of different quality or accuracy. 

    To measure the quality of a given approximate solution $x$ to the equation $x*x=2$, it is natural to evaluate the residual $res(x)=x*x-2$ and then from the value of $res(x)$ seek to evaluate the quality of $x$. This can be measured by the derivative $f^\prime (x)=2*x$ of the function $f(x)=x*x-2$, noting that a different approximate solution $\bar x$ is connected to $x$ by the mean-value theorem 

    • $res(x)-res(\bar x) = f(x)-f(\bar x) = f^\prime (\hat x)*(x-\bar x)$     

    where $\hat x$ lies between $x$ and $\bar x$. With knowledge that $x>1$ and $\bar x>1$, we can conclude that $f^\prime (\hat x)>2$ and so

    • $\vert x-\bar x\vert<\frac{1}{2}\vert res(x)-res(\bar x)\vert$

    from which the quality of approximate solutions can be measured in terms of the residuals with $\frac{1}{2}$ as sensitivity factor. 

    This analysis generalises to approximate solutions of equations $f(x)=0$ for general functions $f(x)$, with the factor $\frac{1}{f^\prime (x)}$ expressing residual sensitivity. In particular we see that if $f^\prime (x)$ is small the sensitivity is large, asking the residual to be very small to reach precision in $x$. 
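
    A small numerical illustration of this residual-based quality measure for $x*x=2$:

import math

def res(x):
    return x * x - 2.0

# For candidates larger than 1 the bound above gives |x - sqrt(2)| < |res(x)| / 2.
for x in [1.414, 1.41421356]:
    bound = abs(res(x)) / 2.0
    actual = abs(x - math.sqrt(2.0))
    print(x, res(x), bound, actual)   # the actual error stays below the residual bound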

    But this argument is not central in modern mathematics where the notion of exact solution to an equation is viewed as the ideal. The exact/ideal solution to the equation $x*x=2$ would thus be viewed as a non-periodic infinite decimal expansion, which would require an infinite amount work to be determined, thus involving the infinities which Wildberger questions. The equality sign in this setting comes without quality measure in finite terms as an unattainable (Platonic) ideal. 

    In the setting of the algebraic equation $x*x=2$ the notion of an ideal solution may not cause much confusion, but for more general equations such as partial differential equations it has generated a lot of confusion because the quality aspect of approximate solutions is missing. The quality of an ideal solution is infinite beyond measurement but also beyond construction.  

    There is a notion in modern mathematical analysis of partial differential equations named well-posedness, which connects to the sensitivity aspect of approximate solutions, but it has received little attention in quantitative terms.  

    As a remedy, this is the central theme of the books Computational Turbulent Incompressible Flow and Computational Thermodynamics. There is much to say about mathematical equations and laws of physics with finite precision.

    We may compare the Pythagoreans facing the equation $x*x=2$ with a notion of ideal solution, and modern mathematics hitting a wall confronted with the Clay Math Institute Millennium Problem on ideal solutions of the Navier-Stokes equations. 

    An opening in this wall is offered as Euler's Dream come true.

    PS Recall the famous Kronecker quote: "God made the integers, all the rest is the work of man". So the power of an almighty God was not enough to proceed and also make the real numbers. What are the prospects that man can succeed?

     

    torsdag 4 april 2024

    Doubling Down in Modern Physics and Politics

    Doubling Down in Sweden.

    If you feel you are losing a poker game, you may double the bet in the hope of not being called. This is risky but may be the only alternative to losing. Basic examples from modern physics as the fundament of science: 

    • Einstein, confronted with questions about special relativity and its strange clocks and meter sticks from 1905, which he could not answer, countered by presenting in 1916 his general theory of relativity and then applying it to the cosmology of the whole Universe as a theory of maximal dimension, which could not be called. 
    • When the Standard Model of atom physics ran out of steam in the 1970s, string theory was presented as a Theory of Everything on physical scales of size $10^{-35}$ meters, way below every possibility of experimental detection, which could not be called either.      
    Examples from world politics:
    • At the end of WWII when the Ardenne offensive in Jan 1945 was broken, Hitler doubled down: "I know the war is lost. The enemy's superiority is too great. We won't surrender, never. We can go under. But we'll take the world with us". 
    • Nato's proxy war against Ukraine will continue "til the last Ukrainian" towards a WWIII double down, now supported by once neutral Sweden. Peace negotiations are not an option for the West.
    Rationality was lost in the modern physics of the West by doubling down instead of resolving basic issues. The consequences have been far reaching and irrationality has now taken over Western geopolitics, including that of little once rational neutral Sweden. Is there any hope? 

    PS1 Glenn Diesen gives in his latest book The Ukraine War and Eurasian World Order a rational sharp analysis:
    • Five hundred years of Western hegemony has ended, while the global majority’s aspiration for a world order based on multipolarity and sovereign equality is rising. This incisive book addresses the demise of liberal hegemony, though pointing out that a multipolar Westphalian world order has not yet taken shape, leaving the world in a period of interregnum. A legal vacuum has emerged, in which the conflicting sides are competing to define the future order. NATO expansionism was an important component of liberal hegemony as it was intended to cement the collective hegemony of the West as the foundation for a liberal democratic peace. Instead, it dismantled the pan-European security architecture and set Europe on the path to war without the possibility of a course correction. Ukraine as a divided country in a divided Europe has been a crucial pawn in the great power competition between NATO and Russia for the past three decades. The war in Ukraine is a symptom of the collapsing world order. The war revealed the dysfunction of liberal hegemony in terms of both power and legitimacy, and it sparked a proxy war between the West and Russia instead of ensuring peace, the source of its legitimacy. The proxy war, unprecedented sanctions, and efforts to isolate Russia in the wider world contributed to the demise of liberal hegemony as opposed to its revival. Much of the world responded to the war by intensifying their transition to a Eurasian world order that rejects hegemony and liberal universalism. The economic architecture is being reorganised as the world diversifies away from excessive reliance on Western technologies, industries, transportation corridors, banks, payment systems, insurance systems, and currencies. Universalism based on Western values is replaced by civilisational distinctiveness, sovereign inequality is swapped with sovereign equality, socialising inferiors is replaced by negotiations, and the rules-based international order is discarded in favour of international law. A Westphalian world order is reasserting itself, although with Eurasian characteristics. The West’s defeat of Russia would restore the unipolar world order while a Russian victory would cement a multipolar one. The international system is now at its most dangerous as the prospect of compromise is absent, meaning the winner will take all. Both NATO under US direction and Russia are therefore prepared to take great risks and escalate, making nuclear war increasingly likely.
    PS2 Listen to Pelle Neroth Taylor's show from Sweden with an interview with Dimitri Orlov and Edward Lozansky from April 4th: https://tntradio.live/shows/pelle-neroth-taylor/

    onsdag 3 april 2024

    Temperature or Radiative Flux? Pressure or Convective Flux?

    The prime evidence put forward to support global warming alarmism is a measurement of the spectrum of  Outgoing Longwave Radiation OLR from the Earth into cold empty space showing that total OLR is 1% less than total incoming short wave radiation from the Sun with the message that the Earth is heating up. 

    The measurement is done from satellites (AIRS and ISIS) looking down on the atmosphere using instruments in the form of spectrometers supposedly measuring radiative fluxes over a range of infrared frequencies forming the spectrum, see previous posts on OLR.  The instruments use bolometers based on thermopiles sensitive to temperature and compute radiative fluxes using complex software for radiative heat transfer such as Modtran. The instruments thus directly measure temperature and then report radiative flux after postprocessing. To confirm global warming,  better accuracy than 1% is required for total OLR and also of course for total incoming radiation. Is this possible?

    To compute radiative flux from input of temperature using software for radiative heat transfer requires input of coefficients of emissivity, absorptivity and transmissivity, and so has serious issues as concerns accuracy. 

    To see a basic issue, let us compare with a more familiar setting of seeking to compute the fluid flux in a pipe or around an object from reading pressure. We then recall that pressure can be read by a pitot tube:

    where the left open end is inserted into the fluid and so takes on the stagnation pressure or total pressure of the fluid passing by, which is then read by a manometer to the right. We understand that measuring pressure can be done with high precision, but if we now ask about the total convective fluid flux, we will have to supply additional information about the nature of the flow. If the flow is steady, inviscid, incompressible and irrotational, then Bernoulli's Law can be used to compute fluid velocity and so convective flux, but that is a very special case. 
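
    A minimal sketch of this special case, computing velocity from a Pitot reading under the Bernoulli assumptions just listed (illustrative numbers for air):

import math

# p_stagnation = p_static + (1/2)*rho*v^2   =>   v = sqrt(2*(p_stagnation - p_static)/rho)
def speed_from_pitot(p_stagnation, p_static, rho):
    return math.sqrt(2.0 * (p_stagnation - p_static) / rho)

print(speed_from_pitot(101575.0, 101325.0, 1.2))   # about 20 m/s from a 250 Pa difference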

    We thus see that reading temperature by a thermopile or pressure by a Pitot tube can be done directly by an instrument for which the physics is transparent. 

    On the other hand, to report radiative/convective flux from reading of temperature/pressure is a complicated issue requiring detailed additional information, which may not be available. 

    Direct measurement of total flux by some form of capturing technique like that in an anemometer, also is subject to very big uncertainties.  

    Supporting global warming on an assessment of OLR measured to less than 1% precision lacks credibility. 

    måndag 1 april 2024

    How to April Fool Yourself

    Today April 1st it is the right day to recall the post from 2011 How to Fool Yourself with a Pyrgeometer with related posts connecting to the recent sequence of posts on temperature vs radiation.

    Thus you should go to a Clas Ohlson Store and buy yourself a pyrgeometer or infrared thermal camera, direct it to the atmosphere, and read on its display that the instrument reports a Downwelling Longwave Radiation from the atmosphere of about 330 Watts per square meter, supposedly hitting everything on the Earth surface, twice as much as the 170 W/m2 coming in as short wave radiation from the Sun.

    Or direct the instrument to the ground and read that the Earth gives off 290 W/m2 as Upwelling Longwave Radiation, almost twice what comes in from the Sun.

    What's going on? Have you been fooled by the instrument, or are you too smart for that, understanding very well how an infrared thermal camera works? What does the instrument in fact directly measure? Temperature or radiation?
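
    A back-of-envelope check of what such numbers correspond to, assuming the instrument relates a sensed effective temperature to a reported flux by the Stefan-Boltzmann law (illustrative values):

sigma = 5.67e-8      # Stefan-Boltzmann constant, W/m2/K4

def flux(T):
    return sigma * T**4

def temperature(reported_flux):
    return (reported_flux / sigma) ** 0.25

print(flux(276.0))          # about 330 W/m2, i.e. an effective sky temperature near 276 K
print(temperature(290.0))   # 290 W/m2 corresponds to about 267 K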

    Once you have figured that out, you can now go ahead to fool your neighbour.  

    You may compare with a potentially equally shocking misreading of a thermometer mixing Celsius with Fahrenheit or even worse with Kelvin.



    söndag 31 mars 2024

    AIRS Atmospheric Infrared Sounder Measuring Temperature

    This is a follow up to the previous post on a debate with Will Happer concerning satellite measurement of Earth atmosphere: What is directly measured at distance: temperature or radiation? My view is that temperature is directly measured and so can be reliable, while radiation is computed using some complex software for radiative heat transfer and so is unreliable. It seems that Happer is not perfectly happy with such a clear statement but does not give a clear alternative. 

    Let us take a look at the most advanced system, which is the AIRS Atmospheric Infrared Sounder monitored by NASA, presented as follows:

    • AIRS is the first instrument ever to produce a three dimensional map of temperature and water vapour in the atmosphere directly measured from satellites


    We understand AIRS directly measures temperature at distance and that this can be very useful information!

    We recall that there are several instruments, like bolometers, pyrgeometers and infrared cameras, which read temperature at distance, typically using a thermopile sensor taking on source temperature at one end by radiative equilibrium at distance (like a thermometer in contact) and instrument reference temperature at the other end, thereby reporting a voltage depending on the temperature difference and thus the source temperature. 

Why is Happer then speaking about measuring radiation? It is because global warming alarmism is based on the idea that the Earth absorbs more energy from the Sun than it emits to outer space as Outgoing Longwave Radiation OLR. Evidence is then presented as measured incoming and outgoing radiation, of size around 340 W/m2, from which a difference of less than 1% (a few W/m2) is obtained and reported as alarming. That requires direct measurement of outgoing radiation to high precision, and so it must be tempting to believe that this is what AIRS offers. But it does not.

    Happer is not a climate alarmist, but he seems to be stuck with the alarmist dream of measuring OLR to a very high precision. Strange.

PS We may compare reading temperature vs radiation with determining a person's bank balance vs determining the power of that person. The bank balance can be read directly, which is not possible for power.

Similarly, all lakeside properties around a lake share the level of the lake, while their value is more difficult to determine.

    lördag 30 mars 2024

    Spooky Action at Distance in Global Warming

This is a follow-up of a discussion with prof Will Happer on Outgoing Longwave Radiation OLR from the Earth into outer space, which determines global warming or cooling, concerning measurement of temperature and radiation by AIRS spectrometers in satellites looking down on the atmosphere, with output (clear sky):


We see a graph of radiative flux as a function of frequency on a background of corresponding blackbody spectra at varying temperatures: 225 K at the top of the troposphere, 255 K in the middle and 288 K at the Earth surface. We see major radiation from H2O at lower frequencies at temperatures around 255 K, from CO2 at 225 K, and from the Earth surface at 288 K through the atmospheric window.

This graph is presented as the essential scientific basis of climate alarmism, with the ditch in the spectrum giving CO2 a substantial role even if H2O and the window play the major role. But the change in the ditch from doubling CO2 from preindustrial level is much smaller, of size about 1% of total incoming radiation from the Sun.

In any case, the measured spectrum of OLR by AIRS serves as key evidence of global warming by human CO2 emissions, but it requires an accuracy of better than 1%.

Is this the case? We recall that the spectrometer of AIRS is based on the bolometer, which is an instrument measuring temperature in some frequency band at distance, from which radiation is computed using Modtran as software to solve Schwarzschild's equation of radiative transfer line by line. This is a complex computation involving coefficients of emissivity and absorptivity which are not precisely known. There are many posts on this topic under Schwarzschild and OLR and bolometer. Results are reported as radiative forcing from increasing CO2, typically of size 1% of total incoming radiation.
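To see what such a line-by-line computation involves, here is a minimal one-frequency sketch of Schwarzschild's equation $\frac{dI}{ds}=\kappa (B-I)$ integrated upward through atmospheric layers (the source function, absorption coefficients and numbers are assumptions for illustration, not Modtran itself):

```python
def olr_one_frequency(I_surface, temps, kappas, planck, nu, ds):
    # Forward-Euler integration of dI/ds = kappa*(B(nu,T) - I) through layers.
    # temps[i], kappas[i]: layer temperature (K) and absorption coefficient (1/m);
    # planck(nu, T): assumed source function; ds: layer thickness (m).
    I = I_surface
    for T, kappa in zip(temps, kappas):
        I += ds * kappa * (planck(nu, T) - I)  # requires kappa*ds << 1
    return I

# Illustrative use with the simplified spectrum B = gamma*T*nu^2 (values not physical):
gamma = 1.0
planck = lambda nu, T: gamma * T * nu**2
temps  = [288.0, 255.0, 225.0]   # surface, mid troposphere, top of troposphere
kappas = [2e-5, 2e-5, 2e-5]      # assumed absorption coefficients
print(olr_one_frequency(planck(1.0, 288.0), temps, kappas, planck, 1.0, 1000.0))
```

The reported radiation thus depends directly on the assumed coefficients, while the measured temperatures enter only through the assumed source function.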

Thus temperature is directly measured, while radiation is the result of a complex computation for which an accuracy of better than 1% is required. You have to be a believer in global warming to believe that this accuracy is met. In other words, the evidence of global warming supposedly presented by the OLR spectrum is not convincing if you have the slightest inclination towards skepticism.

Back to Happer, who claims that it does not matter what is directly measured, since there is a connection between temperature and radiation, and so one may as well view AIRS as measuring radiation. Our discussion came to a halt at this point.

But to me it is clear that a bolometer (or pyrgeometer) is an instrument which directly measures temperature, and if the instrument reports radiation, it is the result of a computation of unknown accuracy, which can in fact be grossly misleading. In other words, reported temperature is reliable while reported radiation is not.

The key observation is that CO2 radiation is measured to have temperature 225 K, which means that it comes from the top of the atmosphere, as the highest level where the presence of CO2 is detected by the AIRS bolometer, with higher levels being transparent.
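As a rough consistency check (assuming a standard lapse rate of about 6.5 K/km and a surface temperature of 288 K), a brightness temperature of 225 K corresponds to an emission height of about $(288-225)/6.5\approx 10$ km, that is near the top of the troposphere.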

    The radiative forcing of 1% is thus based on a computation for which the accuracy is not known to be less than 1%. Your conclusion? 

The key question is then: what can be measured at distance, temperature or radiation? There are several instruments that can directly measure temperature at distance, such as infrared cameras, bolometers and pyrgeometers, all based on radiative equilibrium at distance as rationalised in Computational Blackbody Radiation. This is an analog of measuring temperature by a thermometer in contact.

But there are no instruments directly measuring radiation by some kind of photon capturing technique. Believing that this is possible represents belief in some form of spooky action at distance. And you? And Happer?

PS In a letter to Max Born in 1947, Einstein said of the statistical approach to quantum mechanics, which he attributed to Born: I cannot seriously believe in it because the theory cannot be reconciled with the idea that physics should represent a reality in time and space, free from spooky action at a distance. This is a different setting than the one considered here: reading temperature at distance is not spooky, while reading radiation is spooky action at distance.


    tisdag 26 mars 2024

    Man-Made Universality of Blackbody Radiation

Pierre-Marie Robitaille is one of the few physicists still concerned with the physics of blackbody radiation, supposed to be the first expression of modern physics as presented by Max Planck in 1900, as expressed in this article and this article and this talk.

Robitaille points to the fact that a blackbody is a cavity/box $B$ with interior walls covered with carbon soot or graphite. Experiments show that the spectrum of the radiation $B_r$ from a little hole in such a cavity only depends on frequency $\nu$ and temperature $T$ according to Planck's Law:

    • $B_r=\gamma T\nu^2$   if $\nu <\frac{T}{h}$  and $B_r=0$ else,       (P)     
    where $\gamma$ and $h$ are universal constants, and we refer to $\nu <\frac{T}{h}$ as high-frequency cut-off. 

Experiments show that putting any material body $\bar B$ inside the cavity will not change (P), which is seen as evidence that the spectrum of $\bar B$ is the same as that of $B$, independent of the nature of $\bar B$, as an expression of universality.

    This is questioned by Robitaille, but not by main-stream physicists. Robitaille insists that the spectrum depends on the nature of the body. 

Let us see what we can say from our analysis in Computational Blackbody Radiation. There we identify a perfect blackbody to have a spectrum given by (P) with $\gamma$ maximal and $h$ minimal, thus with maximal radiation and maximal cut-off. By experiment we find that graphite is a good example of a perfect blackbody. By maximality, a blackbody spectrum dominates all greybody spectra.

Let then a greybody $\bar B$ be characterised by different constants $\bar\gamma (\nu)=\epsilon (\nu)\gamma$ with $0<\epsilon (\nu) <1$ a coefficient of emissivity = absorptivity, possibly depending on $\nu$, and $\bar h >h$. The radiation spectrum of $\bar B$ is then given by

    • $\bar B_r=\epsilon (\nu)\gamma T\nu^2$  if $\nu <\frac{T}{\bar h}$ and $\bar B_r=0$ else.

    This is not universality since $\epsilon (\nu)$ and $\bar h$ depend on the nature of $\bar B$. 

But let us now put $\bar B$ at temperature $\bar T$ inside the cavity $B$ with graphite walls acting as a blackbody, and let $B$ take on the same temperature (assuming $\bar B$ has much bigger heat capacity than $B$), so that

• $\bar B_r=\epsilon (\nu)B_r$ for $\nu<\frac{\bar T}{\bar h}$ and $\bar B_r=0$ else.
We then measure the spectrum of the radiation from the hole, which is the blackbody spectrum $B_r$ of $B$:
• $B_r=\gamma\bar T\nu^2$ for $\nu<\frac{\bar T}{h}$ and $B_r=0$ else.
If we then insist that this is the spectrum of $\bar B$, which it is not, we get a false impression of universality of radiation. By maximality, with $h<\bar h$, the cavity spectrum $B_r$ dominates $\bar B_r$.
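A minimal sketch of this argument, using the simplified spectra above with made-up values for $\gamma$, $h$, $\bar h$ and $\epsilon$ (purely for illustration):

```python
def B_blackbody(nu, T, gamma=1.0, h=1.0):
    # Simplified blackbody spectrum (P): gamma*T*nu^2 below the cut-off T/h, else 0.
    return gamma * T * nu**2 if nu < T / h else 0.0

def B_greybody(nu, T, eps=0.7, gamma=1.0, h_bar=1.5):
    # Greybody version: emissivity eps < 1 and lower cut-off T/h_bar since h_bar > h.
    return eps * gamma * T * nu**2 if nu < T / h_bar else 0.0

# At the common temperature T, the hole of the cavity reports the dominating
# blackbody spectrum B_r, not the greybody spectrum of the object placed inside.
T = 2.0
for nu in (0.5, 1.0, 1.5, 1.9):
    print(nu, B_greybody(nu, T), B_blackbody(nu, T))
```

Reading the hole and calling the result the spectrum of $\bar B$ is exactly the false universality described above.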
     
We conclude that the universality of blackbody radiation is a fiction, reflecting a dream of physicists to capture existential essence in universal terms. It comes from using the cavity as a transformer of radiation from a greybody to a blackbody, pretending that the strange procedure of putting objects into a cavity with graphite walls to measure their spectrum is not strange at all.

We may compare with the US claiming that the dollar $D$ represents a universal currency, backing that by imposing an exchange rate $\epsilon <1$ for all other currencies $\bar D$, thus imposing the dollar as the universal currency for the whole World while forgetting that all currencies have different characteristics. This gives the FED a man-made maximal universally dominating role, which is now challenged...

PS1 To meet the criticism that painting the walls of the cavity with graphite may be seen as rigging the measurement of radiation through the hole, physicists recall that removing the graphite and letting the walls be covered with perfect reflectors gives the same result, provided a piece of graphite is left inside the cavity. This turns out to be true, but the piece of graphite is necessary, and its effect can be understood from the maximality of blackbody radiation independent of object size.

PS2 Recall that radiation spectra of solids are continuous while gases have discrete spectra. Also recall that measuring spectra is typically done with instruments like the bolometer or pyrgeometer, which effectively measure temperature, from which radiation is computed according to some Planck law which may, but usually does not, represent reality. Atmospheric radiation spectra play an important role in climate modelling, and it is important to take them with a grain of salt, since what is de facto measured is temperature, with radiation being computed according to some convenient formula serving man-made climate alarmism.

PS3 The Sun has a continuous spectrum and so probably consists of liquid metallic hydrogen. Main-stream physics holds that it is in a gaseous plasma state.

    Thermodynamics of Friction

    Everything goes around in construction-deconstruction-construction...

In the previous post we considered viscosity in laminar and turbulent flow, and friction between solid bodies, as mechanisms for irreversible transformation of large scale kinetic motion/energy into small scale kinetic motion/energy in the form of heat energy, noting that the transformation cannot be reversed since the required very high precision cannot be realised, all of which is captured in a 2nd Law of Thermodynamics.

Let us consider the generation of heat energy by friction when rubbing your hands, sliding an object over a floor or pulling the handbrakes of your bicycle. We understand that the heat energy is created from the work done by force times displacement (in the direction of the force), like pressing/pushing sandpaper over the surface of a piece of wood to smoothen the surface by destroying its granular micro-structure. Work is thus done to destroy more or less ordered micro-structure, and the work shows up as internal heat energy in the form of unordered micro-scale kinetic energy.
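As a simple illustration with made-up numbers: a friction force of $F=50$ N acting over a sliding distance of $d=10$ m performs the work $W=F\cdot d=500$ J, which ends up as heat energy in the contacting surfaces.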

The key here is destruction of micro-structure into heat energy in a process which cannot be reversed, since the required precision cannot be met.

    Skin friction between a fluid and solid acts like friction between solids. 

    Turbulent flow transforms large scale ordered kinetic energy into small-scale unordered kinetic energy as heat energy under the action of viscous forces. Laminar flow also generates heat energy from friction between layers of fluid of different velocity.

    In all these cases heat energy is generated from destruction/mixing of order/structure in exothermic irreversible processes. This destruction is balanced by constructive processes like synchronisation of atomic oscillations into radiation and emergence of ordered structures like vortices in fluid flow and endothermic processes of unmixing/separation. 

We thus see exothermic processes of destruction followed by endothermic construction, which is not reversed destruction, with different time scales: destruction is fast and brutal without precision, while construction is slow with precision. This is elaborated in The Clock and the Arrow in popular form. Take a look.

     

    måndag 25 mars 2024

    Norman Wildberger: Insights into Mathematics


    Mathematician Norman Wildberger presents an educational program for a wide audience as Insights into Mathematics connecting to the principles I have followed in Body and Soul and Leibniz World of Math.

A basic concern of Wildberger is how to cope with the real numbers underlying analysis or calculus, geometry, algebra and topology, since they appear to require working with infinities that come with difficulties which have never been properly resolved, like computing with decimal expansions with infinitely many decimals and no last decimal to start a multiplication from, or the idea of an uncountable infinity of real numbers.

I share the critique of Wildberger, but I take a step further towards a resolution in terms of finite precision computation, which can be seen as the view of an applied mathematician or engineer. In practice, decimal expansions with a finite number of decimals are enough to represent the world, and every representation can be supplied with a measure of quality as a certain number of decimals, that is a certain finite precision. This offers a foundation of mathematics without infinities, in the spirit of Aristotle, with infinities as never attained potentials representing "modes of speaking" rather than effective realities.

In particular, the central concept of a "continuum" takes the form of a computational mesh of a certain mesh size or finite precision. With this view a "continuum" has no smallest scale yet is finite, and there is a hierarchy of continua with variable mesh size.

The difficulty with infinities comes from an idea of exact physical laws and exact solutions to mathematical equations like $x^2=2$, expressed in terms of symbols like $\sqrt{2}$ and $\pi$. But this can be asking for too much, even if it is tempting, and so leads to complications which have to be swept under the rug, creating confusion for students.
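A minimal sketch of this finite precision view: compute $\sqrt{2}$ as an approximate solution of $x^2=2$ to a prescribed number of decimals by Newton's method, with the number of correct decimals as the quality measure (the function and its parameters are illustrative only):

```python
from decimal import Decimal, getcontext

def sqrt2_to_decimals(n_decimals):
    # Newton's method x <- (x + 2/x)/2 for x^2 = 2, carried out in finite
    # precision arithmetic with a few guard digits beyond the requested quality.
    getcontext().prec = n_decimals + 5
    x = Decimal(1)
    for _ in range(60):  # Newton roughly doubles the number of correct digits per step
        x_new = (x + Decimal(2) / x) / 2
        if x_new == x:
            break
        x = x_new
    return +x  # rounded to the working precision

print(sqrt2_to_decimals(30))  # -> 1.4142135623730950488016887242...
```

No infinite decimal expansion is ever needed: the answer is a finite decimal together with a statement of its precision.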

    A more down-to-earth approach is then to give up exactness and be happy with finite precision not asking for infinities.