söndag 31 mars 2024

AIRS Atmospheric Infrared Sounder Measuring Temperature

This is a follow-up to the previous post on a debate with Will Happer concerning satellite measurement of the Earth's atmosphere: What is directly measured at distance: temperature or radiation? My view is that temperature is directly measured and so can be reliable, while radiation is computed using some complex software for radiative heat transfer and so is unreliable. It seems that Happer is not perfectly happy with such a clear statement, but does not give a clear alternative. 

Let us take a look at the most advanced system, which is the AIRS Atmospheric Infrared Sounder monitored by NASA, presented as follows:

  • AIRS is the first instrument ever to produce a three dimensional map of temperature and water vapour in the atmosphere directly measured from satellites


We understand that AIRS directly measures temperature at distance and that this can be very useful information!

We recall that there are several instruments, like bolometers, pyrgeometers and infrared cameras, which read temperature at distance, typically using a thermopile sensor: one end takes on the source temperature by radiative equilibrium at distance (like a thermometer in contact), the other end sits at the instrument reference temperature, and the thermopile reports a voltage depending on the temperature difference, thus reporting the source temperature. 
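As a rough illustration of this reading principle (not specific to any real instrument), a thermopile voltage can be modelled as proportional to the difference of the fourth powers of source and reference temperature and then inverted for the source temperature; the sensitivity constant and all numbers below are made up for illustration.

```python
# Hypothetical illustration: recover a source temperature from a thermopile voltage,
# assuming the voltage scales with T_source^4 - T_ref^4 (radiative exchange).
# The sensitivity k and all numbers are made up for illustration only.

def source_temperature(voltage, t_ref, k):
    """Invert V = k*(T_source^4 - T_ref^4) for T_source (temperatures in kelvin)."""
    return (voltage / k + t_ref**4) ** 0.25

k = 1.0e-11        # assumed sensitivity [V per K^4]
t_ref = 290.0      # assumed instrument reference temperature [K]
v = k * (255.0**4 - t_ref**4)   # voltage a 255 K source would produce in this model

print(source_temperature(v, t_ref, k))  # recovers 255.0 K
```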

Why then is Happer speaking about measuring radiation? It is because global warming alarmism is based on an idea that the Earth absorbs more energy from the Sun than it emits to outer space as Outgoing Longwave Radiation OLR. Evidence is then presented as measured incoming and outgoing radiation, of size around 340 W/m2, from which a difference of less than 1% is obtained and reported as alarming. That requires high-precision measurement of outgoing radiation by direct measurement, and so it must be tempting to believe that this is what AIRS offers. But it does not.  

Happer is not a climate alarmist, but he seems to be stuck with the alarmist dream of measuring OLR to a very high precision. Strange.

PS We may compare reading temperature vs radiation with determining a person's bank balance vs determining the power of the person. The bank balance can be directly read, which is not possible for power. 

Similarly, all lakeside properties around the lake share the level of the lake, while their value is more difficult to determine.  

lördag 30 mars 2024

Spooky Action at Distance in Global Warming

This is a follow-up to a discussion with Prof. Will Happer on Outgoing Longwave Radiation OLR from the Earth into outer space, which determines global warming or cooling, as concerns measurement of temperature and radiation by AIRS spectrometers in satellites looking down on the atmosphere with output (clear sky): 


We see a graph of radiative flux as a function of frequency on a background of corresponding blackbody spectra at varying temperatures, from 225 K at the top of the troposphere, over 255 K in the middle, to 288 K at the Earth surface. We see major radiation from H2O for lower frequencies at temperatures around 255 K, from CO2 at 225 K and from the Earth surface at 288 K through the atmospheric window.
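To indicate what the background curves look like, here is a minimal plotting sketch using the standard Planck radiance per wavenumber for 225, 255 and 288 K; the wavenumber range and units are illustrative choices, not tied to the AIRS instrument.

```python
# Sketch of the blackbody background curves (225, 255, 288 K) behind an OLR spectrum,
# using the standard Planck radiance per wavenumber. Range and units are illustrative.
import numpy as np
import matplotlib.pyplot as plt

h = 6.626e-34   # Planck constant [J s]
c = 2.998e8     # speed of light [m/s]
kB = 1.381e-23  # Boltzmann constant [J/K]

def planck_wavenumber(nu_cm, T):
    """Planck radiance per wavenumber; nu_cm in cm^-1, returns W m^-2 sr^-1 (cm^-1)^-1."""
    nu = nu_cm * 100.0                                            # convert to m^-1
    B = 2 * h * c**2 * nu**3 / np.expm1(h * c * nu / (kB * T))    # per m^-1
    return B * 100.0                                              # per cm^-1

nu_cm = np.linspace(100, 1600, 500)
for T in (225, 255, 288):
    plt.plot(nu_cm, planck_wavenumber(nu_cm, T), label=f"{T} K")
plt.xlabel("wavenumber [cm$^{-1}$]")
plt.ylabel("radiance [W m$^{-2}$ sr$^{-1}$ / cm$^{-1}$]")
plt.legend()
plt.show()
```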

This graph is presented as the essential scientific basis of climate alarmism, with the ditch in the spectrum giving CO2 a substantial role even if H2O and the window have a major role. But the change in the ditch from doubling CO2 from preindustrial levels is much smaller, of size about 1% of total incoming radiation from the Sun. 

In any case the measured spectrum of OLR by AIRS serves as key evidence of global warming by human CO2 emissions, but it requires an accuracy of less than 1%. 

Is this the case? We recall that the spectrometer of AIRS is based on the bolometer, which is an instrument measuring temperature in some frequency band at distance, from which radiation is computed using Modtran software to solve Schwarzschild's equations of radiative transfer line by line. This is a complex computation involving coefficients of emissivity and absorptivity which are not precisely known. There are many posts on this topic under Schwarzschild and OLR and bolometer. Results are reported as radiative forcing from increasing CO2, typically of size 1% of total incoming radiation. 
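To indicate what such a radiative transfer computation involves, here is a minimal sketch (in no way Modtran) that integrates Schwarzschild's equation $dI/d\tau = B - I$ upward through a stack of layers at a single frequency; the layer temperatures, optical depths and the source function are made-up stand-ins.

```python
# Minimal sketch (not Modtran) of Schwarzschild's equation dI/dtau = B(T) - I
# integrated upward layer by layer at a single frequency.
# Layer temperatures, optical depths and the source function are illustrative only.
import numpy as np

def schwarzschild_upward(I_surface, layer_T, layer_dtau, B):
    """March radiance upward through layers: I -> I*exp(-dtau) + B(T)*(1 - exp(-dtau))."""
    I = I_surface
    for T, dtau in zip(layer_T, layer_dtau):
        trans = np.exp(-dtau)
        I = I * trans + B(T) * (1.0 - trans)
    return I

B = lambda T: T**4 / 288.0**4            # normalised stand-in for the Planck source function
layer_T = np.linspace(288.0, 225.0, 20)  # temperatures from surface to top of troposphere [K]
layer_dtau = np.full(20, 0.1)            # assumed optical depth per layer at this frequency

print(schwarzschild_upward(B(288.0), layer_T, layer_dtau, B))
```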

Thus temperature is directly measured while radiation is the result of a complex computation for which an accuracy of less than 1% is required. You have to be a believer in global warming to believe that this accuracy is met. In other words, the evidence of global warming supposedly being presented by the OLR spectrum is not convincing if you have the slightest inclination towards skepticism.

Back to Happer, who claims that it does not matter what is directly measured, since there is a connection between temperature and radiation, and so one may as well view AIRS as measuring radiation. Our discussion came to a halt at this point. 

But to me it is clear that a bolometer (or pyrgeometer) is an instrument which directly measures temperature, and if the instrument reports radiation, it is the result of a computation of unknown accuracy, which in fact can be grossly misleading. In other words, reported temperature is reliable while reported radiation is not. 

The key observation is that CO2 radiation is measured to have temperature 225 K which means that it comes from the top of the atmosphere as the highest level where presence of CO2 is detected by the AIRS bolometer, with higher levels being transparent. 

The radiative forcing of 1% is thus based on a computation for which the accuracy is not known to be less than 1%. Your conclusion? 

The key question is then what can be measured at distance: temperature or radiation? There are several instruments that can directly measure temperature at distance, such as infrared cameras, bolometers and pyrgeometers, all based on radiative equilibrium at distance rationalised as Computational Blackbody Radiation. This is an analog to measuring temperature by a thermometer in contact.  

But there are no instruments directly measuring radiation by some kind of photon capturing technique. Believing that this is possible represents belief in some form of spooky action at distance. And you? And Happer?

PS In a letter to Max Born in 1947 Einstein said of the statistical approach to quantum mechanics, which he attributed to Born: I cannot seriously believe in it because the theory cannot be reconciled with the idea that physics should represent a reality in time and space, free from spooky action at a distance. This is  a different setting than that considered here: Reading temperature at distance is not spooky. Reading radiation is spooky action at distance.


tisdag 26 mars 2024

Man-Made Universality of Blackbody Radiation

Pierre-Marie Robitaille is one of the few physicists still concerned with the physics of blackbody radiation, supposed to be the first expression of modern physics as presented by Max Planck in 1900, as expressed in this article and this article and this talk.

Robitaille points to the fact that a blackbody is a cavity/box $B$ with interior walls covered with carbon soot or graphite. Experiments show that the spectrum of the radiation $B_r$ from a little hole in such a cavity only depends on frequency $\nu$ and temperature $T$ according to Planck's Law:

  • $B_r=\gamma T\nu^2$   if $\nu <\frac{T}{h}$  and $B_r=0$ else,       (P)     
where $\gamma$ and $h$ are universal constants, and we refer to $\nu <\frac{T}{h}$ as high-frequency cut-off. 

Experiments show that putting any material body $\bar B$  inside the cavity will not change (P), which is seen as evidence that the spectrum of $\bar B$ is the same as that of $B$  independent of the nature of $\bar B$ as an expression of universality. 

This is questioned by Robitaille, but not by main-stream physicists. Robitaille insists that the spectrum depends on the nature of the body. 

Let us see what we can say from our analysis in Computational Blackbody Radiation. We there identify a perfect blackbody to have a spectrum given by (P) with $\gamma$ maximal and $h$ minimal, thus by maximal radiation and maximal cut-off. By experiment we determine that graphite is a good example of a perfect blackbody. By maximality a blackbody spectrum dominates all greybody spectra.

Let then a greybody $\bar B$ be characterised by different constants $\bar\gamma (\nu)=\epsilon (\nu)\gamma$ with $0<\epsilon (\nu) <1$ a coefficient of emissivity = absorptivity possibly depending on $\nu$, and $\bar h >h$. The radiation spectrum of $\bar B$ is given by 

  • $\bar B_r=\epsilon (\nu)\gamma T\nu^2$  if $\nu <\frac{T}{\bar h}$ and $\bar B_r=0$ else.

This is not universality since $\epsilon (\nu)$ and $\bar h$ depend on the nature of $\bar B$. 
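A small numerical sketch of (P) and the greybody spectrum above, with made-up values of $\gamma$, $h$, $\bar h$ and $\epsilon$, just to display that the blackbody spectrum dominates the greybody spectrum and has the higher cut-off:

```python
# Sketch of the model spectra above: blackbody B_r = gamma*T*nu^2 for nu < T/h,
# greybody = eps(nu)*gamma*T*nu^2 for nu < T/hbar with hbar > h.
# All constants are made up for illustration.
import numpy as np

gamma, h, hbar, T = 1.0, 1.0, 1.5, 300.0   # illustrative units

def blackbody(nu):
    return np.where(nu < T / h, gamma * T * nu**2, 0.0)

def greybody(nu, eps=lambda nu: 0.7):
    return np.where(nu < T / hbar, eps(nu) * gamma * T * nu**2, 0.0)

nu = np.linspace(0.0, 400.0, 5)
print(blackbody(nu))   # dominates ...
print(greybody(nu))    # ... the greybody spectrum at every frequency
```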

But let us now put $\bar B$ at temperature $\bar T$ inside the cavity $B$ with graphite walls acting as a blackbody and let $B$ take on the same temperature (assuming $\bar B$ has much bigger heat capacity than $B$), so that

  • $\bar B_r=\epsilon (\nu)B_r$ for $\nu<\frac{\bar T}{\bar h}$ and $\bar B_r=0$ else.
We then measure the spectrum of the radiation from the hole, which is the blackbody spectrum $B_r$:
  • $B_r=\gamma\bar T\nu^2$ for $\nu<\frac{\bar T}{h}$ and $B_r=0$ else.
If we then insist that this is the spectrum of $\bar B$, which it is not, we get a false impression of universality of radiation. By maximality with $h<\bar h$ the cavity spectrum $B_r$ dominates $\bar B_r$.
 
We conclude that the universality of blackbody radiation is a fiction reflecting a dream of physicists to capture existential essence in universal terms. It comes from using the cavity as a transformer of radiation from a greybody to a blackbody, pretending that the strange procedure of putting objects into a cavity with graphite walls to measure their spectrum is not strange at all. 

We may compare with the US claiming that the dollar $D$ represents a universal currency, backing that by imposing an exchange rate $\epsilon <1$ for all other currencies $\bar D$, thus imposing the dollar as the universal currency for the whole World, forgetting that all currencies have different characteristics. This gives the FED a man-made maximal universally dominating role, which is now challenged... 

PS1 To meet the criticism that painting the walls of the cavity with graphite may be seen as a rigging of the measurement of radiation through the hole, physicists recall that removing the graphite and letting the walls be covered with perfect reflectors will give the same result, if only a piece of graphite is left inside the cavity. This turns out to be true, but the piece of graphite is necessary and its effect can be understood from the maximality of blackbody radiation independent of object size. 

PS2 Recall that radiation spectra of solids are continuous while gases have discrete spectra. Also recall that measuring spectra is typically done with instruments like bolometers or pyrgeometers, which effectively measure temperature, from which radiation is computed according to some Planck law which may, but usually does not, represent reality. Atmospheric radiation spectra play an important role in climate modelling, and it is important to take them with a grain of salt, since what is de facto measured is temperature, with radiation being computed according to some convenient formula serving man-made climate alarmism.  

PS3 The Sun has a continuous spectrum and so probably consists of liquid metallic hydrogen. Mainstream physics says that it is in a gaseous plasma state.

Thermodynamics of Friction

Everything goes around in construction-deconstruction-construction...

In the previous post we considered viscosity in laminar and turbulent flow and friction between solid bodies as mechanisms for irreversible transformation of large scale kinetic motion/energy into small scale kinetic motion/energy in the form of heat energy, noting that the transformation cannot be reversed since the required very high precision cannot be realised, everything captured in a 2nd Law of Thermodynamics.  

Let us consider the generation of heat energy in friction when rubbing your hands or sliding an object over a floor or pulling the handbrakes of your bicycle. We understand that the heat energy is created from the work done by force times displacement (in the direction of the force), like pressing/pushing a sandpaper over the surface of a piece of wood to smoothen the surface by destroying its granular micro-structure. Work is thus done to destroy more or less ordered micro-structure and the work shows up as internal heat energy as unordered micro-scale kinetic energy. 

The key here is destruction of micro-structure into heat energy in a process which cannot be reversed since the required precision cannot be met.

Skin friction between a fluid and a solid acts like friction between solids. 

Turbulent flow transforms large scale ordered kinetic energy into small-scale unordered kinetic energy as heat energy under the action of viscous forces. Laminar flow also generates heat energy from friction between layers of fluid of different velocity.

In all these cases heat energy is generated from destruction/mixing of order/structure in exothermic irreversible processes. This destruction is balanced by constructive processes like synchronisation of atomic oscillations into radiation and emergence of ordered structures like vortices in fluid flow and endothermic processes of unmixing/separation. 

We thus see exothermic processes of destruction followed by endothermic construction, which is not reversed deconstruction, with different time scales where deconstruction is fast and brutal without precision and construction is slow with precision. This is elaborated in The Clock and the Arrow in popular form. Take a look.

 

måndag 25 mars 2024

Norman Wildberger: Insights into Mathematics


Mathematician Norman Wildberger presents an educational program for a wide audience as Insights into Mathematics connecting to the principles I have followed in Body and Soul and Leibniz World of Math.

A basic concern of Wildberger is how to cope with real numbers underlying analysis or calculus, geometry, algebra and topology, since they appear to require working with aspects of infinities coming with difficulties, which have never been properly resolved, like computing with decimal expansions with infinitely many decimals and no last decimal to start a multiplication. Or the idea of an infinity of real numbers beyond countability.

I share the critique of Wildberger but I take a step further towards a resolution in terms of finite precision computation, which can be seen to be the view of an applied mathematician or engineer. In practice decimal expansions with a finite number of decimals are enough to represent the world, and every representation can be supplied with a measure of quality as a certain number of correct decimals, that is, a certain finite precision. This offers a foundation of mathematics without infinities in the spirit of Aristotle, with infinities as never attained potentials representing "modes of speaking" rather than effective realities. 

In particular the central concept of "continuum" takes the form of a computational mesh of certain mesh size or finite precision. With this view a "continuum" has no smallest scale yet is finite and there is a hierarchy of continua with variable mesh size.    

The difficulty of infinities comes from an idea of exact physical laws and exact solutions to mathematical equations like $x^2=2$ expressed in terms of symbols like $\sqrt{2}$ and $\pi$. But this can be asking for too much, even if it is tempting, and so leads to complications which have to be hidden under the rug, creating confusion for students.

A more down-to-earth approach is then to give up exactness and be happy with finite precision not asking for infinities.  
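As a simple illustration of this view, $\sqrt{2}$ can be produced to any prescribed number of correct decimals by Newton iteration for $x^2=2$, with the stopping tolerance playing the role of the declared measure of quality:

```python
# Newton iteration for x^2 = 2: sqrt(2) to a prescribed finite precision.
# The tolerance acts as the declared "measure of quality" of the representation.
from decimal import Decimal, getcontext

def sqrt2(digits):
    getcontext().prec = digits + 5          # work with a few guard digits
    x = Decimal(1)
    tol = Decimal(10) ** (-digits)
    while abs(x * x - 2) > tol:
        x = (x + Decimal(2) / x) / 2        # Newton step for f(x) = x^2 - 2
    return +x                               # round to the current context precision

print(sqrt2(30))   # 30 correct decimals, no infinite expansion needed
```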

How to Generate Heat Energy


We recall from a previous post:

  • Heat energy can be generated from large scale kinetic energy by compression. 
  • Kinetic energy can be generated from heat energy by expansion.

More precisely, we saw in the previous post that heat energy at high temperature can generate useful mechanical work. Heat energy at high temperature can be created by nuclear/chemical reactions. 

Heat energy typically at lower temperatures also appears as losses from electrical currents subject to resistance, fluid motion subject to turbulent/laminar viscosity and friction between solid bodies. These losses appear as substantial, unavoidable and irreversible as expressions of a 2nd Law.

We have seen that heat energy $\sim T\nu^2$ of frequency $\nu$ carried by an atomic lattice of temperature $T$ subject to high-frequency cut-off $\nu <\frac{T}{h}$ expressing ordered synchronised atomic oscillation or kinetic motion, can be radiated. Here $h$ is a constant. 

We can view turbulent dissipation in fluid flow as a form of high-frequency forcing above present cut-off which cannot be reradiated and so is absorbed as internal energy in the form of unordered small scale kinetic energy. We can similarly view viscosity and friction as forms of high-frequency forcing supplying internal energy. 

The contribution to internal energy increases the temperature and so allows unordered small scale motion to be synchronised to higher frequency and then radiated.  

The key is thus that turbulent, viscous and frictional dissipation all represent high-frequency forcing above the present cut-off, which cannot be represented and reradiated and so shows up as internal energy as small scale kinetic energy. 

Rubbing hands is one way to transform large scale kinetic motion into small scale kinetic motion as heat energy. The brakes on your car work the same way. 

söndag 24 mars 2024

Exergy as Energy Quality


Kinetic energy, electrical energy, chemical and nuclear energy can all be converted fully into heat energy, while heat energy can only be partially converted back again. This is captured in the 2nd Law of Thermodynamics. We can thus say that heat energy is of lower quality compared with the other forms. More generally, the term exergy is used as a measure of quality of energy of fundamental importance for all forms of life and society as ability to do work.

We can make this more precise by recalling that the quality of heat energy comes to expression in radiative and conductive heat transfer from a body B1 of temperature $T_1$ to a neighbouring body B2 of lower  temperature $T_2<T_1$ in basic cases according to Stefan-Boltzmann's Law or Fourier's Law:

  • $Q = (T_1^4-T_2^4)$            (SB)
  • $Q = (T_1-T_2)$                    (F)
with $Q$ heat energy per time unit. Heat energy of higher temperature thus can be considered to have higher quality than heat energy of lower temperature, which of course also plays a role in conversion of heat energy to other forms of energy. The maximal efficiency of a heat engine operating between $T_1$ and $T_2$ and transforming heat energy to mechanical work is equal to $\frac{T_1-T_2}{T_1}$, displaying the higher quality of $T_1$.
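A small numerical illustration of the quality difference, using the normalised forms of (SB) and (F) and the maximal efficiency above; the temperatures are arbitrary:

```python
# Illustration of (SB), (F) and the maximal heat-engine efficiency (T1 - T2)/T1.
# Temperatures are arbitrary; (SB) and (F) are used in the normalised form above.
def sb(T1, T2):           # normalised Stefan-Boltzmann transfer
    return T1**4 - T2**4

def fourier(T1, T2):      # normalised Fourier transfer
    return T1 - T2

def max_efficiency(T1, T2):  # Carnot bound for work from heat between T1 > T2
    return (T1 - T2) / T1

T1, T2 = 600.0, 300.0     # kelvin, arbitrary choice
print(sb(T1, T2), fourier(T1, T2), max_efficiency(T1, T2))  # efficiency = 0.5
```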

Heat energy at high temperature is the major source for useful mechanical work supporting human civilisation, while heat energy at lower temperatures appears as a useless loss e g in the cooling of a gasoline engine.

But what is the real physics behind (SB) and (F)? This question was addressed in a previous post viewing (F) as a special case of (SB), with the physics behind (SB) displayed in the analysis of Computational Blackbody Radiation.

The essence of this analysis is a high-frequency cut-off $\frac{T}{h}$ allowing a body of temperature $T$ to only emit frequencies $\nu <\frac{T}{h}$, where $h$ is a constant. This allows a body B1 of temperature $T_1$ to transfer heat energy to a body B2 of lower temperature $T_2$ via frequencies $\frac{T_2}{h}<\nu <\frac{T_1}{h}$, which cannot be balanced by emission from B2.  
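Under the model radiation law $R(\nu ,T)=\gamma T\nu^2$ with cut-off $\frac{T}{h}$, and assuming that frequencies below both cut-offs are exchanged with net $\gamma (T_1-T_2)\nu^2$, the transferred heat per unit time can be sketched as:

  • $Q = \int_0^{T_2/h}\gamma (T_1-T_2)\nu^2\,d\nu + \int_{T_2/h}^{T_1/h}\gamma T_1\nu^2\,d\nu$
  • $\quad = \frac{\gamma}{3h^3}(T_1T_2^3 - T_2^4) + \frac{\gamma}{3h^3}(T_1^4 - T_1T_2^3) = \frac{\gamma}{3h^3}(T_1^4 - T_2^4)$,

recovering the normalised form of (SB); without the cut-off, a fixed common frequency band would instead give $Q\sim T_1-T_2$, that is (F), in line with the PS below.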

High frequency cut-off increasing linearly with temperature represents Wien's displacement law (W), giving improved exergy with increasing temperature.

The high-frequency cut-off can be seen as an expression of finite precision limiting the frequency being carried and emitted by an oscillating atomic lattice in coordinated motion, with frequencies above cut-off being carried internally as heat energy as uncoordinated motion.

Higher temperature thus connects to higher quality heat energy or better exergy. The standard explanation of this basic fact is based on statistical mechanics, which is not physical mechanics. 

PS Radiative heat transfer without high-frequency cut-off would boil down to (F), while (SB) is what is observed, which gives support to (W).


fredag 22 mars 2024

Thermodynamics of War and Peace


Opposing ordered armies at the moment before turbulent destruction. 

The recent posts on the 2nd law of thermodynamics describe a process where increasing spatial gradients eventually reach a level (from convection and opposing flow) where further increase is no longer possible, because it would bring the process to a brutal stop, and so some form of equilibration of spatial differences must set in where 

  • each particle tends to take on the mean-value of neighbouring particles.   (M)

This is the process in turbulent fluid flow transforming ordered large scale kinetic energy into small scale disordered kinetic energy taking the form of internal heat energy in a turbulent cascade of turbulent dissipation. Here (M) is necessary to avoid break-down into a stop. The flow or show must go on.

(M) is also the essence of the diffusion process of heat conduction seeking to decrease gradients, even if not absolutely necessary as in turbulent fluid flow.

It is natural to connect turbulence to the violent break-down of large scale ordered structures into rubble in a war, necessarily resulting from escalation of opposing military forces in direct confrontation, which at some level cannot be further escalated and so has to be dissipated in a war. 

It is then natural to connect the equilibration (M) in heat conduction to a geopolitical/parliamentary process in peace time, where each country/party takes on the mean value of neighbouring countries/parties keeping gradients small. 

While (M) is necessary in turbulence to let the flow go on, one may ask what the physics of (M) is in the case of heat conduction, and find the answer in this post. 

The mathematics is elaborated in: 

The geopolitical/parliamentary situation today evolves towards sharpened gradients, while politicians refuse to follow (M) and so there is a steady march towards break-down... 


onsdag 20 mars 2024

Secret of Conductive and Radiative Heat Transfer

This is a continuation of the previous post on Heat Conduction in Solids as Radiative Heat Transfer with  clarifying analysis from Mathematical Physics of Blackbody Radiation and Computational Blackbody Radiation.

The key aspect of both conductive and radiative heat transfer is interaction in a coupled system of weakly damped oscillators of different frequencies tending to an equilibrium with all oscillators having the same temperature as the system temperature. The damping can be frictional (1st order time derivative) or radiative (3rd order time derivative).

There are two main questions: (i) Why do different systems take on the same temperature? (ii) Why do oscillators with different frequencies in a system take on the same temperature?  

The answer is hidden in the interaction between incoming radiation, oscillator and outgoing radiation in a weakly radiatively damped oscillator analysed in detail in the above texts. The essence is that under near resonance between incoming frequency and oscillator frequency,  

  • incoming radiation is balanced by outgoing radiation plus internal heating. 
This is a non-trivial basic fact reflecting that the forcing and oscillator are out-of-phase with a shift of half a period as a consequence of small radiative damping and near resonance. 

Two coupled oscillators thus interact with outgoing radiation from one oscillator acting as incoming radiation for the other and vice versa, and so are led to take on the same temperature, which is then spread over the oscillators of a system and also over systems. 

The essential components in this equilibration process are thus
  • weakly damped oscillators generating outgoing radiation and internal heating 
  • out-of-phase balance between forcing and damping from near resonance
  • high-frequency cut-off increasing with temperature from finite precision computation.  
This analysis connects to Planck's derivation of his law of radiation with statistics replaced by finite precision thus replacing non-physics by physics. 

tisdag 19 mars 2024

Sweden Must Win the War against Russia?

Sweden defeats Russia at the Battle of Narva in 1700 under the leadership of Karl XII.  

Today a number of Swedish ambassadors and military officers write on SvD Debatt:

  • The government should swiftly find ways to increase military aid to Ukraine many times over. 
  • Facing the Russian aggression, the Western democracies find themselves where they stood facing Hitler.
  • The war will not end as long as Putin remains in power. 
  • The only way to end the war is thus that Putin loses power.
  • That can happen if he suffers such a stinging defeat...
  • Putin's removal is only a necessary... palpable Russian defeat in Ukraine.
  • Such a defeat lies in the hands of the Western democratic alliance. It holds the weapons and the resources with which Ukraine can turn the war around and inflict a defeat on Russia.
  • Ukraine must win and Russia must lose.
  • We appeal to the government to think outside the inherited box – it is a matter of swiftly finding ways to increase military aid to Ukraine many times over.

We see here that the victory at Narva in 1700 is fondly remembered in the "inherited box", while the defeat at Poltava in 1709 and the loss of Finland in 1809 are forgotten. The idea is thus that we can defeat Russia, the strongest nuclear power in the world, with newly won conventional military strength of more than a million trained soldiers far superior to what the Western democracies can scrape together, where Sweden can contribute perhaps 3000 untrained boys and girls, and thereby carry out the conquest of Russia that Karl XII, Napoleon and Hitler failed to achieve.  

Does this reflect the mood among the leading layer of Swedish politicians, civil servants, military officers and business leaders, and among the Swedish people? All parties seem set on war, even V and Mp. The peace movement has fallen asleep. 

The USA is about to withdraw, which gives Sweden a leading role as a belligerent, naturally together with Finland, which is good at warfare in snow.

The Pope asks Ukraine to raise the white flag and come to the negotiating table where Putin is waiting. Continued war only makes things worse for Ukraine and the West. A manifold increase in military aid to Ukraine leads to WW3. 

How can anyone believe that Sweden can win a war against Russia today? Or is the idea that since WW2 was good for Sweden, WW3 can be too? In that case one should think again.

A strong nuclear power cannot be defeated on its home ground. The best result that can be achieved is a draw, as in MAD Mutual Assured Destruction.


Heat Conduction in Solids as Radiative Heat Transfer


What is the physics of heat conduction in a solid like a metal? The trivial story is that "heat flows from warm to cold" or "there is a flux of heat from warm to cold" which scales with the temperature difference or gradient. 

But heat is not a substance like water in a river flowing from high-altitude/warm to low-altitude/cold, which connects to the caloric theory and also to the phlogiston theory presenting fire as a form of substance, both debunked at the end of the 18th century. 

In any case there is a law of physics named Fourier's Law:

  • $q =- \nabla u$          (F)
which combined with a law of conservation 
  • $\nabla\cdot q = f$
leads to the following heat equation (here in stationary state for simplicity) in the form of Poisson's equation
  • $-\Delta u = f$.                    (H)
where $u(x)$ is temperature and $f(x)$ heat source depending on a space variable $x$, and $q(x)$ is named "heat flux" although it has no physical meaning; heat is not any substance which flows or is in a state of flux. 
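A minimal finite-difference sketch of (H) in one space dimension with zero boundary temperature, only to show that (F) combined with conservation determines the temperature field; grid size and heat source are arbitrary choices:

```python
# Minimal 1D finite-difference solution of -u'' = f with u(0) = u(1) = 0,
# i.e. the stationary heat equation (H); grid and source are arbitrary choices.
import numpy as np

n = 50                       # number of interior grid points
dx = 1.0 / (n + 1)
f = np.ones(n)               # constant heat source

# Tridiagonal matrix for -d^2/dx^2 with Dirichlet boundary conditions
A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / dx**2

u = np.linalg.solve(A, f)    # temperature at interior points
print(u.max())               # ~0.125 = max of the exact solution x(1-x)/2
```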

Let us now seek the physics of (F) in the case of a metallic body as a lattice of atoms, and so seek an explanation of the observation that the temperature distribution $u(x)$ of the body tends to an equilibrium state with $u(x)=U$ with $U$ a constant (assuming no interaction with the surroundings and no internal heating for simplicity). 

We thus ask: 
  • What is the physics of the process towards equilibrium with constant temperature?
  • How is heat transferred from warm to cold?
  • Why is (F) valid?
We then recall our analysis of radiative transfer of energy at distance in a system of bodies/parts separated in space which (without external forcing) leads to an equilibrium state with all bodies having the same temperature, based on the following physical model:
  • Each body is a vibrating lattice of atoms described by a wave equation with small radiative damping. 
  • The bodies interact by electromagnetic waves through resonance. 
  • There is a high-frequency cut-off increasing linearly with temperature with the effect that heat transfer mediated by electromagnetic waves between two bodies, is one-way from high temperature to low temperature. 
The key is here the high-frequency cut-off increasing with temperature, which makes heat transfer one-way. The cut-off can be seen as a form of finite precision threshold allowing coordinated lattice vibration only below cut-off, thus allowing a high temperature lattice to carry higher frequencies. It is like a warmed-up opera soprano being able to reach higher frequencies.

We can view a metallic body as a system composed of parts/atoms interacting by electromagnetic waves at small distance. 

Heat conduction will then come out as a special case of electromagnetic heat transfer 
between atoms of different temperature with high-frequency cut-off guaranteeing one-way transport as expressed by (F) and exposed above. 

Note that the reference text Conduction of Heat in Solids by Carslaw and Jaeger presents (F) as an ad hoc physical law without physics.

Recall that the standard explanation of radiative heat transfer from warm to cold is based on statistics without physics, which if used to explain heat conduction would again invoke statistics without physics, and is thus not very convincing. 

Also note that the standard explanation of heat transfer in a gas involves collisions of molecules of different kinetic energy, which is not applicable to a metal with atoms in a lattice.

PS1 Fluid flow in a river from higher to lower altitude is driven by pressure. "Heat flow" from warm to cold is not driven by pressure and so the physics is different. 

PS2 Also compare with one-way osmotic transport of material driven by pressure. 

söndag 17 mars 2024

Universality of Radiation with Blackbody as Reference


One of the unresolved mysteries of classical physics is why the radiation spectrum of a material body only depends on temperature and frequency and not on the physical nature of the body, as an intriguing example of universality. Why is that? The common answer given by Planck is statistics of energy quanta, an answer however without clear physics based on ad hoc assumptions which cannot be verified experimentally as shown by this common argumentation. 

I have pursued a path without statistics based on clear physics as Computational Blackbody Radiation, in the form of near resonance in a wave equation with small radiative damping as outgoing radiation, subject to external forcing $f_\nu$ depending on frequency $\nu$, which shows the following radiance spectrum $R(\nu ,T)$ (with more details here) characterised by a common temperature $T$, a radiative damping parameter $\gamma$, and a constant $h$ defining a high-frequency cut-off. Radiative equilibrium with incoming = outgoing radiation is shown to satisfy:

  • $R(\nu ,T)\equiv\gamma T\nu^2 =\epsilon f_\nu^2$ for $\nu\leq\frac{T}{h}$,
  • $R(\nu ,T) =0$ for $\nu >\frac{T}{h}$,
where $0<\epsilon\le 1$ is a coefficient of absorptivity = emissivity, while frequencies above cut-off $\frac{T}{h}$ cause heating. The radiation can thus be described by the coefficients $\gamma$, $\epsilon$ and $h$ and the temperature scale $T$. 

Here $\epsilon$ and $h$ can be expected to depend on the physical nature of the body, with a blackbody defined by $\epsilon =1$ and $h$ minimal thus with maximal cut-off. 

Let us now consider possible universality of the radiation parameter $\gamma$ and temperature $T$.

Consider then two radiating bodies 1 and 2 with different characteristics $(\gamma_1,\epsilon_1, h_1, T_1)$ and $(\gamma_2,\epsilon_2, h_2, T_2)$, which when brought into radiative equilibrium will satisfy (assuming here for simplicity that $\epsilon_1=\epsilon_2$):
  • $\gamma_1T_1\nu^2 = \gamma_2T_2\nu^2$ for $\nu\leq\frac{T_2}{h_2}$ 
  • assuming $\frac{T_2}{h_2}\leq \frac{T_1}{h_1}$ 
  • and for simplicity that 2 reflects frequencies $\nu > \frac{T_2}{h_2}$.    
If we choose body 1 as reference, to serve as an ideal reference blackbody, defining a reference temperature scale $T_1$, we can then calibrate the temperature scale $T_2$ for body 2 so that 
  • $\gamma_1T_1= \gamma_2T_2$,
thus effectively assign temperature $T_1$ and $\gamma_1$ to body 2 by radiative equilibrium with body 1 acting as a reference thermometer. Body 2 will then mimic the radiation of body 1 in radiative equilibrium and a form of universality with body 1 as reference will be achieved, with independence of $\epsilon_1$ and $\epsilon_2$.
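A tiny numerical sketch of this calibration with made-up values of $\gamma_1$, $\gamma_2$ and $T_1$: once $T_2$ is fixed by $\gamma_1T_1=\gamma_2T_2$, the two bodies radiate identically below the common cut-off.

```python
# Calibration of body 2 against reference body 1 via gamma_1*T_1 = gamma_2*T_2.
# Values of gamma_1, gamma_2 and T_1 are made up for illustration.
gamma1, gamma2 = 1.0, 0.8
T1 = 300.0

T2 = gamma1 * T1 / gamma2          # calibrated temperature scale for body 2

nu = 50.0                          # any frequency below the common cut-off
print(gamma1 * T1 * nu**2, gamma2 * T2 * nu**2)   # identical spectra: body 2 mimics body 1
```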

The analysis indicates that the critical quality of the reference blackbody is maximal cut-off (and equal temperature of all frequencies), and not necessarily maximal absorptivity = emissivity = 1. 

Universality of radiation is thus a consequence of radiative equilibrium with a specific reference body in the form of a blackbody acting as reference thermometer.  

Note that the form of the radiation law $R(\nu ,T)= \gamma T\nu^2$ reflects that the radiative damping term in the wave equation is given by $-\gamma\frac{d^3}{dt^3}$ with a third order time derivative as universal expression from oscillating electric charges according to Larmor.

In practice body 1 is represented by a small piece of graphite inside a cavity with reflecting walls represented by body 2, with the effect that the cavity will radiate like graphite independent of its form or wall material. Universality will thus be reached by mimicking of a reference, viewed as an ideal blackbody, which is perfectly understandable, and not by some mysterious deep inherent quality of blackbody radiation. Without the piece of graphite the cavity will possibly radiate with different characteristics and universality may be lost.

We can compare many local currencies calibrated to the dollar as common universal reference.  
  • All dancers which mimic Fred Astaire, dance like Fred Astaire, but all dancers do not dance like Fred Astaire.     
PS1 The common explanation for the high frequency cut-off is that high frequencies have low probability, which is not physics, while I suggest that high frequencies cannot be represented because of finite precision, which can be physics.  

PS2 Note that high-frequency cut-off increasing with temperature gives a 2nd Law expressing that energy is radiated from warm to cold and to no degree from cold to warm, thus acting like a semi-conductor allowing an electrical current only if a voltage difference is above a certain value.

torsdag 14 mars 2024

Geopolitics of Mathematical Physics


Swedish Prime Minister offering to build 1000 Swedish Tank90.

Let us identify some leading scientist/period/country in the history of mathematical physics as the foundation of science:

  • Archimedes: Greece 
  • Song Dynasty: China 
  • Galileo: Renaissance Italy
  • Leibniz and Euler: Calculus Scientific Revolution Holy Roman Empire
  • Newton: Calculus Scientific Revolution England 
  • Cauchy and Fourier: Scientific/French Revolution France
  • Maxwell: Industrial Revolution England 
  • Boltzmann: statistical mechanics German Empire 
  • Hilbert: mathematics German Empire 
  • Planck: radiation German Empire
  • Einstein: relativity German Empire, Weimar Republic
  • Schrödinger: quantum mechanics atomic physics Weimar Republic  
  • Oppenheimer: atomic bomb USA
  • Feynman: subatomic physics USA
  • Gell Mann, Weinberg: subatomic physics USA
  • Witten: string theory subsubatomic physics USA
  • China? India? Third World?
We see that geopolitical dominance comes along with leading science. No wonder, since science gives industrial and military power. We can follow a shift of power from the Holy Roman Empire (Germany) to France/England back to German Empire over Weimar Republic followed by USA until China and India and the Third World now are ready to step in.  

We can also follow a development from macroscopics of engines to microscopics of atoms to the string theory of today, along with an expansion from our Solar system to cosmology of the Universe. 

Modern physics is viewed to have started with Planck's derivation of his radiation law based on statistical mechanics as a break-away from the rationality of the classical deterministic mechanics serving the scientific/industrial revolution,  which after 20 years of incubation developed into the new quantum mechanics of atom physics, with the atomic bomb as triumph. 

Modern physics thus in 1900 took a step away from deterministic to statistical mechanics, from rationality to irrationality apparently breaking with the tradition from Archimedes to Hilbert carried by rational mathematical thinking, which led the way into the 20th century. 

Questions:
  1. Can it be that the roots of the observed irrationality of the 20th century with two World Wars, and today a 21st century on its way to a third, can be traced to the irrationality of modern physics? 
  2. Will the dominance of USA persist with the present focus on string theory?
  3. Will the rationalism of ancient China lead to a take over?
A more detailed perspective on modern physics politics is given in Dr Faustus of Modern Physics.

2nd Law or 2-way Radiative Heat Transfer?

In the present discussion of the 2nd Law of Thermodynamics let us go back to this post from 2011 exposing the 19th century battle between 1-way transfer (Pictet) vs 2-way transfer (Prevost) of heat energy by radiation playing a central role in climate science today. 

In 1-way transfer a warm body heats a colder body in accordance with the 2nd Law. 

In 2-way heat transfer both warm and cold bodies are viewed to heat each other, but the warm heats more and so there is a net transfer of heat energy from warm to cold. But a cold body heating a warm body violates the 2nd Law of thermodynamics, and so there is something fishy here. 

Yet the basic mathematical model of radiative heat transfer in the form of Schwarzschild's equation involves 2-way transfer of heat energy, in apparent violation of the 2nd Law. 

I have discussed this situation at length on this blog with tags such as 2nd law of thermodynamics and radiative heat transfer with more on Computational Blackbody Radiation.

If you worry about the 2nd Law, you can ask yourself how 2-way radiative heat transfer is physically possible, when it appears to violate the 2nd Law? What is false here: 2nd Law or 2-way heat transfer?

What is your verdict? 


tisdag 12 mars 2024

Philosophy of Statistical Mechanics?

Collapsed pillars of modern building.

Let us continue with the post Three Elephants of Modern Physics taking a closer look at one of them. 

We learn from Stanford Encyclopedia of Philosophy the following about Statistical Mechanics SM: 

  • Statistical Mechanics is the third pillar of modern physics, next to quantum theory and relativity theory
  • Its aim is to account for the macroscopic behaviour of physical systems in terms of dynamical laws governing the microscopic constituents of these systems and probabilistic assumptions.
  • Philosophical discussions in statistical mechanics face an immediate difficulty because unlike other theories, statistical mechanics has not yet found a generally accepted theoretical framework or a canonical formalism. 
  • For this reason, a review of the philosophy of SM cannot simply start with a statement of the theory’s basic principles and then move on to different interpretations of the theory.
This is not a very good start, but we continue learning: 
  • Three broad theoretical umbrellas: “Boltzmannian SM” (BSM), “Boltzmann Equation” (BE), and “Gibbsian SM” (GSM).
  • BSM enjoys great popularity in foundational debates due to its clear and intuitive theoretical structure. Nevertheless, BSM faces a number of problems and limitations
  • There is no way around recognising that BSM is mostly used in foundational debates, but it is GSM that is the practitioner’s workhorse.
  • So what we’re facing is a schism whereby the day-to-day work of physicists is in one framework and foundational accounts and explanations are given in another framework.
  • This would not be worrisome if the frameworks were equivalent, or at least inter-translatable in relatively clear way...this is not the case.
  • The crucial conceptual questions (concerning BE) at this point are: what exactly did Boltzmann prove with the H-theorem?
This is the status today of the third pillar of modern physics formed by Boltzmann 1866-1906 and Gibbs 1902 as still being without a generally accepted theoretical framework, despite 120 years of deep thinking by the sharpest brains of modern physics. 

Is this something to worry about? If one of the pillars apparently is shaky, what about the remaining two pillars? Who cares?

Recall that SM was introduced to rationalise the 2nd Law of Thermodynamics stating irreversibility of macroscopic systems based on deterministic reversible exact microscopics. This challenge was taken up by Boltzmann facing the question: If all components of a system are reversible, how can it be that the system is irreversible? From where does the irreversibility come? The only way forward Boltzmann could find was to replace exact determinism of microscopics by randomness/statistics as a form of inexactness. 
 
In the modern digital world the inexactness can take the form of finite precision computation performed with a certain number of digits (e g single or double precision). Here the microscopics is deterministic up to the point of keeping only a finite number of digits, which can have more or less severe consequences on macroscopic reversibility. This idea is explored in Computational Thermodynamics offering a 2nd Law expressed in the physical quantities of kinetic energy, internal energy, work and turbulent dissipation without need to introduce any concept of entropy.

Replacing SM by precisely defined finite precision computation gives a more solid third pillar. But this is new and not easily embraced by analytical/theoretical mathematicians/physicists not used to think in terms of computation, with Stephen Wolfram as notable exception.  

PS1 To meet the criticism that the Stosszahlansatz underlying the H-theorem, stating that particles before collision are uncorrelated, simply assumes what has to be proved (irreversibility), Boltzmann argued:
  • But since this consideration has, apart from its tediousness, not the slightest difficulty, nor any special interest, and because the result is so simple that one might almost say it is self-evident I will only state this result.
Convincing?

PS2 Connecting to the previous post, recall that the era of quantum mechanics was initiated in 1900 by Planck introducing statistics of "energy quanta" inspired by Boltzmann's statistical mechanics, to explain observed atomic radiation spectra, opening the door to Born's statistical interpretation in 1927 of the Schrödinger wave function as the "probability of finding an electron" at some specific location in space and time, which is the text book wisdom still today. Thus the pillar of quantum mechanics is also weakened by statistics. The third pillar of relativity is free of statistics, but also of physics, and so altogether the three pillars offer a shaky foundation of modern physics.  Convinced? 

måndag 11 mars 2024

The 2nd Law as Radiative Heat Transfer


The 2nd Law of Thermodynamics states that heat energy $Q$, without forcing, is transferred from a body of temperature $T_1$ to a body of temperature $T_2$ with $T_1>T_2$ by conduction according to Fourier's Law if the bodies are in contact: 

  • $Q =\gamma (T_1-T_2)$ 

and/or by radiation according to Stefan-Boltzmann-Planck's Law if the bodies are not in contact, as radiative heat transfer

  • $Q=\gamma (T_1^4-T_2^4)$        (SBP)
where $\gamma > 0$.

The energy transfer is irreversible since it has a direction from warm to cold with $T_1>T_2$. It is here possible to view conduction as radiation at close distance and thus reduce the discussion to radiation. 

We can thus view the 2nd Law to be a consequence of (SBP), at least in the case of two bodies of different temperature: There is an irreversible transfer of heat energy from warm to cold. 

Proving the 2nd Law for radiation can thus be seen to boil down to proving (SBP). This was the task taken on by the young Max Planck, who after a long tough struggle presented a proof in 1900, which he however was very unhappy with, since it, like Boltzmann's H-theorem from 1872, was based on statistical mechanics and not classical deterministic physical mechanics.

But it is possible to prove (SBP) by replacing statistics with an assumption of finite precision computation in the form of Computational Blackbody Radiation. Radiative heat transfer is here seen to be geared as a deterministic threshold phenomenon like a semi-conductor allowing heat transfer only one-way from warm to cold. 

Another aspect of radiation is that it is impossible to completely turn off or block by shielding of some sort. It connects to the universality of blackbody radiation taking the same form independent of material matter, as shown here.

We are thus led to the following form of the 2nd Law without any statistics:
  • Radiative heat transfer from warm to cold is unstoppable and irreversible. 
The finite precision aspect here takes the form of a threshold, thus different from that operational in the case of turbulent dissipation into heat energy connecting to complexity with sharp gradients as discussed in recent posts.

PS To learn how statistical mechanics is taught at Stanford University by a world-leading physicist, listen to Lecture 1 and ask yourself if you get illuminated:
  • Statistical mechanics is useful for predictions in cases when you do not know the initial conditions nor the laws of physics.

2nd Law for Cosmology

A mathematical model of the Universe can take the form of Euler's equations for a gas supplemented with Newton's law of gravitation as stated in Chap 32 Cosmology of Computational Thermodynamics.  

Computational solutions of these equations satisfy the following evolution equations as laws of thermodynamics depending on time $t$ 

  • $\dot K(t)=W(t)-D(t)-\dot\Phi (t)$     (1)
  • $\dot E(t)=-W(t)+D(t)$,                  (2)
where $K(t)$ is total kinetic energy, $E(t)$ total internal energy (heat energy), $W(t)$ is total work, $D(t)\ge 0$ is total turbulent dissipation, $\Phi (t)$ is total gravitational energy and the dot signifies differentiation with respect to time. Adding (1) and (2) gives the following total energy balance:
  • $K(t)+E(t)+\Phi (t)= constant.$          (3)
Further (1) and (2) express an irreversible transfer of energy from kinetic to internal energy with $D(t)>0$, and so serve as a 2nd Law for Cosmology giving time a direction. Recall that the theoretical challenge is to tell/show why turbulent dissipation is unavoidable. 
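As a check of the bookkeeping in (1)-(3), here is a sketch that integrates (1)-(2) in time with arbitrary illustrative choices of $W(t)$, $D(t)\ge 0$ and $\Phi (t)$ (not solutions of Euler's equations), verifying that the total $K+E+\Phi$ stays constant while $D>0$ pumps energy irreversibly from kinetic to internal form.

```python
# Sketch of the energy bookkeeping (1)-(3): dK/dt = W - D - dPhi/dt, dE/dt = -W + D.
# W, D >= 0 and Phi are arbitrary illustrative functions, not solutions of Euler's equations.
import numpy as np

dt, steps = 1e-3, 10000
W = lambda t: np.sin(t)               # total work (illustrative)
D = lambda t: 0.1 * (1 + np.cos(t))   # total turbulent dissipation >= 0 (illustrative)
Phi = lambda t: 0.5 * np.exp(-t)      # total gravitational energy (illustrative)
dPhi = lambda t: -0.5 * np.exp(-t)    # its time derivative

K, E = 1.0, 1.0
total0 = K + E + Phi(0.0)
for i in range(steps):
    t = i * dt
    K += dt * (W(t) - D(t) - dPhi(t))   # (1)
    E += dt * (-W(t) + D(t))            # (2)
print(K + E + Phi(steps * dt) - total0)  # ~0: (3) holds up to time-stepping error
```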

Computations may start from a hot dense state at $t=0$ which is seen to expand/cool (run code) (Big Bang) to maximal size and then contract/warm back to a hot dense state (Big Crunch) (run code) in an irreversible sequence of expansions/contractions until some final stationary equilibrium state with $E(\infty )=\Phi (\infty )$. Compare with post from 2011.


Dark Matter as Axions as 85% of All Matter?

Sabine Hossenfelder in Exploding stars made of dark matter could heat up universe informs us about some new speculations about the physics of dark matter, believed to make up 85% of all matter in the Universe, in the form of    

  • axions or axion particles 
able to form 
  • axion stars
able to explode and so able to  
  • heat surrounding gas 
which could be a detectable phenomenon. Sabine ends asking how it is possible that physicists can be paid for this kind of speculation. 

Compare with the idea I have suggested that matter with density $\rho (x,t)=\Delta \phi (x,t)$ is formed from a gravitational potential $\phi (x,t)$ locally in space-time with coordinates $(x,t)$ from differentiation expressed by the Laplacian differential operator $\Delta$, and that dark matter corresponds to large regions where the potential is smooth in the sense that $\Delta \phi (x,t)$ is not large enough to create matter which is visible. 

It is conceivable that such large regions could concentrate gravitationally and even form stars which could explode as in the above scenario. Is anyone willing to pay for this idea? Does it make sense? 

söndag 10 mars 2024

Three Elephants in Modern Physics

Modern theoretical physicists busy at work handling the crisis.

There are three elephants in the crisis room of modern physics:

  1. special relativity
  2. 2nd law of thermodynamics
  3. foundations of quantum mechanics 

which for many years have no longer been discussed in the physics community, not because they have long since been resolved, but because they remain open problems with no progress for 100 years. 

Only crackpot amateur physicists still debate these problems on fringe sites like the John Chappell Natural Philosophy Society. Accordingly these topics are no longer part of core theoretical physics education, since questions from students cannot be answered. This may seem strange, but it has come to be an agreement within the physics community to live with. Discussion closed. Research submitted to leading journals on these topics will be rejected, without refereeing.

Is it possible to understand why no progress has been made? Why is there a crisis?

As concerns special relativity the reason is that it is not a theory about real physics, but a theory about observer perceptions of "events" identical to space-time coordinates without physical meaning. This makes special relativity into a game following some ad hoc rules without clear physics, pretending to describe a world which shows to be very very strange. Unless of course you realise that it is just a game and not science, but then nothing to teach at a university.  

As concerns topics 2 and 3 the reason is the introduction of statistical mechanics by Boltzmann, followed by Planck, as a last rescue when deterministic physics seemed to fail. Again the trouble is that statistical mechanics is physics in the eyes of observers as probabilities of collections of physical events without cause-effect, rather than specific real events with cause-effect. Like special relativity this makes statistical mechanics into a game according to some ad hoc rules without physical meaning. 

In all three cases the observer is given a new key role, as if the world depends on observation and cannot, as in classical deterministic physics, go ahead by itself without it. This makes discussion complicated since there is no longer any common ground for all observers, and so eventually discussion dies. Nothing more to say. Everybody agrees that there is nothing to disagree about. Everything in order. 

Schrödinger as the inventor of the Schrödinger equation for the Hydrogen atom with one electron as a deterministic classical continuum mechanical model, was appalled by Born's statistical interpretation of the multi-dimensional generalisation of his equation to atoms with several electrons defying physical meaning, and so gave up and turned to more fruitful pastures like the physics of living organisms.

But the elephants are there even if you pretend that they are not, and that is not a very healthy climate for scientific progress. You find my attempts to help out on this blog.  
  

lördag 9 mars 2024

Challenges to the 2nd Law

The book Challenges to the Second Law of Thermodynamics by Capek and Sheehan starts out describing the status of this most fundamental law of physics as of 2005:

  • For more than a century this field has lain fallow and beyond the pale of legitimate scientific inquiry due both to a dearth of scientific results and to a surfeit of peer pressure against such inquiry. 
  • It is remarkable that 20th century physics, which embraced several radical paradigm shifts, was unwilling to wrestle with this remnant of 19th century physics, whose foundations were admittedly suspect and largely unmodified by the discoveries of the succeeding century. 
  • This failure is due in part to the many strong imprimaturs placed on it by prominent scientists like Planck, Eddington, and Einstein. There grew around the second law a nearly impenetrable mystique which only now is being pierced.
The book then continues to present 21 formulations of the 2nd Law followed by 20 versions of entropy and then proceeds to a large collection of challenges, which are all refuted, starting with this background:
  • The 2nd Law has no general theoretical proof.
  • Except perhaps for a dilute gas (Boltzmann's statistical mechanics), its absolute status rests squarely on empirical evidence.  
We learn that modern physics, when confronted with the main unresolved problem of classical physics, reacted by denial and oppression as a cover-up of a failure of monumental dimensions. The roots of the present crisis of modern physics may hide here. 

Computational Thermodynamics seeks to demystify the 2nd Law as a result of finite precision computation meeting systems developing increasing complexity like turbulence in slightly viscous flow.  

Physicists confronted with proving the 2nd Law. 



fredag 8 mars 2024

2nd Law vs Friction

Perpetual motion is viewed to be impossible because according to the 2nd Law of Thermodynamics (see recent posts)

  • There is always some friction.     (F)
Is this true? It does not appear to be true in the microscopics of atoms in a stable ground state, where apparently electrons move around (if they do) without losing energy, seemingly without friction, see Real Quantum Mechanics.

But is it true in macroscopics of many atoms or molecules such as that of fluids? A viscous fluid in motion meets a flat plate boundary with a skin friction depending on the Reynolds number $Re\sim \frac{1}{\nu}$ with $\nu$ viscosity as follows based on observation:


We see that the skin friction coefficient $c_f$ in a laminar boundary layer tends to zero as $Re$ tends to infinity (or viscosity tends to zero), which is supported by basic mathematical analysis. 

We also see that there is a transition from a laminar to a turbulent boundary layer for $Re > 5\times 10^5$, with a larger turbulent skin friction coefficient $c_f\approx 0.002$ with a very slow decay with increasing $Re$ and no observations of $c_f<0.001$. 

The transition to a turbulent boundary layer is the result of inherent instability of the motion of a fluid with small viscosity, which is supported by mathematical analysis, an instability which cannot be controlled in real life, see Computational Turbulent Incompressible Flow.

We thus get the message from both observation and mathematical analysis that (F) as concerns skin friction is true: Skin friction in a real fluid does not tend to zero with increasing $Re$, because there is always transition to a turbulent boundary layer with $c_f>0.001$.  

Laminar skin friction is vanishing in the limit of infinite $Re$, but not turbulent skin friction.
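The figure can be indicated by the standard textbook flat-plate correlations for the average skin friction coefficient, laminar (Blasius) $c_f\approx 1.328/\sqrt{Re}$ versus turbulent $c_f\approx 0.074\,Re^{-1/5}$; these correlations are common approximations and not results of the analysis above.

```python
# Textbook flat-plate correlations for the average skin friction coefficient:
# laminar (Blasius) c_f = 1.328/sqrt(Re), turbulent c_f = 0.074*Re^(-1/5).
# These standard approximations illustrate the figure; they are not derived here.
import numpy as np

def cf_laminar(Re):
    return 1.328 / np.sqrt(Re)

def cf_turbulent(Re):
    return 0.074 * Re ** (-0.2)

for Re in (1e5, 5e5, 1e6, 1e7, 1e8):
    print(f"Re = {Re:.0e}:  laminar {cf_laminar(Re):.4f}   turbulent {cf_turbulent(Re):.4f}")
# laminar c_f keeps falling toward zero, turbulent c_f stays around 0.002-0.003
```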

We can thus rationalise the 2nd Law for fluids as the presence of unavoidable skin friction from turbulent motion resulting from uncontrollable instability, which does not tend to zero with increasing $Re$, reflecting the presence of a non-zero limit of turbulent dissipation in the spirit of Kolmogorov.  

Connecting to finite precision computation discussed in recent posts, we understand that computational resolution to physical scale is not necessary, which makes turbulent flow computable without turbulence modelling. 

In bluff body computations (beyond drag crisis) it is possible to effectively set skin friction to zero as a slip boundary condition thus avoiding having to resolve turbulent boundary layers. In pipe flow skin friction cannot be set to zero.

PS1 A hope of vanishingly small laminar skin friction has been nurtured in the fluid mechanics community, but the required instability control has shown to be impossible, as an expression of the 2nd Law. 

PS2 One may ask if motion without friction is possible at all? Here we face the question what motion is, connecting to wave motion as an illusion. In Newtonian gravitation of a Platonic ideal world there is no dissipation/friction, but in the real world there is: The Moon is slowly receding from the Earth because of tidal motion energy loss. What then about the supposed motion of photons at the speed of light seemingly without energy loss from friction? Is this motion illusion, with necessary energy loss to receive a light signal? Computational Blackbody Radiation says yes! The 2nd Law can thus be formulated: 
  • There is no motion without friction. Photons in frictionless motion is illusion.
Do you buy this argument? Can you explain why? Instability? What is a photon in motion without friction? Illusion?

Why is there always some friction?


onsdag 6 mars 2024

2nd Law for Radiative Heat Transfer as Finite Precision Physics

Transfer of heat energy from warm to cold by electromagnetic waves.

This is a continuation of recent posts on the 2nd Law of Thermodynamics.

There is a 2nd Law for radiative heat transfer expressing:  

  • Heat energy is transferred by electromagnetic waves from a body with higher temperature to a body with lower temperature, not the other way.  (*) 
Why is that? Standard physics states that it is a consequence of Planck's law of radiation based on statistics of energy quanta, as an analog of Boltzmann's proof of a 2nd Law based on statistical mechanics. The objections raised to Boltzmann's proof carry over to that of Planck, who was very unhappy with his proof but not as unhappy as Boltzmann with his. 

An approach without statistics is presented on Computational Blackbody Radiation, where (*) appears as a high frequency cut-off increasing with temperature. The effect is that only frequencies above the cut-off for the body with lower temperature have a heating effect, resulting in one-way transfer of heat from warm to cold. For more details, check out this presentation. 

The high-frequency cut-off can be seen as an expression of finite precision increasing with temperature of atomic oscillation as heat energy. One-way heat transfer is thus a threshold phenomenon connected to finite precision.

Similarly, the photoelectric effect can be explained as a threshold phenomenon connected to finite precision, where only light of sufficiently high frequency can produce electrons. 

A 2nd Law based on finite precision physics thus can serve a role in both fluid mechanics and electromagnetics, and also in quantum mechanics as discussed in this post.  

In other words, finite precision physics in analog or digital form appears as the crucial aspect giving  meaning to a universal 2nd Law, which is missing in standard physics with infinite precision. 

The general idea is to replace statistical physics, which is not real physics, by finite precision computation, which can be both analog and digital physics. 

Of course, this idea will not be embraced by analytical mathematicians or theoretical physicists working with infinite precision...

The 2nd Law in a World of Finite Precision

Let there be a World of Finite Precision.

Here is a summary of aspects of the 2nd Law of Thermodynamics discussed in recent posts: 

  • 2nd Law gives an arrow of time or direction of time. 
  • A dissipative system satisfies a 2nd Law.
  • A dissipative system contains a diffusion mechanism decreasing sharp gradients by averaging. 
  • Averaging is irreversible since an average does not display how it was formed. 
  • Averaging/diffusion destroys ordered structure/information irreversibly. 
  • Key example: Destruction of large scale ordered kinetic energy into small scale unordered kinetic energy as heat energy in turbulent viscous dissipation.    
To describe the World, it is not sufficient to describe dissipative destruction, since also processes of construction are present. These are processes of emergence where structures like waves and vortices with velocity gradients are formed in fluids, solid ordered structures are formed by crystallisation and living organisms develop. 

The World then appears as a combat between anabolism as building of ordered structure and catabolism as destruction of ordered structure into unordered heat energy. 

The 2nd Law states that destruction cannot be avoided. Perpetual motion is impossible. There will always be some friction/viscosity/averaging present which makes real physical processes irreversible with an arrow of time. 

The key question is now why some form of friction/viscosity/averaging cannot be avoided? There is no good answer in classical mathematical physics, because it assumes infinite precision and with infinite precision there is no need to form averages since all details can be kept. In other words, in a World of Infinite Precision there would be no 2nd Law stating unavoidable irreversibility, but its existence would not be guaranteed.  

But the World appears to exist and then satisfies a 2nd Law, and so we are led to an idea of an Analog World of Finite Precision, which possibly can be mimicked by a Digital World of Finite Precision (while a possibly non-existing World of Infinite Precision cannot). 

The Navier-Stokes equation for a fluid/gas with positive viscosity as well as Boltzmann's equations for a dilute gas are dissipative systems satisfying a 2nd Law with positive dissipation. But why positive viscosity? Why positive dissipation?

The Euler equations describe a fluid with zero viscosity, which formally in infinite precision is a system without dissipation violating the 2nd Law.  

We are led to consider the Euler equations in Finite Precision, which we approach by digital computation to find that computational solutions are turbulent with positive turbulent dissipation independent of mesh size/precision once sufficiently small. We understand that the presence of viscosity/dissipation is the result of a necessary averaging to avoid the flow blowing up from increasingly large velocity gradients emerging from convection mixing high and low speed flow. 

We thus explain the emergence of positive viscosity in a system with formally zero viscosity as a necessary mechanism to allow the system to continue to exist in time. 

The 2nd Law thus appears as being a mathematical necessity in an existing World of Finite Precision.   

The mathematical details of this scenario in the setting of Euler's equations are described in the books Computational Turbulent Incompressible Flow, Computational Thermodynamics and Euler Right.