
Wednesday 4 December 2024

Mystery of Time Resolved?

After millennia of inquiry by the most able scientists, the concept of time still harbours mystery, which is the theme of these recent podcasts: 
We all experience that time moves forward from a past through a now to a future, named the Arrow of Time or the 2nd Law of Thermodynamics. Why does our life always move forward to a new day, never rewind to yesterday? Why are our lives irreversible?

In terms of physics the question can be formulated as follows:
  • Why can a macroscopic system based on reversible microscopic physical laws be irreversible? 
Ludwig Boltzmann took on the challenge in the late 19th century, opening the door to modern physics, and after much struggle came up with an explanation in terms of statistics, as a form of statistical mechanics based on Newtonian mechanics augmented with the throw of dice. Forward motion of time is here favoured by being more probable than backward motion, thus showing an Arrow of Time, as discussed by David Albert.

Stephen Wolfram presents another explanation in terms of computational irreducibility.

My own explanation, exposed in detail on this blog, connects to Wolfram's in the sense that physics as computation is given preference over the conventional view based on exact laws of physics. The basic idea is:
  1. A physical system changes from one state to the next through a computational process, which has finite precision and so involves some destruction of information (chopping of decimals).
  2. If the complexity of the system increases under repetition of the process, the loss of information becomes so big that reversal of the process is impossible (see the sketch below).
  3. A prime example is the development of turbulence in laminar fluid flow, which is irreversible. The laminar state cannot be recovered from the turbulent.  
For a popular exposition of this idea see The Clock and the Arrow.
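
Here is a minimal numerical sketch of points 1 and 2 (a toy model of my own, not from the book): Arnold's cat map on the unit square is exactly reversible in exact arithmetic, but if the state is rounded to a fixed number of decimals after each step (chopping of decimals), running the map forward and then backward does not return the initial state.

# Toy model: a reversible map becomes irreversible under finite precision.
# Arnold's cat map and its exact inverse on the unit square; the state is
# rounded ("decimals chopped") to 'digits' decimals after every step.

def forward(x, y, digits=6):
    x, y = (2*x + y) % 1.0, (x + y) % 1.0
    return round(x, digits), round(y, digits)

def backward(x, y, digits=6):
    x, y = (x - y) % 1.0, (-x + 2*y) % 1.0
    return round(x, digits), round(y, digits)

x0, y0 = 0.123456, 0.654321
x, y = x0, y0
for _ in range(40):                 # forward in time
    x, y = forward(x, y)
for _ in range(40):                 # attempt to rewind
    x, y = backward(x, y)

print("initial state :", (x0, y0))
print("after rewind  :", (x, y))    # bears no relation to the initial state

The exact map is reversible, but each rounded step destroys information which the expansive dynamics amplifies, so the computed process has an arrow of time.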


Tuesday 8 October 2024

NotebookLM Reads Computational Thermodynamics

Below is what Google NotebookLM has to say about Computational Thermodynamics. Take a look yourself and see if you agree or not! 

Assessment of Claes Johnson's Scientific Work based on "Computational Thermodynamics"

The provided excerpts from Computational Thermodynamics offer insights into certain aspects of Claes Johnson's scientific work, particularly his approach to computational thermodynamics and his critique of classical thermodynamics and statistical mechanics.

Computational Thermodynamics and EG2:

The excerpts highlight Johnson's development of "Computational Thermodynamics," a framework that seeks to model and understand thermodynamic processes using computational fluid dynamics, specifically a method called EG2 (Euler General Galerkin). This approach emphasizes the role of computation in analyzing complex fluid flow phenomena, including turbulence and shocks, which are central to Johnson's view of thermodynamics.

EG2 as Tool of Understanding Real Processes: 
The sources suggest that EG2 enables the simulation of real thermodynamic processes, including those in heat engines, heat pumps, and refrigerators, with a focus on computing losses and efficiency. Johnson emphasizes the limitations of classical thermodynamics, which primarily deals with equilibrium states and struggles to accurately predict transitions between them, particularly in the presence of turbulence.

Shift from A Priori to A Posteriori Analysis:
Johnson argues that EG2 facilitates a shift from a priori predictions based on analytical mathematics to a posteriori analysis of computed turbulent solutions. This implies a greater reliance on computational results to gain insights into the behavior of complex systems, rather than relying solely on theoretical deductions.

Wellposedness and Stability: 
A key aspect of Johnson's approach is the concept of "wellposedness" in the context of EG2 solutions. He emphasizes the importance of determining which aspects of turbulent solutions are stable under perturbations, implying that these stable outputs are the physically meaningful ones. He proposes using dual linearized problems to assess the stability and quality of EG2 solutions.

Critique of Classical Thermodynamics and Statistical Mechanics:
Johnson's work expresses a clear skepticism towards classical thermodynamics and statistical mechanics, particularly regarding their explanations of entropy and the arrow of time.

Entropy as a Problematic Concept: 
The sources present numerous criticisms of the concept of entropy, highlighting its ambiguous physical meaning and the lack of a clear sensor for it in nature. Johnson questions the reliance on statistical mechanics to rationalize the second law of thermodynamics, viewing it as a "capitulation away from prediction by cause-effect."

Finite Precision Computation as an Alternative: 
Instead of statistical mechanics, Johnson proposes finite precision computation as a more fundamental basis for understanding irreversibility and the arrow of time. This idea, while not fully elaborated in the excerpts, suggests that the inherent limitations of computational processes, even in simulating reversible equations, could give rise to irreversible behavior.

Applications and Examples:
The sources illustrate Johnson's ideas through various applications and examples, showcasing the potential of his computational approach. These include simulations of:

Joule's 1845 Experiment: 
Johnson uses EG2 to analyze Joule's experiment involving the expansion of a gas, demonstrating how the method captures the dynamics of the process, including turbulence and the resulting temperature changes. He argues that this computational approach provides a more complete understanding compared to classical thermodynamics.

Heat Engines, Pumps, and Refrigerators:
Johnson emphasizes the application of EG2 to simulate and analyze real thermodynamic devices. He highlights the importance of accounting for turbulent dissipation in determining efficiency, contrasting his approach with the idealized Carnot cycle.

Cosmology and Self-Gravitating Gases: 
The sources mention Johnson's work on simulating cosmological processes, specifically using EG2 to model self-gravitating gases. This application suggests the broad scope of his computational framework.

Overall Assessment and Open Questions:
Based on the excerpts from "ambsthermo.pdf," Claes Johnson's scientific work presents a distinct perspective on thermodynamics, emphasizing computational methods and challenging established notions of entropy and irreversibility. His development of EG2 and its application to various physical systems demonstrate a focus on understanding real processes through computation. However, several open questions remain:

Rigorous Mathematical Foundation for Finite Precision: The excerpts provide limited details on the mathematical formalism of Johnson's concept of finite precision computation and its connection to the emergence of irreversibility. A deeper exploration of this foundation is crucial to assess its scientific validity.

Validation of EG2 and its Predictive Power: While the sources mention applications of EG2, further evidence of its validation against experimental results and its ability to make testable predictions would strengthen the framework's credibility.

Engagement with Criticisms and Alternative Perspectives: 
Addressing criticisms and engaging with alternative viewpoints within the scientific community are essential for the broader acceptance of Johnson's ideas.

In conclusion, Claes Johnson's work in computational thermodynamics, as presented in the excerpts, offers a thought-provoking challenge to traditional views, highlighting the potential of computational methods in understanding complex physical phenomena. However, a more comprehensive assessment of his contributions requires further investigation into the mathematical rigor, empirical validation, and engagement with the broader scientific discourse surrounding his ideas.

Thursday 27 June 2024

Does a Modern Physicist Know Classical Physics?

Is it possible that fundamental physics can be reduced to combinations of 

  • Gravitation governed by Newtonian Mechanics (NM).
  • Electromagnetics governed by Maxwell's equations and Coulomb's Law (EM). 
This is true for classical physics, while modern physics is commonly viewed to need other forms of fundamental physics such as Special/General Relativity SR/GR and Quantum Mechanics QM. The trouble with modern physics is that GR and QM have for 100 years been understood to be incompatible/contradictory with no resolution in sight, which has caused a crisis of modern physics witnessed by many leading physicists, but at the same time denied. The contradiction has driven physicists to seek resolutions on the very small scales of $10^{-34}$ m of QM as String Theory, and on the very large scales of the whole Universe as GR, without progress for 50 years, both beyond any form of direct experimental confirmation, and thus forms of speculation. 

Of course there were reasons perceived to step out of the NM+EM paradigm, which had worked so amazingly well for all of classical physics, at the turn to modern physics at the beginning of the 20th century. Here is where classical physics stumbled:
  1. Instant action at distance in NM: (Einstein GR)
  2. Irreversibility in thermodynamics (2nd Law): (Boltzmann Statistics)
  3. Absence of the ultra-violet catastrophe in black-body radiation: (Planck Statistics)
  4. Null result of the Michelson-Morley experiment: (Einstein SR)
1 was the classical problem left unresolved by Newton, which did not stop classical physics from booming, with 1 and 4 supposedly resolved by Einstein as GR/SR.

2 came out of observations of irreversible transfer of mechanical energy to heat energy in contradiction to the fact that the laws of NM and EM are formally reversible. Boltzmann used a big hammer to resolve this paradox in the form of statistical physics followed by Planck's statistics to explain 3: The very essence of classical physics as deterministic cause-effect physics was given up in a Faustian deal. This started the Fall of Physics. 

3 and 4 were essentially null results, which do not serve well as stepping stones to progress. 

Modern physics thus grew out from efforts to resolve 2-3 by introducing entirely new physics based on statistics taking the form of QM, and SR/GR to resolve 1 and 4.  

Once the Fall was made there was no limit to what new physics could be invented which culminated at the end of the 20th century after 100 years of free fall, with the Standard Model and String Theory beyond observation. The atomic bomb served to give theoretical physicists unlimited resources to create new physics. But the fundamental problems 1-4 were left without credible answers, with only deepened mystery.

In books and blog posts I have suggested resolutions of 1-4 within classical physics. Theoretical physicists have not shown any openness to any form of discussion. Is the reason that a modern physicist does not have to know much about classical physics/mathematics, because it has been replaced by modern physics, like the epicycles of Ptolemy? To understand if 1-4 cannot, or in fact can, be resolved within classical deterministic physics, seems to me to require solid knowledge of classical physics. Is this included in the curriculum for physics education today? Or is it primarily focussed on SR/GR and QM? 

The less you know, the more certain you can be that you are right. (Dunning-Kruger effect)

Steven Weinberg, in Dreams of a Final Theory, unhappy with the linearity of QM, sought an alternative but failed:

“This theoretical failure to find a plausible alternative to quantum mechanics, even more than the precise experimental verification of linearity, suggests to me that quantum mechanics is the way it is because any small change in quantum mechanics would lead to logical absurdities. If this is true, quantum mechanics may be a permanent part of physics. Indeed, quantum mechanics may survive not merely as an approximation to a deeper truth, in the way that Newton’s theory of gravitation survives as an approximation to Einstein’s general theory of relativity, but as a precisely valid feature of the final theory.”

In the next post I will briefly indicate how 1-4 can be explained within NM+EM as if that could be the final theory.

Tuesday 30 April 2024

Crisis of Modern Statistical Physics vs Classical Deterministic Physics

This is a further comment on Leibniz's Principle of Identity of Indiscernibles, seemingly in conflict with the mainstream idea of modern physics that electrons are all alike, like equal probabilities of outcomes of tossing a fair coin. 




That modern physics is in a state of deep crisis is acknowledged by leading physicists and also largely understood by the general public. Buzz words like dark energy, dark matter, inflation, Big Bang, multiverse, entanglement, collapse of the wave function, particles and quarks, are floating around as elements of relativity theory on cosmological scales and quantum mechanics on atomic scales, both formed 100 years ago but still today harbouring toxic unresolved foundational problems, and on top of that being incompatible. A veritable mess. 

The root of the seemingly unresolvable problems of quantum mechanics can be traced back to the statistical interpretation of the multi-dimensional Schrödinger wave function as solution to the multi-dimensional Schrödinger equation serving as foundation. 

While classical physics is ontology about what reality is, modern physics is epistemology about what can be said. While classical physics is deterministic physics independent of human observer, modern physics in the form of quantum mechanics is statistical physics depending on human observers acting as mathematical statisticians in an insurance company busy computing insurance premiums. 

The departure from classical into modern physics was initiated by Boltzmann in the late 19th century seeking an ontological realistic explanation of the 2nd Law of Thermodynamics as the main unresolved problem of classical physics giving time a direction, which had to be resolved to save physics from disbelief. When Boltzmann understood that he could not reach this main goal of his scientific life, he made a Faustian deal in the form of an explanation based on statistical mechanics. This served to save the life of physics, but not Boltzmann's own life, and opened the door into the heaven of modern physics as quantum mechanics as statistical mechanics, which is now in a state of crisis. 

The step from deterministic physics to statistical physics was taken in order to save classical physics from credibility collapse in front of the 2nd Law. The medication worked for the moment, but the patient, classical physics, died and was replaced by modern physics, which however turned out to be quite sick without any cure in sight still today. 

The first step in coming to grips with the crisis of modern physics is to ask if it really is impossible to explain the 2nd Law within classical deterministic physics. If not, then the step to statistics is not necessary and much trouble can be avoided. More precisely, it appears to be possible to replace statistics by a concept of finite precision physics as presented in Computational Thermodynamics and in popular form in The Clock and the Arrow, with follow up into a realistic deterministic form of quantum mechanics as Real Quantum Mechanics.

This means a return to deterministic physics with a new element of finite precision computational physics coming with resolutions of problems of classical physics making it possible to avoid paying the very high price of taking the drug of statistical physics. 

Real physics is what it is and is given to us for free. Statistical physics is man-made physics, which needs massive data and human interference. Real physics seeks to describe the World as it is, while modern physicists have the reduced goal of statistically predicting outcomes of man-made experiments. Schrödinger and Einstein could not accept physics as man-made statistics, but were cancelled. Maybe the present crisis can open for a restart in their spirit?  

We may view real physics as a form of engineering or professional soccer game with basic questions: What is the basic mechanism/principle? How to improve it? On the other hand, a statistical physicist simply watches the game on TV and finds meaning in betting.  


Thursday 25 April 2024

Temperature as Quality Measure of Energy.

In ideal gas dynamics temperature appears as an intensive variable $T$ connected to internal energy $e$ and density $\rho$ by 

  • $T=\frac{e}{\rho}$                          
with a corresponding pressure law 
  • $p=\gamma e$
where $\gamma$ is a gas constant. Internal energy is viewed as small scale kinetic energy from small scale molecular motion. Internal energy can be transformed into mechanical work in expansion, which without external forcing (or gravitation) is an irreversible process.  

For a solid body viewed as a vibrating atomic lattice temperature scales with total internal energy as the sum of small scale kinetic energy and potential energy, which can be transferred by radiation and conduction to a body of lower temperature.   

In both cases temperature appears as a quality measure of internal energy as an intensive variable. 

The maximal efficiency of a Carnot heat engine transforming heat energy into work operating between two temperatures $T_{hot}>T_{cold}$ is equal to $1-\frac{T_{cold}}{T_{hot}}$. 

Radiative heat transfer from a hot body of temperature $T_{hot}$ to a cold body of temperature $T_{cold}$ scales with $(T_{hot}^4-T_{cold}^4)$ according to Stefan-Boltzmann-Planck. 

Conductive heat transfer scales with $(T_{hot}-T_{cold})$ according to Fourier.

In both cases the heat transfer from hot to cold can be seen as transformation from high quality energy into low quality energy in an irreversible process in conformity with the 2nd Law of Thermodynamics. 

The Nobel Prize in Physics in 2006 was awarded for the experimental detection of the Cosmic Microwave Background CMB radiation with perfect Planck spectrum as an after-glow of a Big Bang, with temperature of 2.725 K and corresponding very low quality energy.  

With radiation scaling with $T^4$ the difference between 300 K as global temperature and 3 K as deep space CMB comes out with a factor of $10^{-8}$. The contribution to global warming from CMB thus appears to be very small. 
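
A quick arithmetic check of the $T^4$ factor above and of the Carnot formula from the earlier paragraph (the 600 K/300 K engine temperatures are just illustrative values of mine):

# Quick check of the scaling claims above.
T_hot, T_cold = 300.0, 3.0              # global temperature vs deep space CMB (K)
print((T_cold / T_hot)**4)              # = 1e-8, the factor quoted above

def carnot_efficiency(T_hot, T_cold):   # maximal efficiency 1 - T_cold/T_hot
    return 1.0 - T_cold / T_hot

print(carnot_efficiency(600.0, 300.0))  # 0.5 for an engine between 600 K and 300 K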

We see from $e=\rho T$ that low density and low temperature both connect to low energy quality making both wind and solar energy inefficient compared to fossil and nuclear energy.    


Friday 19 April 2024

The Ultra-Violet Catastrophe vs 2nd Law of Thermodynamics


Classical physics peaked in the late 19th century with Maxwell's equations aiming to describe all of electromagnetics as a form of continuum wave mechanics, but crumbled when confronted with the Ultra-Violet Catastrophe UVC of heat radiation from a body of temperature $T$ scaling like $T\nu^2$ with frequency $\nu$, threatening to turn everything into flames without an upper bound for frequencies, because wave mechanics did not seem to offer any escape from UVC.  

Planck took on the role of saving physics from the looming catastrophe, but could not find a resolution within deterministic wave mechanics and so finally gave up and resorted to statistical mechanics with high frequencies less likely, in the spirit of Boltzmann's thermodynamics and 2nd Law with order less likely than disorder. 

There is thus a close connection between UVC and the 2nd Law. Boltzmann would say that the reason we do not experience UVC is that high frequencies are not likely, but the physics of why is missing. Explaining that UVC is not likely would not explain why there is no observation of UVC whatsoever. 

I have followed a different route replacing statistics by finite precision physics for UVC (and similarly for 2nd Law), where high frequencies with short wave length cannot be radiated because finite precision sets a limit on the frequencies an atomic lattice can carry as coordinated synchronised motion. In this setting UVC can never occur.

A basic mission for a 2nd Law is thus to prevent UVC. This gives 2nd Law deeper meaning as a necessary mechanism preventing too fine structures/high frequencies to appear and so cause havoc. 2nd Law is thus not a failure to maintain order over time, but a necessary mechanism to avoid catastrophe from too much order. 

Similarly, viscosity and friction appear as necessary mechanisms destroying fine structure/order in order to let the World continue, and so not only as defects of an ideal physics without viscosity and friction. This is the role of turbulence as described in Computational Turbulent Incompressible Flow and Computational Thermodynamics.

We can compare with the role of the interest rate in an economy, where the zero interest rate of an ideal economy leads to catastrophe over time. If there is no cost of getting access to capital, any crazy mega project can get funding and catastrophe will follow. This was the idea 2008-2023 preceding the collapse predicted for 2025. Too little friction makes the wheels turn too fast. Too much idealism leads to ruin.

Tuesday 26 March 2024

Thermodynamics of Friction

Everything goes around in construction-deconstruction-construction...

In the previous post we considered viscosity in laminar and turbulent flow and friction between solid bodies as mechanisms for irreversible transformation of large scale kinetic motion/energy into small scale kinetic motion/energy in the form of heat energy, noting that the transformation cannot be reversed since the required very high precision cannot be realised, everything captured in a 2nd Law of Thermodynamics.  

Let us consider the generation of heat energy in friction when rubbing your hands or sliding an object over a floor or pulling the handbrakes of your bicycle. We understand that the heat energy is created from the work done by force times displacement (in the direction of the force), like pressing/pushing sandpaper over the surface of a piece of wood to smoothen the surface by destroying its granular micro-structure. Work is thus done to destroy more or less ordered micro-structure and the work shows up as internal heat energy as unordered micro-scale kinetic energy. 

The key is here destruction of micro-structure into heat energy in a process which cannot be reversed since the required precision cannot be met.

Skin friction between a fluid and solid acts like friction between solids. 

Turbulent flow transforms large scale ordered kinetic energy into small-scale unordered kinetic energy as heat energy under the action of viscous forces. Laminar flow also generates heat energy from friction between layers of fluid of different velocity.

In all these cases heat energy is generated from destruction/mixing of order/structure in exothermic irreversible processes. This destruction is balanced by constructive processes like synchronisation of atomic oscillations into radiation and emergence of ordered structures like vortices in fluid flow and endothermic processes of unmixing/separation. 

We thus see exothermic processes of destruction followed by endothermic construction, which is not reversed deconstruction, with different time scales where deconstruction is fast and brutal without precision and construction is slow with precision. This is elaborated in The Clock and the Arrow in popular form. Take a look.

 

Sunday 24 March 2024

Exergy as Energy Quality


Kinetic energy, electrical energy, chemical and nuclear energy can all be converted fully into heat energy, while heat energy can only be partially converted back again. This is captured in the 2nd Law of Thermodynamics. We can thus say that heat energy is of lower quality compared with the other forms. More generally, the term exergy is used as a measure of quality of energy of fundamental importance for all forms of life and society as ability to do work.

We can make this more precise by recalling that the quality of heat energy comes to expression in radiative and conductive heat transfer from a body B1 of temperature $T_1$ to a neighbouring body B2 of lower  temperature $T_2<T_1$ in basic cases according to Stefan-Boltzmann's Law or Fourier's Law:

  • $Q = (T_1^4-T_2^4)$            (SB)
  • $Q = (T_1-T_2)$                    (F)
with $Q$ heat energy per time unit. Heat energy of higher temperature can thus be considered to have higher quality than heat energy of lower temperature, which of course also plays a role in the conversion of heat energy to other forms of energy. The maximal efficiency of a heat engine operating between $T_1$ and $T_2$ and transforming heat energy to mechanical work is equal to $\frac{T_1-T_2}{T_1}$, displaying the higher quality of $T_1$.

Heat energy at high temperature is the major source for useful mechanical work supporting human civilisation, while heat energy at lower temperatures appears as a useless loss e g in the cooling of a gasoline engine.

But what is the real physics behind (SB) and (F)? This question was addressed in a previous post viewing (F) as a special case of (SB), with the physics behind (SB) displayed in the analysis of Computational Blackbody Radiation.

The essence of this analysis is a high-frequency cut-off $\frac{T}{h}$ allowing a body of temperature $T$ to only emit frequencies $\nu <\frac{T}{h}$, where $h$ is a constant. This allows a body B1 of temperature $T_1$ to transfer heat energy to a body B2 of lower temperature $T_2$ via frequencies $\frac{T_2}{h}<\nu <\frac{T_1}{h}$, which cannot be balanced by emission from B2.  
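
A small numerical sketch of this cut-off picture, under the assumptions already stated on this blog: spectral intensity scaling like $T\nu^2$ below the cut-off $\frac{T}{h}$ and zero above it; the constants $\gamma$ and $h$ are set to one since only the scaling matters here.

import numpy as np

gamma, h = 1.0, 1.0                      # arbitrary units; only scaling matters

def emission(T, n=100001):
    # total emission of a body at temperature T: integrate gamma*T*nu^2
    # over the admissible frequencies nu < T/h
    nu = np.linspace(0.0, T / h, n)
    return np.trapz(gamma * T * nu**2, nu)

# The cut-off turns the T*nu^2 spectrum into a total emission scaling like T^4:
print(emission(2.0) / emission(1.0))     # ~ 16 = 2^4

# One-way transfer: frequencies between T2/h and T1/h are emitted by the warm
# body B1 but cannot be carried or re-emitted by the cold body B2.
T1, T2 = 2.0, 1.0
nu = np.linspace(T2 / h, T1 / h, 100001)
print(np.trapz(gamma * T1 * nu**2, nu))  # positive: net transfer only from warm to cold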

High frequency cut-off increasing linearly with temperature represents Wien's displacement law (W), giving improved exergy with increasing temperature.

The high-frequency cut-off can be seen as an expression of finite precision limiting the frequencies being carried and emitted by an oscillating atomic lattice in coordinated motion, with frequencies above cut-off being carried internally as heat energy as uncoordinated motion.

Higher temperature thus connects to higher quality heat energy or better exergy. The standard explanation of this basic fact is based on statistical mechanics, which is not physical mechanics. 

PS Radiative heat transfer without high-frequency cut-off would boil down to (F), while (SB) is what is observed, which gives support to (W).


Friday 22 March 2024

Thermodynamics of War and Peace


Opposing ordered armies at the moment before turbulent destruction. 

The recent posts on 2nd law of thermodynamics describe a process where increasing spatial gradients eventually reach a level (from convection and opposing flow) where further increase is no longer possible because it would bring the process to a brutal stop, and so some form of equilibration of spatial differences must set in where 

  • each particle tends to take on the mean-value of neighbouring particles.   (M)

This is the process in turbulent fluid flow transforming ordered large scale kinetic energy into small scale disordered kinetic energy taking the form of internal heat energy in a cascade of turbulent dissipation. Here (M) is necessary to avoid break-down into a stop. The flow or show must go on.

(M) is also the essence of the diffusion process of heat conduction seeking to decrease gradients, even if not absolutely necessary as in turbulent fluid flow.
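
A minimal sketch of (M) as repeated neighbour averaging (a toy discretisation of my own, not the actual model): a sharp jump is smoothed step by step, and the averaged profile carries no record of how it was formed.

# Toy illustration of (M): each interior value is repeatedly replaced by the
# mean of its two neighbours, which decreases sharp gradients. The result does
# not reveal the profile it came from, so the averaging cannot be undone.

profile = [0.0]*10 + [1.0]*10            # sharp jump in the middle

def average_step(u):
    return [u[0]] + [(u[i-1] + u[i+1]) / 2 for i in range(1, len(u) - 1)] + [u[-1]]

for _ in range(200):
    profile = average_step(profile)

print([round(v, 2) for v in profile])    # the sharp jump has been smoothed out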

It is natural to connect turbulence to the violent break-down of large scale ordered structures into rubble in a war, necessarily resulting from escalation of opposing military forces in direct confrontation, which at some level cannot be escalated further and so have to be dissipated in a war. 

It is then natural to connect the equilibration (M) in heat conduction to a geopolitical/parliamentary process in peace time, where each country/party takes on the mean value of neighbouring countries/parties keeping gradients small. 

While (M) is necessary in turbulence to let the flow go on, one may ask what the physics of (M) is in the case of heat conduction, and find the answer in this post. 

The mathematics is elaborated in: 

The geopolitical/parliamentary situation today evolves towards sharpened gradients, while politicians refuse to follow (M) and so there is a steady march towards break-down... 


Thursday 14 March 2024

2nd Law or 2-way Radiative Heat Transfer?

In the present discussion of the 2nd Law of Thermodynamics let us go back to this post from 2011 exposing the 19th century battle between 1-way transfer (Pictet) and 2-way transfer (Prevost) of heat energy by radiation, which plays a central role in climate science today. 

In 1-way transfer a warm body heats a colder body in accordance with the 2nd Law. 

In 2-way heat transfer both warm and cold bodies are viewed to heat each other, but the warm heats more and so there is a net transfer of heat energy from warm to cold. But a cold body heating a warm body violates the 2nd Law of thermodynamics, and so there is something fishy here. 

Yet the basic mathematical model of radiative heat transfer in the form of Schwarzschild's equation involves 2-way transfer of heat energy, in apparent violation of the 2nd Law. 

I have discussed this situation at length on this blog with tags such as 2nd law of thermodynamics and radiative heat transfer with more on Computational Blackbody Radiation.

If you worry about the 2nd Law, you can ask yourself how 2-way radiative heat transfer is physically possible, when it appears to violate the 2nd Law? What is false here: 2nd Law or 2-way heat transfer?

What is your verdict? 


Tuesday 12 March 2024

Philosophy of Statistical Mechanics?

Collapsed pillars of a modern building.

Let us continue with the post Three Elephants of Modern Physics taking a closer look at one of them. 

We learn from Stanford Encyclopedia of Philosophy the following about Statistical Mechanics SM: 

  • Statistical Mechanics is the third pillar of modern physics, next to quantum theory and relativity theory
  • Its aim is to account for the macroscopic behaviour of physical systems in terms of dynamical laws governing the microscopic constituents of these systems and probabilistic assumptions.
  • Philosophical discussions in statistical mechanics face an immediate difficulty because unlike other theories, statistical mechanics has not yet found a generally accepted theoretical framework or a canonical formalism. 
  • For this reason, a review of the philosophy of SM cannot simply start with a statement of the theory’s basic principles and then move on to different interpretations of the theory.
This is not a very good start, but we continue learning: 
  • Three broad theoretical umbrellas: “Boltzmannian SM” (BSM), “Boltzmann Equation” (BE), and “Gibbsian SM” (GSM).
  • BSM enjoys great popularity in foundational debates due to its clear and intuitive theoretical structure. Nevertheless, BSM faces a number of problems and limitations
  • There is no way around recognising that BSM is mostly used in foundational debates, but it is GSM that is the practitioner’s workhorse.
  • So what we’re facing is a schism whereby the day-to-day work of physicists is in one framework and foundational accounts and explanations are given in another framework.
  • This would not be worrisome if the frameworks were equivalent, or at least inter-translatable in relatively clear way...this is not the case.
  • The crucial conceptual questions (concerning BE) at this point are: what exactly did Boltzmann prove with the H-theorem?
This is the status today of the third pillar of modern physics formed by Boltzmann 1866-1906 and Gibbs 1902 as still being without a generally accepted theoretical framework, despite 120 years of deep thinking by the sharpest brains of modern physics. 

Is this something to worry about? If one of the pillars apparently is shaky, what about the remaining two pillars? Who cares?

Recall that SM was introduced to rationalise the 2nd Law of Thermodynamics stating irreversibility of macroscopic systems based on deterministic reversible exact microscopics. This challenge was taken up by Boltzmann facing the question: If all components of a system are reversible, how can it be that the system is irreversible? From where does the irreversibility come? The only way forward Boltzmann could find was to replace exact determinism of microscopics by randomness/statistics as a form of inexactness. 
 
In the modern digital world the inexactness can take the form of finite precision computation performed with a certain number of digits (e g single or double precision). Here the microscopics is deterministic up to the point of keeping only a finite number of digits, which can have more or less severe consequences on macroscopic reversibility. This idea is explored in Computational Thermodynamics offering a 2nd Law expressed in the physical quantities of kinetic energy, internal energy, work and turbulent dissipation without need to introduce any concept of entropy.

Replacing SM by precisely defined finite precision computation gives a more solid third pillar. But this is new and not easily embraced by analytical/theoretical mathematicians/physicists not used to thinking in terms of computation, with Stephen Wolfram as a notable exception.  

PS1 To meet the criticism that the Stosszahlansatz underlying the H-theorem, stating that particles before collision are uncorrelated, simply assumes what has to be proved (irreversibility), Boltzmann argued:
  • But since this consideration has, apart from its tediousness, not the slightest difficulty, nor any special interest, and because the result is so simple that one might almost say it is self-evident I will only state this result.
Convincing?

PS2 Connecting to the previous post, recall that the era of quantum mechanics was initiated in 1900 by Planck introducing statistics of "energy quanta" inspired by Boltzmann's statistical mechanics, to explain observed atomic radiation spectra, opening the door to Born's statistical interpretation in 1926 of the Schrödinger wave function as the "probability of finding an electron" at some specific location in space and time, which is the text book wisdom still today. Thus the pillar of quantum mechanics is also weakened by statistics. The third pillar of relativity is free of statistics, but also of physics, and so altogether the three pillars offer a shaky foundation of modern physics. Convinced? 

Monday 11 March 2024

The 2nd Law as Radiative Heat Transfer


The 2nd Law of Thermodynamics states that heat energy $Q$, without forcing, is transferred from a body of temperature $T_1$ to a body of temperature $T_2$ with $T_1>T_2$, by conduction according to Fourier's Law if the bodies are in contact: 

  • $Q =\gamma (T_1-T_2)$ 

and/or by radiation according to Stefan-Boltzmann-Planck's Law if the bodies are not in contact, as radiative heat transfer

  • $Q=\gamma (T_1^4-T_2^4)$        (SBP)
where $\gamma > 0$.

The energy transfer is irreversible since it has a direction from warm to cold with $T_1>T_2$. It is here possible to view conduction as radiation at close distance and thus reduce the discussion to radiation. 

We can thus view the 2nd Law to be a consequence of (SBP), at least in the case of two bodies of different temperature: There is an irreversible transfer of heat energy from warm to cold. 

To prove the 2nd Law for radiation can thus be seen to boil down to proving (SBP). This was the task taken on by the young Max Planck, who after a long tough struggle presented a proof in 1900, which he however was very unhappy with, since it, like Boltzmann's H-theorem from 1872, was based on statistical mechanics and not on classical deterministic physical mechanics.

But it is possible to prove (SBP) by replacing statistics with an assumption of finite precision computation in the form of Computational Blackbody Radiation. Radiative heat transfer is here seen to be geared as a deterministic threshold phenomenon, like a semi-conductor, allowing heat transfer only one-way from warm to cold. 

Another aspect of radiation is that it is impossible to completely turn off or block by shielding of some sort. It connects to the universality of blackbody radiation taking the same form independent of material matter, as shown here.

We are thus led to the following form of the 2nd Law without any statistics:
  • Radiative heat transfer from warm to cold is unstoppable and irreversible. 
The finite precision aspect here takes the form of a threshold, thus different from that operational in the case of turbulent dissipation into heat energy connecting to complexity with sharp gradients as discussed in recent posts.

PS To learn how statistical mechanics is taught at Stanford University by a world-leading physicist, listen to Lecture 1 and ask yourself if you get illuminated:
  • Statistical mechanics is useful for predictions in cases when you do not know the initial conditions nor the laws of physics.

2nd Law for Cosmology

A mathematical model of the Universe can take the form of Euler's equations for a gas supplemented with Newton's law of gravitation as stated in Chap 32 Cosmology of Computational Thermodynamics.  

Computational solutions of these equations satisfy the following evolution equations as laws of thermodynamics depending on time $t$ 

  • $\dot K(t)=W(t)-D(t)+\dot\Phi (t)$     (1)
  • $\dot E(t)=-W(t)+D(t)$,                  (2)
where $K(t)$ is total kinetic energy, $E(t)$ total internal energy (heat energy), $W(t)$ is total work, $D(t)\ge 0$ is total turbulent dissipation, $\Phi (t)$ is total gravitational energy and the dot signifies differentiation with respect to time. Adding (1) and (2) gives the following total energy balance:
  • $K(t)+E(t)-\Phi(t)= constant.$          (3)
Further (1) and (2) express an irreversible transfer of energy from kinetic to internal energy with $D(t)>0$, and so serve as a 2nd Law for Cosmology giving time a direction. Recall that the theoretical challenge is to tell/show why turbulent dissipation is unavoidable. 
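
As a simple consistency check of (1)-(3), with made-up smooth functions for $W$, $D\ge 0$ and $\Phi$ (not an EG2 computation), one can time step (1) and (2) and verify that the combination in (3) stays constant:

import math

def W(t):    return math.sin(t)                   # total work (made up)
def D(t):    return 0.1 * (1.0 + math.cos(t))**2  # turbulent dissipation >= 0 (made up)
def Phi(t):  return 2.0 + 0.5 * math.sin(0.3*t)   # total gravitational energy (made up)
def dPhi(t): return 0.15 * math.cos(0.3*t)        # its time derivative

dt, T = 1.0e-4, 10.0
K, E, t = 1.0, 3.0, 0.0                           # arbitrary initial values
while t < T:
    K += dt * (W(t) - D(t) + dPhi(t))             # equation (1)
    E += dt * (-W(t) + D(t))                      # equation (2)
    t += dt

print(1.0 + 3.0 - Phi(0.0))                       # K + E - Phi at t = 0
print(K + E - Phi(t))                             # the same, up to time-stepping error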

Computations may start from a hot dense state at $t=0$ which is seen to expand/cool (run code) (Big Bang) to maximal size and then contract/warm back to a hot dense state (Big Crunch) (run code) in an irreversible sequence of expansions/contractions until some final stationary equilibrium state with $E(\infty )=\Phi (\infty )$. Compare with post from 2011.


Sunday 10 March 2024

Three Elephants in Modern Physics

Modern theoretical physicists busy at work handling the crisis.

There are three elephants in the crisis room of modern physics:

  1. special relativity
  2. 2nd law of thermodynamics
  3. foundations of quantum mechanics 

which for many years are no longer discussed in the physics community, not because they have long since been resolved, but because they remain open problems with no progress for 100 years. 

Only crackpot amateur physicists still debate these problems on fringe sites like John Chappell Natural Philosophy Society. Accordingly these topics are no longer part of core theoretical physics education, since questions from students cannot be answered. This may seem strange, but it has become an agreement within the physics community to live with. Discussion closed. Research submitted to leading journals on these topics will be rejected, without refereeing.

Is it possible to understand why no progress has been made? Why is there a crisis?

As concerns special relativity the reason is that it is not a theory about real physics, but a theory about observer perceptions of "events" identical to space-time coordinates without physical meaning. This makes special relativity into a game following some ad hoc rules without clear physics, pretending to describe a world which turns out to be very very strange. Unless of course you realise that it is just a game and not science, but then there is nothing to teach at a university.  

As concerns topics 2 and 3 the reason is the introduction of statistical mechanics by Boltzmann followed by Planck as last rescue when deterministic physics seemed to fail. Again the trouble is that statistical mechanics is physics in the eyes of observers as probabilities of collections of physical events without cause-effect, rather than specific real events with cause-effect. Like special relativity this makes statistical mechanics into a game according to some ad hoc rules without physical meaning. 

In all three cases the observer is given a new key role, as if the world depends on observation and cannot, as in classical deterministic physics, go ahead by itself without it. This makes discussion complicated since there is no longer any common ground to all observers, and so eventually discussion dies. Nothing more to say. Everybody agrees that there is nothing to disagree about. Everything in order. 

Schrödinger as the inventor of the Schrödinger equation for the Hydrogen atom with one electron as a deterministic classical continuum mechanical model, was appalled by Born's statistical interpretation of the multi-dimensional generalisation of his equation to atoms with several electrons defying physical meaning, and so gave up and turned to more fruitful pastures like the physics of living organisms.

But the elephants are there even if you pretend that they are not, and that is not a very healthy climate for scientific progress. You find my attempts to help out on this blog.  
  

Saturday 9 March 2024

Challenges to the 2nd Law

The book Challenges to the Second Law of Thermodynamics by Capek and Sheehan starts out describing the status of this most fundamental law of physics as of 2005:

  • For more than a century this field has lain fallow and beyond the pale of legitimate scientific inquiry due both to a dearth of scientific results and to a surfeit of peer pressure against such inquiry. 
  • It is remarkable that 20th century physics, which embraced several radical paradigm shifts, was unwilling to wrestle with this remnant of 19th century physics, whose foundations were admittedly suspect and largely unmodified by the discoveries of the succeeding century. 
  • This failure is due in part to the many strong imprimaturs placed on it by prominent scientists like Planck, Eddington, and Einstein. There grew around the second law a nearly impenetrable mystique which only now is being pierced.
The book then continues to present 21 formulations of the 2nd Law followed by 20 versions of entropy and then proceeds to a large collection of challenges, which are all refuted, starting with this background:
  • The 2nd Law has no general theoretical proof.
  • Except perhaps for a dilute gas (Boltzmann's statistical mechanics), its absolute status rests squarely on empirical evidence.  
We learn that modern physics when confronted with the main unresolved problem of classical physics reacted by denial and oppression as cover up of a failure of monumental dimension. The roots of the present crisis of modern physics may hide here. 

Computational Thermodynamics seeks to demystify the 2nd Law as a result of finite precision computation meeting systems developing increasing complexity like turbulence in slightly viscous flow.  

Physicists confronted with proving the 2nd Law. 



Friday 8 March 2024

2nd Law vs Friction

Perpetual motion is viewed to be impossible because according to the 2nd Law of Thermodynamics (see recent posts)

  • There is always some friction.     (F)
Is this true? It does not appear to be true in the microscopics of atoms in stable ground state, where apparently electrons move around (if they do) without losing energy, seemingly without friction, see Real Quantum Mechanics.

But is it true in the macroscopics of many atoms or molecules such as that of fluids? A viscous fluid in motion meets a flat plate boundary with a skin friction depending on the Reynolds number $Re\sim \frac{1}{\nu}$, with $\nu$ the viscosity, as shown by observed skin friction data:


We see that the skin friction coefficient $c_f$ in a laminar boundary layer tends to zero as $Re$ tends to infinity (or viscosity tends to zero), which is supported by basic mathematical analysis. 

We also see that there is a transition from laminar to turbulent boundary layer for $Re > 5\times 10^5$, with a larger turbulent skin friction coefficient $c_f\approx 0.002$ with a very slow decay with increasing $Re$ and no observations of $c_f<0.001$. 
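
The qualitative picture can be sketched with standard flat-plate correlations (these particular formulas are textbook fits of my choosing, not from this post): the laminar Blasius value $c_f\approx 0.664/\sqrt{Re}$ keeps falling towards zero, while a turbulent fit like $c_f\approx 0.027\,Re^{-1/7}$ decays only very slowly.

import math

def cf_laminar(Re):                  # Blasius flat-plate value: tends to zero
    return 0.664 / math.sqrt(Re)

def cf_turbulent(Re):                # 1/7-power-law fit: very slow decay
    return 0.027 / Re**(1.0 / 7.0)

for Re in (1e5, 5e5, 1e7, 1e9):
    print(f"Re = {Re:.0e}: laminar cf = {cf_laminar(Re):.5f}, "
          f"turbulent cf = {cf_turbulent(Re):.5f}")

# The laminar value vanishes with increasing Re, while the turbulent value
# stays above about 0.001 over this range, in line with the observations.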

The transition to a turbulent boundary layer is the result of inherent instability of the motion of a fluid with small viscosity, which is supported by mathematical analysis, an instability which cannot be controlled in real life, see Computational Turbulent Incompressible Flow.

We thus get the message from both observation and mathematical analysis that (F) as concerns skin friction is true: Skin friction in a real fluid does not tend to zero with increasing $Re$, because there is always transition to a turbulent boundary layer with $c_f>0.001$.  

Laminar skin friction is vanishing in the limit of infinite $Re$, but not turbulent skin friction.

We can thus rationalise the 2nd Law for fluids as the presence of unavoidable skin friction from turbulent motion resulting from uncontrollable instability, which does not tend to zero with increasing $Re$, reflecting the presence of a non-zero limit of turbulent dissipation in the spirit of Kolmogorov.  

Connecting to finite precision computation discussed in recent posts, we understand that computational resolution to physical scale is not necessary, which makes turbulent flow computable without turbulence modelling. 

In bluff body computations (beyond drag crisis) it is possible to effectively set skin friction to zero as a slip boundary condition thus avoiding having to resolve turbulent boundary layers. In pipe flow skin friction cannot be set to zero.

PS1 A hope of vanishingly small laminar skin friction has been nurtured in the fluid mechanics community, but the required instability control has proven to be impossible, as an expression of the 2nd Law. 

PS2 One may ask if motion without friction is possible at all? Here we face the question what motion is, connecting to wave motion as an illusion. In Newtonian gravitation of a Platonic ideal world there is no dissipation/friction, but in the real world there is: the Moon is slowly receding from the Earth because of tidal motion energy loss. What then about the supposed motion of photons at the speed of light, seemingly without energy loss from friction? Is this motion an illusion, with a necessary energy loss to receive a light signal? Computational Blackbody Radiation says yes! The 2nd Law can thus be formulated: 
  • There is no motion without friction. Photons in frictionless motion is illusion.
Do you buy this argument? Can you explain why? Instability? What is a photon in motion without friction? Illusion?

Why is there always some friction?


Wednesday 6 March 2024

2nd Law for Radiative Heat Transfer as Finite Precision Physics

Transfer of heat energy from warm to cold by electromagnetic waves.

This is a continuation of recent posts on the 2nd Law of Thermodynamics.

There is a 2nd Law for radiative heat transfer expressing:  

  • Heat energy is transferred by electromagnetic waves from a body with higher temperature to a body with lower temperature, not the other way.  (*) 
Why is that? Standard physics states that it is a consequence of Planck's law of radiation based on statistics of energy quanta, as an analog of Boltzmann's proof of a 2nd Law based on statistical mechanics. The objections raised against Boltzmann's proof carry over to that of Planck, who was very unhappy with his proof, but not as unhappy as Boltzmann with his. 

An approach without statistics is presented on Computational Blackbody Radiation where (*) appears as a high frequency cut-off increasing with temperature. The effect is that only frequencies above the cut-off for the body with lower temperature have a heating effect, resulting in one-way transfer of heat from warm to cold. For more details check out this presentation. 

The high-frequency cut-off can be seen as an expression of finite precision increasing with temperature of atomic oscillation as heat energy. One-way heat transfer is thus a threshold phenomenon connected to finite precision.

Similarly, the photoelectric effect can be explained as a threshold phenomenon connected to finite precision, where only light of sufficiently high frequency can produce electrons. 

A 2nd Law based on finite precision physics can thus serve a role in fluid mechanics, electromagnetics and also quantum mechanics, as discussed in this post.  

In other words, finite precision physics in analog or digital form appears as the crucial aspect giving  meaning to a universal 2nd Law, which is missing in standard physics with infinite precision. 

The general idea is to replace statistical physics, which is not real physics, by finite precision computation, which can be both analog and digital physics. 

Of course, this idea will not be embraced by analytical mathematicians or theoretical physicists working with infinite precision...

The 2nd Law in a World of Finite Precision

Let there be a World of Finite Precision.

Here is a summary of aspects of the 2nd Law of Thermodynamics discussed in recent posts: 

  • 2nd Law gives an arrow of time or direction of time. 
  • A dissipative system satisfies a 2nd Law.
  • A dissipative system contains a diffusion mechanism decreasing sharp gradients by averaging. 
  • Averaging is irreversible since an average does not display how it was formed. 
  • Averaging/diffusion destroys ordered structure/information irreversibly. 
  • Key example: Destruction of large scale ordered kinetic energy into small scale unordered kinetic energy as heat energy in turbulent viscous dissipation.    
To describe the World, it is not sufficient to describe dissipative destruction, since also processes of construction are present. These are processes of emergence where structures like waves and vortices with velocity gradients are formed in fluids, solid ordered structures are formed by crystallisation and living organisms develop. 

The World then appears as a combat between anabolism as building of ordered structure and catabolism as destruction of ordered structure into unordered heat energy. 

The 2nd Law states that destruction cannot be avoided. Perpetual motion is impossible. There will always be some friction/viscosity/averaging present which makes real physical processes irreversible with an arrow of time. 

The key question is now why some form of friction/viscosity/averaging cannot be avoided? There is no good answer in classical mathematical physics, because it assumes infinite precision and with infinite precision there is no need to form averages since all details can be kept. In other words, in a World of Infinite Precision there would be no 2nd Law stating unavoidable irreversibility, but its existence would not be guaranteed.  

But the World appears to exist and then satisfies a 2nd Law, and so we are led to an idea of an Analog World of Finite Precision, which possibly can be mimicked by a Digital World of Finite Precision (while a possibly non-existing World of Infinite Precision cannot). 

The Navier-Stokes equation for a fluid/gas with positive viscosity as well as Boltzmann's equations for a dilute gas are dissipative systems satisfying a 2nd Law with positive dissipation. But why positive viscosity? Why positive dissipation?

The Euler equations describe a fluid with zero viscosity, which formally in infinite precision is a system without dissipation violating the 2nd Law.  

We are led to consider the Euler equations in Finite Precision, which we approach by digital computation to find that computational solutions are turbulent with positive turbulent dissipation independent of mesh size/precision once sufficiently small. We understand that the presence of viscosity/dissipation is the result of a necessary averaging to prevent the flow from blowing up from increasingly large velocity gradients emerging from convection mixing high and low speed flow. 

We thus explain the emergence of positive viscosity in a system with formally zero viscosity as a necessary mechanism to allow the system to continue to exist in time. 
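
A minimal 1D analog of this scenario (using the inviscid Burgers equation with a standard Godunov finite-volume scheme, my own toy setup rather than the EG2/Euler computations of the books): the equation has zero viscosity and the scheme contains no explicit viscosity, yet the computed kinetic energy decays, and the lost energy settles to a positive value once the mesh is fine enough.

import numpy as np

# 1D toy analog: inviscid Burgers equation solved with a Godunov scheme on
# successively finer meshes; the kinetic energy lost by time T does not vanish
# under mesh refinement, i.e. dissipation persists in the zero-viscosity model.

def godunov_flux(ul, ur):
    # exact Riemann flux for Burgers f(u) = u^2/2
    if ul <= ur:
        return min(ul**2, ur**2) / 2 if ul * ur > 0 else 0.0
    return max(ul**2, ur**2) / 2

def run(n, T=1.5):
    x = (np.arange(n) + 0.5) / n
    u = np.sin(2 * np.pi * x)            # smooth initial data on a periodic domain
    dx = 1.0 / n
    t = 0.0
    while t < T:
        dt = min(0.4 * dx / max(1e-12, np.max(np.abs(u))), T - t)
        F = np.array([godunov_flux(a, b) for a, b in zip(u, np.roll(u, -1))])
        u = u - dt / dx * (F - np.roll(F, 1))
        t += dt
    return dx * np.sum(u**2) / 2         # kinetic energy at time T

K0 = 0.25                                 # initial kinetic energy of sin(2*pi*x)
for n in (100, 200, 400, 800):
    print(f"n = {n:4d}: kinetic energy lost = {K0 - run(n):.4f}")
# The lost kinetic energy converges to a positive value as the mesh is refined:
# the dissipation does not disappear with increasing precision/resolution.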

The 2nd Law thus appears as being a mathematical necessity in an existing World of Finite Precision.   

The mathematical details of this scenario in the setting of Euler's equations are described in the books Computational Turbulent Incompressible Flow, Computational Thermodynamics and Euler Right.


Tuesday 5 March 2024

No Mathematical Proof of the 2nd Law of Thermodynamics

ChatGPT informs me that there is no mathematical proof of the 2nd Law of Thermodynamics and so it is simply an empirical law, albeit:

  • supported by a wealth of empirical evidence
  • deeply ingrained in our understanding of how physical systems behave
  • while mathematical frameworks like statistical mechanics provide a basis for explaining the law, the principle itself is derived from the consistency of these explanations with real-world observations.
ChatGPT is useful in the sense that it reports what it has learned from physics literature, while it is not intelligent enough to cover up like a real theoretical physicist, who would never admit anything like that.

In any case we learn that there is no mathematical proof/explanation of the 2nd Law. 

It means that this could be added to the list of Clay Millennium Problems, or even better replace the closely related Navier-Stokes problem, which is still open without any progress to a solution, see previous post.

What do you think? Is there a mathematical proof? Or not? 

PS Recall that Boltzmann's H-theorem, stating steady progress towards a Maxwellian equilibrium in a dilute gas and attempted as a mathematical proof of the 2nd Law, is based on Boltzmann's Stosszahlansatz asking two particles about to collide to be uncorrelated, an assumption of statistical nature which however cannot be verified nor assumed to be true in any generality. 

2nd Law vs Clay Millennium Problem on Navier-Stokes Equations

The Clay Institute Millennium Problem on Navier-Stokes equations is introduced as follows:

  • This is the equation which governs the flow of fluids such as water and air. However, there is no proof for the most basic questions one can ask: do solutions exist, and are they unique? Why ask for a proof? Because a proof gives not only certitude, but also understanding.
  • Waves follow our boat as we meander across the lake, and turbulent air currents follow our flight in a modern jet. Mathematicians and physicists believe that an explanation for and the prediction of both the breeze and the turbulence can be found through an understanding of solutions to the Navier-Stokes equations. 
  • Although these equations were written down in the 19th Century, our understanding of them remains minimal. The challenge is to make substantial progress toward a mathematical theory which will unlock the secrets hidden in the Navier-Stokes equations.
The problem is still open. No solution is even in sight after 24 years. No progress at all.

Another main open problem of mathematical physics is the 2nd Law of Thermodynamics, which in particular applies to the flow of fluids such as water and air as governed by Navier-Stokes equations. 

It is thus possible to view the Clay Navier-Stokes Problem as an instance of the 2nd Law of Thermodynamics, and so be reformulated into:
  • Mathematical proof of the 2nd Law of Thermodynamics for fluids.       (P)
This version has a more obvious significance and it is possible that a solution can be found and so increase understanding in the spirit of Clay.

A resolution to (P) is presented in recent posts on the 2nd Law.   

I have sent the following letter to the President of the Clay Institute and will report reaction:

Dear President 

No progress towards a solution to the Clay Millennium Problem on Navier-Stokes equations has been made over a period of 24 years. A reformulation into a problem which possibly can be solved may better meet the stated Clay objective of increasing understanding. 

Thus I suggest a reformulation into a mathematical proof of the 2nd Law of Thermodynamics for fluids as expressed here:


Sincerely
Claes Johnson
prof em applied mathematics Royal Institute of Technology Stockholm