Friday, May 6, 2022

Computational vs Statistical Physics

Statistical physics was created by Boltzmann (1844-1906) in an attempt to explain the observed irreversibility of thermodynamic processes as a necessary evolution from more ordered/less probable states to less ordered/more probable states in a microscopic particle-collision model of a gas. This was captured in Boltzmann's macroscopic equations, derived from an assumption of molecular chaos (Stosszahlansatz) stating that particle velocities prior to collision are uncorrelated. Boltzmann's H-theorem states that a gas left alone will approach a uniform rest state with a Maxwellian velocity distribution.

Statistical physics rests on some assumption of a statistical nature, such as molecular chaos. Computational physics, by contrast, simulates the evolution of a gas as a collection of colliding particles simply by computing the trajectories of all particles subject to collisions, with chaos/unordered motion appearing as an emergent phenomenon without any statistical assumption.

One can argue that computational physics is real physics, because what is computed are particle trajectories subject to Newton's laws of motion, which is real physics. Statistical physics, on the other hand, is not real physics in the sense that real physics cannot do statistics and decide to evolve according to an assumption of molecular chaos.

At the same time it is of course possible for human beings to do statistics by computing mean values and standard deviations in particle-collision models, as in the sketch below.
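
As a minimal sketch of such a computation (an illustration added here, not taken from the original post), the following Python code evolves a 2D gas of equal-mass hard discs in a box by Newtonian free flight plus elastic collisions, and then computes the mean and standard deviation of the particle speeds on top of the deterministic trajectories. All parameters (number of particles, disc radius, time step) are arbitrary illustrative choices.

```python
import numpy as np

# Minimal sketch: a 2D gas of N equal-mass hard discs in the unit box.
# Trajectories follow Newton's laws (free flight + elastic collisions);
# statistics (mean, std of speeds) are computed on top, not assumed.
# All parameters are arbitrary illustrative choices.

rng = np.random.default_rng(0)
N, radius, dt, steps = 50, 0.02, 1e-3, 2000

pos = rng.uniform(radius, 1 - radius, size=(N, 2))   # positions in the box
vel = rng.normal(0.0, 1.0, size=(N, 2))              # initial velocities

for step in range(steps):
    pos += vel * dt                                  # free flight

    # elastic reflection at the walls of the unit box
    for d in range(2):
        low = pos[:, d] < radius
        high = pos[:, d] > 1 - radius
        vel[low, d] = np.abs(vel[low, d])
        vel[high, d] = -np.abs(vel[high, d])

    # pairwise elastic collisions between overlapping, approaching discs
    for i in range(N):
        for j in range(i + 1, N):
            dx = pos[i] - pos[j]
            dist = np.linalg.norm(dx)
            if 0 < dist < 2 * radius:
                dv = vel[i] - vel[j]
                if np.dot(dx, dv) < 0:               # discs approaching
                    n = dx / dist
                    vn = np.dot(dv, n)               # equal masses: exchange the
                    vel[i] -= vn * n                 # normal velocity components
                    vel[j] += vn * n

    if step % 500 == 0:
        speeds = np.linalg.norm(vel, axis=1)
        print(f"t={step * dt:.2f}  mean speed={speeds.mean():.3f}  std={speeds.std():.3f}")
```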

Statistical physics was developed before the computer, when computational physics could not deliver. Today, with the computer, computational physics can answer the questions posed in statistical physics; see Euler Right! showing physics emerging by computation in a discrete finite element model.

PS The classical approach is to derive a continuum model in the form of a partial differential equation from a particle model. A computational model can then be derived by discretising the differential equation using the finite element method, which can be viewed as a form of particle method, thus in a way closing the circle, with the particle model as the real model and the continuum model as a fictional model.


Monday, May 2, 2022

Second Coming of the 2nd Law

The 2nd Law of Thermodynamics, involving the concept of entropy, is surrounded by mystery. John von Neumann (1903-1957) was a very clever mathematician who offered the following advice:

  • No one really knows what entropy really is, so in a debate you will always have the advantage (by pretending that you know).

Computational Thermodynamics (see also EulerRight!) presents a New 2nd Law of Thermodynamics (without reference to entropy) resulting from the Euler equations for a compressible gas subject to finite precision computation in the following form, with the dot signifying time differentiation:

  • $\dot K = W - D$   (1)
  • $\dot E = -W + D$  (2)

where $K$ is kinetic energy, $E$ internal (heat) energy, $W$ is work and $D > 0$ is turbulent/shock dissipation. The condition $D > 0$ sets the direction of time, with a one-way transfer of energy from kinetic to heat energy, noting that $W>0$ in expansion and $W<0$ in compression. By summing (1) and (2) the total energy $K+E$ is seen to remain constant over time. The novelty of the New 2nd Law is that $D > 0$ appears as a consequence of finite precision computation combined with complexity, expressed as a necessary presence of turbulence and shocks making exact solution of the Euler equations impossible, with $D>0$ the cost of large scale kinetic energy being turned into small scale kinetic energy perceived as heat energy produced by turbulent/shock dissipation.
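
As a quick numerical check of the bookkeeping in (1)-(2) (an illustration added here, not taken from Computational Thermodynamics), the following sketch time-steps the two balance equations with arbitrary illustrative choices of work $W(t)$ and dissipation $D>0$: the total $K+E$ stays constant while $D>0$ steadily drains kinetic energy into heat. Flipping the sign of $D$ in the sketch makes $K$ grow instead, connecting to the blow-up remark further down.

```python
import numpy as np

# Sketch of the energy bookkeeping in (1)-(2):
#   dK/dt = W - D,   dE/dt = -W + D,   with D > 0.
# W(t) and D below are arbitrary illustrative choices, not taken from the source.

dt, T = 1e-3, 10.0
K, E = 1.0, 1.0                    # arbitrary initial kinetic and heat energy

def W(t):                          # work: positive in expansion, negative in compression
    return 0.2 * np.sin(t)

def D(K):                          # turbulent/shock dissipation, always positive;
    return 0.1 * K                 # here simply modeled as proportional to K

for n in range(int(T / dt)):
    t = n * dt
    dK = W(t) - D(K)               # equation (1)
    dE = -W(t) + D(K)              # equation (2)
    K += dt * dK                   # forward Euler time step
    E += dt * dE
    if n % 2000 == 0:
        print(f"t={t:4.1f}  K={K:.4f}  E={E:.4f}  K+E={K + E:.4f}")
```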

The Standard 2nd Law takes the form 

  • $T\dot S=\dot E+W=D$    (3)  

with $T>0$ the temperature and $S$ a quantity named entropy, supposed to satisfy $\dot S>0$. The mystery of the Standard 2nd Law is to give entropy a physical meaning and to motivate why the entropy $S$ necessarily increases. An answer was attempted by Boltzmann using statistics, with $\dot S>0$ supposedly expressing a necessary irreversible development over time from more ordered states to less ordered states.

Comparing the New 2nd Law and the Standard 2nd Law we see that $T\dot S = D$, and thus $\dot S>0$ is a result of $D>0$ viewed as entropy production; the basic question concerning the 2nd Law is therefore to motivate why $D>0$. The answer given by finite precision computation + complexity is that turbulent/shock dissipation $D$ is a necessary positive cost to pay for not being able to solve the Euler equations exactly (resorting to residual stabilisation). In physical terms this corresponds to large scale kinetic energy being destroyed/transformed into small scale kinetic energy in a process of turbulent/shock dissipation, which is irreversible because it is impossible in finite precision to reconstruct ordered large scale kinetic energy from unordered small scale kinetic energy.
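
For completeness, the step behind this identification (not spelled out above) is obtained by inserting (2) into (3):

  • $T\dot S = \dot E + W = (-W + D) + W = D$.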

To connect to Boltzmann, it is possible to see this transformation by turbulent/shock dissipation from large scale to small scale kinetic energy as a destruction of order. The new aspect is that this destruction of order has an explanation as a necessary consequence of finite precision + complexity, without resort to statistics.

Another aspect is that $D<0$ would by (1) generate large scale kinetic energy $K$ in unphysical blow-up, while the generation of small scale kinetic energy as heat energy $E$ with $D>0$ in (2) is a stable physical process without blow-up. 

Also note that in the Standard 2nd Law, assuming that $D$ results from a dissipative mechanism carries the information that $D>0$, but the question of why $D$ results from dissipation is left without an answer. The New 2nd Law gives the answer: finite precision computation + complexity.

The New 2nd Law connects physics to computation, with physics seen as a form of finite precision computation taking a system from one time level to the next.

PS1 You can make an analogy by viewing criminality in two different ways: The Standard way would be to say that all people are criminal, more or less, and so it is no wonder that criminality exists. The New way would be to say that it is impossible for everybody to exactly satisfy the requirements of law and order in the presence of large inequalities, and so a non-zero amount of criminality (depending on the amount of inequality) cannot be avoided. Moreover, society can be stable to small scale crime but not to large scale crime.

PS2 Recall that thermodynamics was founded in 1865 on Clausius' idea that the entropy of the universe tends to a maximum, to be compared with what von Neumann said about entropy.

PS3 The basic question can also be formulated:
  • How can a system which (formally) is reversible be irreversible?
  • How can a dissipative effect arise in a system which (formally) has no dissipation?
My answer is finite precision computation + complexity. The standard answer is statistics (or molecular chaos, see PS4).

PS4 Boltzmann tried to justify a 2nd Law through the Boltzmann equations, which he derived from an assumption of molecular chaos in a particle collision model, explained on Wikipedia as follows:
  • A key insight applied by Boltzmann was to determine the collision term resulting solely from two-body collisions between particles that are assumed to be uncorrelated prior to the collision. This assumption was referred to by Boltzmann as the "Stosszahlansatz" and is also known as the "molecular chaos assumption".
But how can you verify this assumption? It was used by Boltzmann to derive his H-theorem viewed as a (much disputed) proof of the 2nd Law. There does not seem to be any convincing mathematical proof/justification of the 2nd Law beyond the H-theorem.  

PS5 Another (standard) approach is to take mean values in a particle-spring (or quantum mechanical) model, which will introduce diffusion. But Nature does not care to compute mean values and so the physicality of such a model (with diffusion arising for mean values) can be questioned. 
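
To make the point about mean values and diffusion concrete (an illustration added here, not in the original), consider replacing a discrete field $u_i$ on a grid with spacing $h$ by its local mean value:

  • $\bar u_i = \frac{1}{4}(u_{i-1} + 2u_i + u_{i+1}) = u_i + \frac{1}{4}(u_{i-1} - 2u_i + u_{i+1}) \approx u_i + \frac{h^2}{4}u_{xx}(x_i)$,

which is one forward Euler step of the heat equation $\dot u = \nu u_{xx}$ with $\nu\,\Delta t = h^2/4$: taking mean values by itself introduces diffusion, whether or not Nature computes them.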

PS6 By rubbing your hands against each other (large scale motion) you can by friction generate heat (small scale motion) thus increasing the temperature. But you cannot get your hands rubbing by allowing their temperature to decrease. It remains to explain why there is friction between your hands and why friction generates heat.

PS7 The residual stabilisation enters as a loss of kinetic energy, reflecting that the Euler equations cannot be solved exactly and thus leave a non-zero residual, which introduces a friction effect from violation/rupture of physics, as an effect of rubbing.
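
Schematically (a sketch of the typical least-squares form of such a stabilisation term, not a quotation from Computational Thermodynamics), with $\hat u$ the computed solution, $R(\hat u)$ its Euler equation residual and $\delta > 0$ a stabilisation parameter, the dissipation takes the form

  • $D = \int \delta\,\vert R(\hat u)\vert^2\,dx \ge 0$,

which vanishes only if the residual vanishes, that is, only if the Euler equations are solved exactly.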