Wednesday, May 15, 2024

Feynman Doubling Down by QED


Feynman diagrams revealing the deepest secrets of Nature.

This is a continuation of the previous post on the evolution of theoretical physics during the 20th century as a sequence of doubling downs: when facing difficulties with an old theory, move to a new more complex theory instead of resolving the difficulties. Let us take a look at the step from the Quantum Mechanics QM of atoms of the 1920s to the Quantum Electrodynamics QED of the 1950s, which includes light and prepares for Quantum Field Theory QFT as the ultimate quantum theory of modern physics underlying the Standard Model of elementary particles.

QED was presented to a general public by its leading proponent Richard Feynman in four lectures, later collected into the book QED: The Strange Theory of Light and Matter.

Newton's primitive idea of light as a stream of light particles was in the late 19th century replaced by Maxwell's equations, where in particular light appears as a wave carried by oscillating electromagnetic fields. Maxwell's equations concisely capture all of electromagnetics as the top jewel of classical mathematical physics.

Einstein refuted Newton in 1916 and Feynman refuted Maxwell in 1948 and so returned to Newton: Light is a stream of particles named photons. In the book Feynman argues:

  • The theory of QED describes Nature as absurd from the point of view of common sense. 
  • We have, so far, found nothing wrong with QED. It is therefore the jewel of physics - our proudest possession.
  • It is very important to understand that light behaves as particles, especially for those of you who have gone to school, where you were probably told something about light behaving like waves.
  • We were talking about light. The first important feature about light is that it appears to be particles: when very weak monochromatic light (light of one color) hits a detector, the detector makes equally loud clicks less and less often as the light gets dimmer.
  • Newton thought that light was made up of particles - he called them "corpuscles" - and he was right (but the reasoning that he used to come to that decision was erroneous). We know that light is made of particles because we can take a very sensitive instrument that makes clicks when light shines on it, and if the light gets dimmer, the clicks remain just as loud - there are just fewer of them. Thus light is something like raindrops - each little lump of light is called a photon - and if the light is all one color, all the "raindrops" are the same size.
  • So now, I present to you the three basic actions, from which all the phenomena of light and electrons arise. ACTION #1: A photon goes from place to place. ACTION #2: An electron goes from place to place. ACTION #3: An electron emits or absorbs a photon.
  • The problem is, when we try to calculate all the way down to zero distance, the equation blows up in our face and gives meaningless answers - things like infinity. This caused a lot of trouble when the theory of quantum electrodynamics first came out.
  • People were getting infinity for every problem they tried to calculate!
We see here typical ingredients of doubling down: raising the bid so that it cannot be called, as in poker:
  1. The theory is absurd beyond comprehension. 
  2. The theory is perfect with perfect agreement with observation. 
  3. The theory appears to give meaningless results. 
  4. The theory is primitive, being reduced to Actions #1-3.
Since the theory is incomprehensible it cannot be questioned or called. That the theory shows perfect agreement with observation is impossible to check, since direct application of the theory by a non-expert gives meaningless infinities.

The reason QED produces infinities is Action #3, which opens the door to self-interaction: an electron emits a photon and then absorbs the same photon, as discussed in this post showing the danger of self-feedback.

Feynman, like Einstein, propelled himself into the role of golden boy of modern physics, as expressed in the book Genius: The Life and Science of Richard Feynman. In later years his genius has been questioned, e.g. by Freeman Dyson, in particular concerning the mystery of self-interaction.

The argument that light is shown to be a stream of particles because it can give rise to "clicks" in an instrument is as silly as saying that a good joke is a particle because it can give rise to a laugh.


PS The Standard Model divides particles into matter (fermions) and force carriers (bosons), with the photon as the massless carrier of the electromagnetic force. The trouble with this double set of particles, with matter particles interacting by forces transmitted by force carrying particles, in the above setting electrons interacting by emitting and absorbing photon particles, is that it opens the door to self-interaction going wild into infinities. In the setting of gravitation this would require some form of graviton particles, which have not been found. Instead, gravitational force appears to come from a gravitational potential field present everywhere. The trauma of modern physics is that the Standard Model is incompatible with gravitation. The trouble comes from the role of bosons as force carrying particles. If there are no gravitons, maybe there are no bosons either? And then what about QED and QFT?

Tuesday, May 14, 2024

Doubling Down in Theoretical Physics and Geopolitics

Physical scales range from $10^{-18}$ to $10^{22}$ meter, about 40 orders of magnitude.

The US Empire is now collapsing in front of our eyes along with the rest of the West, while the East/South is taking over, supported by 90% of the world population. Doubling down by printing more dollars does not seem to work any more. Not even the entry of Sweden into NATO will help out.

Geopolitical power is exercised through economic and military power resting on technology based on mathematics and physics.

The rise of the West came with the scientific revolution initiated by Galileo, Descartes, Leibniz, Newton and Euler from a new idea of God as mathematician or theoretical physicist creating a Universe governed by natural laws of mathematical form, like the inverse square law of gravitation. This propelled the West to World domination, with the power shifting from Europe to the US after WW2.

The development of the atomic bomb by scientists gathered in the US from Europe during WW2 boosted the self-confidence of US theoretical physicists of the 1950-70s into exploration of ever smaller atomic scales in search of a Theory of Everything uncovering God's ultimate mathematical trick. But God proved to be too smart, and insurmountable problems had to be met by doubling down. Here is a timeline for physics on ever smaller scales, each new step leaving the old problems unsolved:

  • Faced with difficulties of extending classical deterministic physics to the microscopics of atoms,  theoretical physicists doubled down in the 1920s by presenting Quantum Mechanics as a new form of probabilistic physics on scales of size $10^{-10}$ meter.
  • Faced with seemingly unsolvable foundational problems of Quantum Mechanics, theoretical physicists doubled down in the 1950s by presenting Quantum Field Theory as a new form of physics on sub-atomic scales of size $10^{-18}$ meter. 
  • Faced with seemingly unsolvable foundational problems of Quantum Field Theory, theoretical physicists doubled down in the 1970s by developing String Theory as a new form of physics on sub-sub-atomic scales of size $10^{-32}$ meter.

Here is the corresponding timeline for relativity theory on ever larger scales:
  • Faced with the apparent absence of an aether as a physical medium for the propagation of light, Einstein, seeking an academic position as a physicist in 1905, decided to change the classical Euclidean/Newtonian concepts of space and time into a new concept of space-time in a new Special Theory of Relativity.
  • Faced with questions from leading physicists, which he could not answer, Einstein doubled down by extending to a new General Theory of Relativity.  
  • Faced with new questions, Einstein doubled down by restricting his theories to the whole Universe, making experimental falsification impossible.

Today the doubling down has come to an end in a theoretical physics troubled by seemingly unsolvable problems. It is not possible to double down further to smaller scales or to larger scales. Theoretical physics of the West has come to a dead end.

Or can AI help by doubling and redoubling human intelligence? See also this post.

Is it possible that this is the true reason that the geopolitical power of the West is now collapsing?

PS Doubling down in mathematics can take the form of a higher level of abstraction. The problem that the square root of two is not a rational number, which destroyed the Pythagorean mathematical school, could eventually be handled in the 19th century by introducing a new, more abstract concept of real number, which was then repeatedly doubled down into vast Hilbert spaces capable of housing the quantum mechanical wave function for the whole Universe.

Thursday, May 9, 2024

Newton or Einstein or Newton?

Recall this post.

A physics student of today will, to pass exams, have to say that Einstein's theory of gravitation is more precise than Newton's theory of gravitation, not less precise.

The evidence supplied by the book is that the elliptic orbit of Mercury around the Sun shows a slow turn of its main axis (precession) of 5600 seconds of arc per century, while Newton predicts 5557 when taking into account the influence of all other planets and celestial objects, and the missing 43 is predicted by Einstein as a small correction to Newton.
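A minimal arithmetic check of these figures (the 5557 and 43 arcseconds per century are the book's numbers as quoted above, not re-derived here):

```python
# Arithmetic check of the quoted precession budget for Mercury,
# in arcseconds per century (figures as quoted above, not re-derived).
observed = 5600.0
newton_planetary = 5557.0    # Newtonian perturbations from the other planets etc.
einstein_correction = 43.0   # correction attributed to Einstein's theory

print(f"Newton + Einstein = {newton_planetary + einstein_correction}  (observed: {observed})")
print(f"Einstein's part   = {einstein_correction / observed:.2%} of the observed precession")
```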

In Einstein's theory the speed of gravity is equal to the speed of light in vacuum, while in Newton's theory the speed appears to be infinite.

In a two-body problem of one small planet and a big Sun, the planet will according to Newton follow an elliptic orbit without precession, because the gravitational force on the planet acts in the direction of the Sun without delay from a finite speed of gravity. With a finite speed of gravity there would be a delay, which would make the planet slowly drift away from the Sun. Massive observations of elliptic planetary orbits show perfect agreement with Newton's theory. Here Einstein's theory can only be less precise.

Finite speed of gravity thus appears to disagree with observations, at least in a Newtonian setting. To save Einstein's theory it is necessary to show that it contains some form of non-Newtonian effect which exactly cancels the delay effect of a finite speed of gravity. It is doubtful that this is possible. 

Newton's theory appears to be perfect (except for Mercury). Einstein's theory thus appears as a less precise version of Newton's theory (except for Mercury), in contradiction to the above book answer: Newton is more precise than Einstein. 

Concerning the precession of Mercury it is natural to ask: 

  • What is the accuracy of Newton's prediction of 5557? Is everything taken into account?
  • What is the accuracy of Einstein's prediction of 43 as a correction to Newton? 
  • In order to match the observation of 5600, it is necessary that both 5557 and 43 are correct. Possible?   

If Einstein is less precise than Newton, what does that say about modern physics vs classical physics?

What did Einstein mean by saying "Newton, forgive me"? What was so wrong that an excuse was needed?

To correct Newton requires a full understanding of all aspects of Newton's gravitation, and then a full understanding of Einstein's gravitation as a correction. Is there any living physicist with this intellectual capacity?

The Nobel Prize in Physics 2017 was awarded "for decisive contributions to the LIGO detector and the observation of gravitational waves" in the form of an extremely weak signal from an extremely massive collision of two black holes at an extremely large distance of 1.3 billion light years from the detector. This was viewed as extremely strong evidence that Einstein is correct.

Wednesday, May 8, 2024

Max Born on Copenhagen Interpretation




Max Born played an important role in forming the new quantum mechanics based on Schrödinger's wave equation presented in 1925, by giving its wave function solution a statistical interpretation as electron configuration probability instead of actuality as in classical deterministic mechanics.

It was the multidimensional form of the wave function, with $3N$ space dimensions for a system with $N$ electrons and so without meaning in classical deterministic mechanics in 3 space dimensions, which required a non-classical interpretation. It was Max Born who in 1926 came up with the idea of letting the multi-dimensional wave function represent an immensely rich world of possibilities rather than a real world of actualities, in a step from determinism to a new quantum world ruled by games of roulette.

Schrödinger received the Nobel Prize in Physics in 1933 for "the discovery of new productive forms of atomic theory", while Born had to wait until 1954 to be awarded for "the statistical interpretation of the wave function".

In his Nobel Lecture Born describes his mixed feelings about quantum mechanics in general and his work in particular:
  • It contains no discovery of a fresh natural phenomenon.
  • It contributed to the solution of an intellectual crisis into which our science had fallen as a result of Planck’s discovery of the quantum of action in 1900. 
  • Today, physics finds itself in a similar crisis. 
  • Planck, himself, belonged to the sceptics until he died. Einstein, De Broglie, and Schrödinger have unceasingly stressed the unsatisfactory features of quantum mechanics and called for a return to the concepts of classical, Newtonian physics while proposing ways in which this could be done without contradicting experimental facts. Such weighty views cannot be ignored. 
  • Niels Bohr has gone to a great deal of trouble to refute the objections. I, too, have ruminated upon them and believe I can make some contribution to the clarification of the position. 
  • The matter concerns the borderland between physics and philosophy.
  • What is the reality which our theory has been invented to describe?
  • The answer to this is no longer physics, but philosophy.
  • The lesson to be learned from what I have told of the origin of quantum mechanics is that probable refinements of mathematical methods will not suffice to produce a satisfactory theory, but that somewhere in our doctrine is hidden a concept, unjustified by experience, which we must eliminate to open up the road.
In short, Born was not very happy with the state of quantum mechanics 30 years after it was invented. Physics was in a state of crisis in 1900, 1925, 1954 and still is. The cure of statistics has not worked. RealQM presents an alternative without statistics. 

Tuesday, May 7, 2024

From Statistical Mechanics to Quantum Statistics


Boltzmann was not easy to argue with.

The development of modern physics follows a path from statistical mechanics (Boltzmann 1866) via statistics of energy quanta of blackbody radiation (Planck 1900) to quantum mechanical statistics (Born 1926). In each case there was a pressing demand from empire power politics on theoretical physicists to assist in the advancement of new technologies, from steam engines to atomic energy:

  1. Boltzmann took on the challenge to give the 2nd Law of Thermodynamics a rational mathematical physical meaning as the outstanding problem of the mid 19th century.  
  2. Planck took on the challenge to give blackbody radiation a rational mathematical physical analysis, as the outstanding problem of classical physics at the end of the 19th century. 
  3. Born took on the challenge to give the wave function of the new quantum physics a scientific meaning.       
Boltzmann failed and so invented statistical mechanics based on an idea of macro-states supported by micro-states with probability increasing with number of micro-states. 

Planck failed and, inspired by Boltzmann, invented a way of computing energy quanta with probability decreasing with increasing frequency.

Born, inspired by Planck, resorted to statistics by giving the wave function a meaning as probability of electron configuration.
 
The use of statistics is fundamentally different in all three cases, but Boltzmann started it all.

This means that modern physics is largely statistical physics. Is this a problem? It means giving up the essence of classical physics as rational deterministic physics based on cause-effect. In statistical physics things happen by chance and rationality is lost. It may be a high price to pay, in particular if it is not really necessary.

Real Quantum Mechanics offers a new approach to quantum mechanics without statistics.

Something Rotten in the Copenhagen Interpretation



Modern physics is based on the Copenhagen Interpretation CI of the wave function $\Psi$ as solution to Schrödinger's equation as a full description of all of atom physics. For an atomic system consisting of $N$ electrons labeled $n=1,2,...,N$, the (complex-valued) wave function has the form  

  • $\Psi (x,t)$
with each electron $n$ being connected to a 3d Euclidean space $E_n$ with coordinates $x_n$, collected as a $3N$-dimensional coordinate $x=(x_1,x_2,...,x_N)$ with a common time coordinate $t$. In CI
  • $\vert\Psi (x,t)\vert^2$
represents the possible appearance of an electronic configuration at time $t$ with electron $n$ appearing as a particle at space coordinate $x_n$ for $n=1,2,...,N$.

More precisely, $\vert\Psi (x,t)\vert^2dx$ is viewed to be the probability of "finding" an electron configuration within the volume $dx$ around $x$ as an act of an Observer, see PS2 below.   

The very rich $3N$-dimensionality of $\Psi (x,t)$ with a unique 3d Euclidean space $E_n$ reserved for each electron $n$, puts CI outside classical deterministic physics taking place in a common shared 3d Euclidean space $E$, and then outside the rationality of the scientific revolution.  

In CI electrons appear in 3d worlds which are entirely separate but also overlap into a common 3d world. This is not easy to grasp and is the root cause of the never-ending debate about the physical meaning of CI, with "shut up and calculate" as a desperate dictate.

Schrödinger's equation contains the following elements, with $\nabla_n$ the gradient with respect to $x_n$:
  • Atomic nuclei as point charges with corresponding Coulomb nuclear potentials.      (1)
  • $\vert\nabla_n\Psi\vert^2$ as kinetic energy of electron $n$.      (2)
  • Coulomb potential between each pair of electrons $x_i$ and $x_j$ for $i\neq j$.      (3)
Here (1) acts in a common space $E$ while (2) acts in each separate $E_n$, which is comprehensible but unphysical. But (3) has a double function which is not comprehensible unless you are a believer in CI.

The total inter-electronic Coulomb potential energy arising from (3) is given by
  • $\sum_{i<j}\int\frac{\vert \Psi (x,t)\vert^2}{\vert x_i-x_j\vert}dx$      (EP)
We see here the presence of all the separate $E_n$ but also the shared presence in $\vert x_i-x_j\vert$.
Each electron in CI thus lives in a separate world, but also appears in the separate worlds of all the other electrons. 
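To make (EP) concrete, here is a minimal numerical sketch for the simplest case $N=2$: an uncorrelated product wave function of two hydrogen-like 1s orbitals with nuclear charge $Z$ (atomic units assumed); the exact value of the repulsion integral is then $5Z/8$.

```python
import numpy as np

# Monte Carlo estimate of the inter-electronic Coulomb energy (EP) for the simplest
# case: Psi = psi_1s(x_1) * psi_1s(x_2), hydrogen-like 1s orbitals with nuclear
# charge Z, atomic units. Exact value of <1/|x_1 - x_2|> is 5*Z/8.
rng = np.random.default_rng(0)
Z, n = 1.0, 200_000

def sample_1s(n):
    # |psi_1s|^2 has radial density ~ r^2 exp(-2 Z r): a Gamma(3, 1/(2Z)) law.
    r = rng.gamma(shape=3.0, scale=1.0 / (2.0 * Z), size=n)
    # Uniform directions on the unit sphere.
    v = rng.normal(size=(n, 3))
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    return r[:, None] * v

x1, x2 = sample_1s(n), sample_1s(n)
estimate = np.mean(1.0 / np.linalg.norm(x1 - x2, axis=1))
print(f"Monte Carlo <1/|x_1 - x_2|> = {estimate:.4f}   (exact 5Z/8 = {5 * Z / 8:.4f})")
```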

This is not possible in a classical deterministic common world, and so CI presents instead a probabilistic world as a World of Possibilities, instead of a classical world of actualities. 

This is a world so rich that even a system with a moderate number of electrons would span more possibilities than the number of atoms in the whole known Universe. To handle this absurdity CI reduces wave functions to be either symmetric or anti-symmetric, but the number of possibilities is still overwhelming.
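To put a number on this richness, here is a minimal sketch (the grid resolution of 10 points per coordinate and the figure of $10^{80}$ atoms in the observable Universe are rough illustrative assumptions):

```python
# Tabulating Psi(x_1, ..., x_N) on a modest grid of 10 points per coordinate
# needs 10^(3N) values; compare with the rough figure of 10^80 atoms in the
# observable Universe (both numbers are illustrative assumptions).
points_per_coordinate = 10
atoms_in_universe = 10**80

for N in (1, 2, 10, 27, 100):
    grid_values = points_per_coordinate ** (3 * N)   # 3N coordinates in total
    side = "more" if grid_values > atoms_in_universe else "fewer"
    print(f"N = {N:3d} electrons: 10^{3 * N} grid values ({side} than ~10^80 atoms)")
```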

A symmetric wave function $\Psi (x_1,x_2,...,x_N,t)$ does not change under permutation of its variables, which in CI is viewed to signify that electrons lack individuality in space and time, and so that the labelling $n=1,...,N$ has no real physical meaning. Yet each electron has its own 3d space and also appears with individuality in space and time in the shared distance $\vert x_i-x_j\vert$ in (EP).

There is a general agreement that quantum mechanics in the form of CI cannot be understood or visualised in some sense, but it still serves as the foundation of all electronic technologies of modern society.

The basic difficulty of understanding/visualisation comes from the mix of possibility and actuality carried by the wave function: Electrons interact by actualities in space and time involving individuals within deterministic physics, while electrons also appear as possibilities without individuality outside deterministic physics. Understanding the root of a difficulty is the first step to come to grips with it.

Real Quantum Mechanics presents a form of quantum mechanics within the realm of classical deterministic physics which can be understood, and may well serve electronic technology better than CI. Why not give it a try?

PS1 CI connects to the Monadology of Leibniz with monads as simple substances each one within its own 3d space yet with a window open to interaction with other monads. But Monadology is today not viewed to be science because modern physicists cannot understand it.  

PS2 The world of possibilities of CI is not that of statistical mechanics describing a world of most probable actualities. In CI a possibility becomes an actuality in the act of observation by an Observer causing the wave function to collapse, which is maybe the deepest mystery of CI. The role of an Observer is crucial neither in classical mechanics nor in statistical mechanics, which arises from the difficulty of observation in classical mechanics.

Sunday, May 5, 2024

Man-Made Confusions about Identity

In the light of quantum theory these elementary particles are no longer real in the same sense as objects of daily life, trees or stones, but appear as abstractions derived from the real material of observation in the true sense. (Werner Heisenberg)

In the Copenhagen Interpretation CI of quantum mechanics, the present standard of modern physics established 100 years ago, quantum particles like electrons do not carry the physical identity of classical particles, which even when of the same kind can be identified by their unique trajectories in space and time without jumps. Electrons in CI lack such physicality and appear as abstract possibilities rather than actualities.

This makes CI into a non-physical theory impossible to visualise, a theory without Anschaulichkeit in the words of Schrödinger. This is the root of the present crisis of modern physics. 

Recent posts exhibited the complications arising from the lack of identity of the quantum particles of CI. These are man-made complications, which do not arise in real physics where identity is not compromised.

Complications arising from confusion of identity also arise in man-made worlds outside physics. The presence of several copies of a picture or text in the memory of the computer may cause confusion if not all copies are updated the same way. The presence of copies of works of art upsets the art market. The same with branded clothes and bags. The presence of doubles in politics adds to confusion. AI brings further confusion since author identity is missing and intellectual property has no owner. 

Quantum mechanics as man-made physics is full of complications from confusion of identity. All the troubles come from denying an electron physical identity in space and time. It is like denying a person citizenship of a country, condemning the person to be stateless or homeless without a passport and bank account, to be a no-person. How to handle electrons or persons without identity?

Classical physics as real natural physics, existing independently of human speculation, is not man-made in the same sense, and there is no lack of identity.


Quantum Mechanics vs Crisis of Modern Physics

Schrödinger cancelled by Bohr

The roots of the present crisis of modern physics can be traced 100 years back to the discovery of quantum mechanics based on Schrödinger's equation in 1925, which directly triggered a fierce controversy concerning its physical meaning. Here Schrödinger/Einstein were outmanoeuvred by Bohr/Born/Heisenberg declaring the Copenhagen Interpretation CI to be the belief all physicists had, and still have, to confess to. But Schrödinger/Einstein never accepted CI, and the unresolved trauma is today manifested in particular in a discussion about Identity, Individuality and Indistinguishability in Physics and Mathematics.

In classical mechanics particles follow continuous trajectories in space-time, which do not overlap, and so can be used to identify particles even of the same type in all other respects.  Particles do not jump from place to place and so can be distinguished. 

This is not so in quantum mechanics: A solution to the Schrödinger equation for a system of $N$ particles (electrons) takes the form of a multi-dimensional wave function $\Psi (t, x_1,...,x_N)$ depending on $N$ three-dimensional coordinates $x_1,x_2,...,x_N$, and a common time coordinate $t$. In CI the square of the wave function is supposed to represent the probability of a realisation of a quantum particle configuration at time $t$, with quantum particle $n$ appearing at position $x_n$ for $n=1,...,N$. Wave functions are either symmetric in the variables $x_n$, representing bosons such as photons, or anti-symmetric, representing fermions such as electrons. In CI it is not possible to identify particle trajectories and so quantum particle identity is lost. Yet quantum particles are labeled and have identity in the sense that two quantum particles (electrons with the same spin) cannot share position in space-time. Quantum particles appear as possibilities without identity and continuity in time, to be compared with classical particles as actualities with identity continuous in time.

CI is not an ontological model of what is, but something very different, still without scientific meaning despite 100 years of brooding by the greatest of human minds. A basic difficulty is that there are so many more possibilities than actualities. Interaction of actualities like collision of classical particles has meaning, while interaction of possibilities appears to be meaningless. Documentation of actualities may be possible, while documentation of all possibilities is impossible. CI is uncomputable.

The nightmare of CI is the idea of collapse of the wave function as the step from possibility to actuality or from anonymity to identity believed to take place upon observation. This is the event of an Observer opening Schrödinger's cat box to find the cat either dead or alive, convinced that before the opening the cat is in a state of all possibilities or superposition of both alive and dead. Schrödinger thus demonstrated the absurdity of CI, but was cancelled by Bohr and there we are today. 

All the questions from 1925 remain about the possible meaning of a physics without identity and continuity in time. In particular, there remains the possibility of self-destruction by self-interaction in the absence of individuality and identity, discussed in recent posts.

Real Quantum Mechanics presents a new form of quantum mechanics where identity and continuity in time are restored, a form of physics which is computable.

Suddenly the Wave Function Collapsed into a Rabbit.

    


Tuesday, April 30, 2024

Crisis of Modern Statistical Physics vs Classical Deterministic Physics

This is a further comment on Leibniz's Principle of Identity of Indiscernibles, seemingly in conflict with the main-stream idea of modern physics of electrons all alike, like the equal probabilities of outcomes of tossing a fair coin.




That modern physics is in a state of deep crisis is acknowledged by leading physicists and also largely understood by the general public. Buzz words like dark energy, dark matter, inflation, Big Bang, multiverses, entanglement, collapse of the wave function, particles and quarks are floating around as elements of relativity theory on cosmological scales and quantum mechanics on atomic scales, both formed 100 years ago but still today harbouring toxic unresolved foundational problems, and on top of that being incompatible. A veritable mess.

The root of the seemingly unresolvable problems of quantum mechanics can be traced back to the statistical interpretation of the multi-dimensional Schrödinger wave function as solution to the multi-dimensional Schrödinger equation serving as foundation. 

While classical physics is ontology about what reality is, modern physics is epistemology about what can be said. While classical physics is deterministic physics independent of human observer, modern physics in the form of quantum mechanics is statistical physics depending on human observers acting as mathematical statisticians in an insurance company busy computing insurance premiums. 

The departure from classical into modern physics was initiated by Boltzmann in the late 19th century seeking an ontological realistic explanation of the 2nd Law of Thermodynamics as the main unresolved problem of classical physics giving time a direction, which had to be resolved to save physics from disbelief. When Boltzmann understood that he could not reach this main goal of his scientific life, he made a Faustian deal in the form of an explanation based on statistical mechanics. This served to save the life of physics, but not Boltzmann's own life, and opened the door into the heaven of modern physics as quantum mechanics as statistical mechanics, which is now in a state of crisis. 

The step from deterministic physics to statistical physics was taken in order to save classical physics from a credibility collapse in front of the 2nd Law. The medication worked for the moment, but the patient, classical physics, died and was replaced by modern physics, which however proved to be quite sick, with no cure in sight still today.

The first step in coming to grips with the crisis of modern physics is to ask whether it really is impossible to explain the 2nd Law within classical deterministic physics. If not, then the step to statistics is not necessary and much trouble can be avoided. More precisely, it appears to be possible to replace statistics by a concept of finite precision physics, as presented in Computational Thermodynamics and in popular form in The Clock and the Arrow, with follow-up into a realistic deterministic form of quantum mechanics as Real Quantum Mechanics.

This means a return to deterministic physics with a new element of finite precision computational physics, which comes with resolutions of problems of classical physics and makes it possible to avoid paying the very high price of taking the drug of statistical physics.

Real physics is what it is and is given to us for free. Statistical physics is man-made physics, which needs massive data and human interference. Real physics seeks to describe the World as it is, while modern physicists have the reduced goal of statistical prediction of outcomes of man-made experiments. Schrödinger and Einstein could not accept physics as man-made statistics, but were cancelled. Maybe the present crisis can open for a restart in their spirit?

We may view real physics as a form of engineering or a professional soccer game with basic questions: What is the basic mechanism/principle? How to improve it? On the other hand, a statistical physicist simply watches the game on TV and finds meaning in betting.


Monday, April 29, 2024

Cancellation of Self-Interaction as Renormalisation


The apparent clash between Leibniz's Principle of Identity of Indiscernibles PII and the Copenhagen Interpretation of Quantum Mechanics (StdQM) has triggered quite a bit of discussion, surveyed in the book Identity in Physics: A Historical, Philosophical and Formal Analysis.

The trouble is rooted in the interpretation of the wave function of StdQM as expressing probabilities of possible electron particle configurations.

This is to be compared with actual real configurations as in Real Quantum Mechanics RealQM, in the sense of classical physics, with non-overlapping charge densities with unique presence in space-time as an expression of identity.

PII is in harmony with classical physics and RealQM, but not with StdQM. 

Schrödinger as inventor of quantum mechanics could not accept the probabilistic interpretation of StdQM, and so was cancelled by the leading Copenhagen school of Bohr, Born and Heisenberg. 

We may ask if PII is of real importance or only of some scholastic philosophical virtual importance?

The previous post brought up the idea that PII connects to self-interaction as a toxic element of Quantum Field Theory QFT, the generalisation of StdQM underlying the Standard Model capturing all of elementary particle physics. It is manifested in the appearance of "infinities" asking for "renormalisation" to be cancelled, like techniques to ignore elephants in the room.

In classical physics prevention of self-interaction is possible because it is possible to distinguish each particle from all other particles and so to guarantee in particular that the electric/gravitational field created by a particle only affects other particles but not itself. This is the nature of Newton's Law of gravitation and Coulomb's Law. 

But StdQM describes probabilities of possible particle configurations, which lack particle paths and so lack identity over time. In StdQM bosons (such as photons), as well as some fermions (such as electrons with different spin), can occupy the same position in space-time, and particle paths have no meaning. In this setting self-interaction cannot easily be prevented, and so asks for extraordinary techniques for cancellation in the form of "renormalisation". Nobody is happy with this trick, introduced to handle a fundamental difficulty of physics as statistics.

The possibility that a specific particle occupies some specific position in space-time and the possibility that another particle does the same thing do not appear to be mutually exclusive, which means that particle identity is lost. Probably. Statistics is tricky.

The problem with self-interaction is that it has to steer away from both blow-up to infinity (too much ego) and decay to zero (too much self-criticism), in a very delicate balance threatened by instability.

Recall that the electron of the Hydrogen atom is prevented from disappearing into the potential well of the proton nucleus by the presence of the Laplacian in Schrödinger's equation, giving the electron an extension in space as a charge density. Likewise the Earth is saved from being swallowed by the Sun by orbiting the Sun, as a form of spatial extension.

From the above book:

  • It is not clear how collections of non-individual objects can be captured by standard set theory. 
  • As the mathematician Yuri Manin put it: “We should consider possibilities of developing a totally new language ...” to deal with collections of entities which do not behave as standard sets (in the sense of obeying the axioms of the usual set theories), since the “new quantum physics has shown us models of entities with quite different behaviour. 
  • Even ‘sets’ of photons in a looking-glass box, or of electrons in a nickel piece, are much less Cantorian than the ‘set’ of grains of sand”.
  • It is our intention in this book to explore these different issues and, in particular, to go some way towards developing the ‘totally new language’ suggested by Manin.
PS When young men pull on a military uniform they lose identity and become soldiers all alike, in violation of PII:



Saturday, April 27, 2024

Identity Politics from Identity Physics?



Leibniz's famous Principle of the Identity of Indiscernibles PII states 
  • No two things are exactly alike.
  • Coexistence of two indiscernibles is metaphysically impossible.
A basic aspect of material physics is spatial extension as unique occupancy of some region of space over some period of time as unique identity. Material spatial overlap is impossible. Each cell of your body occupies its own region or volume in space, making it distinct or discernible from all other cells and thus equipped with a unique identity. The same holds for the H2O molecules filling the Ocean, even if they all have the same composition. Even point-like bodies of the same type carry unique identity by having unique positions in space-time.

Newton's Law of Gravitation and Coulomb's Law state that the force acting between two distinct point-like masses or charges scales with $\frac{1}{r^2}$, where $r>0$ is their Euclidean distance. The Laws break down for $r=0$, which means that spatial overlap of masses and charges is forbidden. This is a fundamental principle of classical physics as an expression of PII.

But in the modern atom physics of Standard Quantum Mechanics StdQM this fundamental principle is violated: Electrons are viewed to be indiscernible and to occupy space and satisfy Coulomb's Law only in a statistical sense.

Real Quantum Mechanics RealQM offers a new approach to atom physics where electrons appear as non-overlapping charge densities with unique spatial occupancy in accordance with PII, which satisfy Coulomb's Law in a point-wise physical sense.

The total Coulomb potential energy of two non-overlapping charge densities $\phi_1$ and $\phi_2$ takes the form of an integral: 
  • $\int\int\frac{\phi_1(x)\phi_2(y)}{\vert x-y\vert}dxdy$ where $x\neq y$.       (E)
From a strict mathematical point of view (E) can be viewed as meaningful even if the charge densities overlap, with formally $x-y=0$, since the volume where $\vert x-y\vert$ is small is compensated by $dxdy$. In StdQM it is thus possible for two electrons to have the same charge density and thus overlap (if the electrons have different spin).
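As a numerical illustration of (E), here is a sketch assuming two uniformly charged unit-charge balls of radius $a=1$: when the balls do not overlap, (E) reduces to the point-charge value $1/R$ with $R$ the distance between the centres (shell theorem), and when they overlap completely, (E) stays finite, with exact value $6/(5a)$.

```python
import numpy as np

# Monte Carlo check of the Coulomb integral (E) for two uniformly charged balls
# of radius a = 1 and total charge 1 each (illustrative assumption):
#   - centres a distance R = 3 apart (no overlap): (E) = 1/R by the shell theorem,
#   - centres coinciding (complete overlap): (E) stays finite, exact value 6/(5a).
rng = np.random.default_rng(1)
n, a = 400_000, 1.0

def sample_ball(n, centre):
    v = rng.normal(size=(n, 3))
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    r = a * rng.random(n) ** (1.0 / 3.0)          # uniform charge density inside the ball
    return centre + r[:, None] * v

x = sample_ball(n, np.array([0.0, 0.0, 0.0]))
y_far  = sample_ball(n, np.array([3.0, 0.0, 0.0]))
y_same = sample_ball(n, np.array([0.0, 0.0, 0.0]))

for label, y, exact in [("separated, R = 3 ", y_far, 1.0 / 3.0),
                        ("fully overlapping", y_same, 6.0 / (5.0 * a))]:
    estimate = np.mean(1.0 / np.linalg.norm(x - y, axis=1))
    print(f"{label}: (E) ~ {estimate:.4f}   (exact {exact:.4f})")
```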

Overlapping point charges reflect self-interaction with infinite potential energy. This poses a serious problem for Quantum Field Theory for particles without identity, requiring "renormalisation" to artificially remove infinities.

Self-interaction is toxic and has to be prevented, but without identity how can you distinguish between interaction with yourself (toxic) and interaction with other people?

For a Hydrogen atom with the proton modeled as a positive point charge at $x=0$, Schrödinger's equation models the electron as a distributed density $\phi (x)$ of negative charge, with finite (negative) potential energy balanced by the kinetic energy $\frac{1}{2}\int\vert\nabla\phi (x)\vert^2dx$, which determines the size of the electron.
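A minimal variational sketch of this balance (assuming atomic units and a trial orbital $\phi(r)\sim e^{-\alpha r}$): the kinetic term $\alpha^2/2$ grows faster than the potential term $-\alpha$ decreases, so the energy is minimised at $\alpha =1$ with $E=-1/2$ Hartree, i.e. the electron settles at a finite size instead of collapsing into the point charge.

```python
import numpy as np

# Variational check for the Hydrogen atom in atomic units (illustrative sketch):
# trial orbital phi(r) = C exp(-alpha r), for which |grad phi|^2 = alpha^2 phi^2, so
#   E(alpha) = (1/2) * integral |grad phi|^2 dx - integral phi^2/|x| dx = alpha^2/2 - alpha,
# minimised at alpha = 1 with E = -1/2 Hartree. Radial quadrature on [0, 40].
r = np.linspace(1e-6, 40.0, 200_000)
dr = r[1] - r[0]

def energy(alpha):
    phi2 = np.exp(-2.0 * alpha * r)                        # |phi|^2 up to normalisation
    norm      = np.sum(phi2 * 4 * np.pi * r**2) * dr       # integral of phi^2 dx
    kinetic   = 0.5 * alpha**2 * np.sum(phi2 * 4 * np.pi * r**2) * dr / norm
    potential = -np.sum(phi2 * 4 * np.pi * r) * dr / norm  # -integral of phi^2/|x| dx
    return kinetic + potential

for alpha in (0.5, 1.0, 2.0):
    print(f"alpha = {alpha}: E(alpha) = {energy(alpha):+.4f} Hartree")
```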

Summary: StdQM violates PII. RealQM satisfies PII. It is impossible to avoid self-interaction with unphysical infinities if the identity of the self is not guaranteed.

Is this important? Is PII a basic principle of physical material existence, which cannot be violated? 
Is Identity Physics as violation of PII really physics? 

PS A central theme of this blog is that the roots of the present accelerating break-down of principles of civilisation can be traced back to the advent of modern physics in the beginning of the 20th century with quantum mechanics and relativity theory as a form of Identity Physics.  



 

Friday, April 26, 2024

Primordial Gravitational and Electric/Magnetic Potentials

Dialog between the Two Greatest World Systems with primordial potentials vs densities.  

This is a further remark to previous posts on New Newtonian Cosmology with a gravitational potential $\phi_m (x,t)$ and an electric potential $\phi_c(x,t)$, with $x$ a Euclidean space coordinate and $t$ a time coordinate, viewed as primordial, with mass density $\rho_m (x,t)$ and electric charge density $\rho_c(x,t)$ given by

  • $\rho_m=\Delta\phi_m$      (1)
  • $\rho_c=\Delta\phi_c$      (2)
Here $\rho_m \ge 0$ while $\rho_c$ can be both positive and negative, and $\Delta$ is the second order Laplacian differential operator. 

The corresponding gravitational force $f_m\sim -\nabla\phi_m$ is attractive between positive mass densities and the corresponding Coulomb force $f_c\sim \nabla\phi_c$ is attractive between charge densities of opposite sign and repulsive for charge densities of the same sign.

In principle $\rho_m<0$ is possible in (1), with then repulsion between mass densities of different sign, which would separate large scales into Universes with positive and negative mass, where we happen to live in one with positive mass. It is thinkable that the presence of negative mass density shows up as dark energy. It is thinkable that a very smooth $\Delta\phi_m$ corresponds to dark matter.

The gravitational force $f_m$ acts on large masses at large distances. The electric Coulomb force $f_c$ acts on small charges at small distances, which requires physics preventing charges of different sign from coming too close, represented by the presence of the Laplacian in Schrödinger's equation.

Including also a magnetic potential connected to the electric potential by Maxwell's equations and Newton's 2nd Law for mass motion subject to force, gives a model including Newton's mechanics, electromagnetics and gravitation, with potentials as primordial quantities from which mass and charge densities and forces are derived. Here Real Quantum Mechanics naturally fits in as a classical 3d continuum mechanics model. 

An important aspect of (1) and (2) is that $\rho_m$ and $\rho_c$ are derived by differentiation as an operation acting locally in space, which can be perceived to act instantly in time, thus avoiding the hard-to-explain instant action at distance coming with the standard view of mass and charge densities as primordial.
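A minimal finite-difference sketch of this local character (assuming a Gaussian test potential on a small grid): the density is obtained from the potential by purely local differencing, with no action at distance involved.

```python
import numpy as np

# Compute a density rho = Laplacian(phi) from a given potential phi by local
# finite differences (7-point stencil), illustrating that rho follows from phi
# by a purely local operation in space. Test potential: phi = exp(-|x|^2),
# whose exact Laplacian is (4|x|^2 - 6) exp(-|x|^2).
n, L = 64, 6.0
h = L / (n - 1)
x = np.linspace(-L / 2, L / 2, n)
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
R2 = X**2 + Y**2 + Z**2
phi = np.exp(-R2)

rho = np.zeros_like(phi)
rho[1:-1, 1:-1, 1:-1] = (
    phi[2:, 1:-1, 1:-1] + phi[:-2, 1:-1, 1:-1] +
    phi[1:-1, 2:, 1:-1] + phi[1:-1, :-2, 1:-1] +
    phi[1:-1, 1:-1, 2:] + phi[1:-1, 1:-1, :-2] -
    6.0 * phi[1:-1, 1:-1, 1:-1]
) / h**2

rho_exact = (4.0 * R2 - 6.0) * np.exp(-R2)
err = np.max(np.abs(rho - rho_exact)[1:-1, 1:-1, 1:-1])
print(f"max error of the finite-difference Laplacian on interior points: {err:.3e}")
```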

The absence of magnetic monopoles corresponding to point charges makes magnetics different from electrics in the formation of electromagnetics.  

 

Thursday, April 25, 2024

Temperature as Quality Measure of Energy.

In ideal gas dynamics temperature appears as an intensive variable $T$ connected to internal energy $e$ and density $\rho$ by 

  • $T=\frac{e}{\rho}$                          
with a corresponding pressure law 
  • $p=\gamma e$
where $\gamma$ is a gas constant. Internal energy is viewed as small scale kinetic energy from small scale molecular motion. Internal energy can be transformed into mechanical work in expansion, which without external forcing (or gravitation) is an irreversible process.

For a solid body viewed as a vibrating atomic lattice temperature scales with total internal energy as the sum of small scale kinetic energy and potential energy, which can be transferred by radiation and conduction to a body of lower temperature.   

In both cases temperature appears as a quality measure of internal energy as an intensive variable. 

The maximal efficiency of a Carnot heat engine transforming heat energy into work operating between two temperatures $T_{hot}>T_{cold}$ is equal to $1-\frac{T_{cold}}{T_{hot}}$. 

Radiative heat transfer from a hot body of temperature $T_{hot}$ to a cold body of temperature $T_{cold}$ scales with $(T_{hot}^4-T_{cold}^4)$ according to Stefan-Boltzmann-Planck.

Conductive heat transfer scales with $(T_{hot}-T_{cold})$ according to Fourier.

In both cases the heat transfer from hot to cold can be seen as transformation from high quality energy into low quality energy in an irreversible process in conformity with the 2nd Law of Thermodynamics. 

The Nobel Prize in Physics in 2006 was awarded for the experimental detection of Cosmic Microwave Background CMB radiation with perfect Planck spectrum as an after-glow of a Big Bang, with a temperature of 2.725 K and correspondingly very low quality energy.

With radiation scaling with $T^4$, the ratio between 3 K as deep space CMB and 300 K as global temperature comes out as a factor of $10^{-8}$. The contribution to global warming from CMB thus appears to be very small.
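A quick numerical check of the two scalings above, using the quoted temperatures 300 K and 3 K:

```python
# Carnot efficiency between two temperatures, and the Stefan-Boltzmann T^4 ratio
# between the ~300 K surface of the Earth and the ~3 K cosmic background.
T_hot, T_cold = 300.0, 3.0

carnot = 1.0 - T_cold / T_hot
radiative_ratio = (T_cold / T_hot) ** 4          # ratio of T^4 radiation intensities

print(f"Carnot efficiency between 300 K and 3 K: {carnot:.2%}")
print(f"(T_cold/T_hot)^4 = {radiative_ratio:.0e}  -> CMB contribution is tiny")
```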

We see from $e=\rho T$ that low density and low temperature both connect to low energy quality making both wind and solar energy inefficient compared to fossil and nuclear energy.    


Cosmic Microwave Background Radiation vs Cosmic Inflation

Cosmic Microwave Background Radiation CMB is supposed to be an afterglow of a Big Bang which started with Cosmic Inflation, a theory proposed by theoretical physicist Alan Guth of an extremely rapid expansion from a Universe of the size of a proton to the size of a pea, taking place during the period from $10^{-36}$ to $10^{-32}$ seconds after zero time with an expansion factor of $10^{13}$.

A common view is that Alan Guth's theory solves all three main problems of cosmology: the horizon problem, the flatness problem and the magnetic monopole problem. 

The Nobel Prize in Physics 2023 was awarded for experimental methods that generate attosecond pulses of light for the study of electron dynamics in matter, with an attosecond = $10^{-18}$ second. 

Visible light has a time scale of $10^{-15}$ seconds, x-rays $10^{-18}$ and $\gamma$-rays $10^{-22}$ seconds, as real physics with the highest frequency presently known. Frequency is connected to energy through Planck's Law, which allows determining the frequency of $\gamma$-rays by measuring the energy of $\gamma$-radiation.

Cosmic Inflation is described as follows in popular form:

  • According to the theory, for less than a millionth of a trillionth of a trillionth of a second after the universe's birth, an exotic form of matter exerted a counterintuitive force: gravitational repulsion. 
  • Guth says the existence of this material was reasonably likely.
  • Guth says that we don’t necessarily expect to answer those questions next year, but anything that makes small steps towards understanding the answers is thrilling.
If you feel that you need more information to be able to judge if Cosmic Inflation is a hoax, you can consult the following book by Guth:  The Inflationary Universe: The Quest For A New Theory Of Cosmic Origins.

The present inflation in Sweden of 10% appears to be pretty small when compared to cosmic inflation.


Wednesday, April 24, 2024

How to Measure Temperature

Measuring temperature accurately is a delicate procedure.

This is a comment on the discussion in recent posts of the proclaimed perfect blackbody spectrum of Cosmic Microwave Background CMB radiation with temperature 2.725 K.

You can measure your body temperature by body contact with a mercury thermometer or at a distance by an infrared thermometer. Both work on a principle of thermal equilibrium between source and thermometer sensor as a stable state over time. Your body is assigned the temperature recorded by the thermometer.

Temperature can be seen as a measure of energy in the form of heat energy or vibrational energy of a vibrating system like an atomic lattice as the generator of radiation as radiative heat transfer.

Computational Blackbody Radiation offers a new analysis of radiative heat transfer using classical wave mechanics as a deterministic form of Planck's analysis based on statistics of quanta. The basic element of the analysis is a radiation spectrum from a vibrating atomic lattice: 

  • $E(\nu ,T)=\gamma T\nu^2$ for $\nu \le \frac{T}{h}$        (1a)
  • $E(\nu ,T)= 0$ for $\nu >\frac{T}{h}$                               (1b)
where $\nu$ is frequency on an absolute time scale, $T$ is temperature on a lattice specific energy scale, $\gamma$ and $h$ are lattice specific parameters and $\frac{T}{h}$ is a corresponding high-frequency cut-off setting an upper limit to the frequencies being radiated. Here a common temperature $T$ for all frequencies expresses thermal equilibrium between frequencies.

It is natural to define a blackbody BB to have a radiation spectrum of the form (1) with maximal $\gamma$ and maximal high-frequency cut-off, and to use this as a universal thermometer measuring the temperature of different bodies by thermal equilibrium.

Consider then a vibrating atomic lattice A with spectrum according to (1) with different parameters $\bar\gamma <\gamma$ and $\bar h >h$ and a different temperature scale $\bar T$, to be in equilibrium with the universal thermometer. The radiation law (1) then implies, assuming that A is perfectly reflecting for frequencies above its own cut-off:
  • $\bar\gamma \bar T = \gamma T$                                         (2)
to serve as the connection between the temperature scales of BB and A. This gives (1) a form of universality with a universal $\gamma$ reflecting the use of a BB as a universal thermometer.
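As a numerical aside, a sketch of the model (1a)-(1b) with illustrative parameter values $\gamma = h = 1$: integrating the spectrum over frequency up to the cut-off $T/h$ gives a total radiation $\gamma T^4/(3h^3)$, a Stefan-Boltzmann-like $T^4$ law.

```python
import numpy as np

# Total radiation of the model spectrum (1a)-(1b): E(nu, T) = gamma*T*nu^2 for
# nu <= T/h, integrated over frequency. Analytically this is gamma*T^4/(3*h^3),
# a T^4 law. Parameter values gamma = h = 1 are illustrative only.
gamma, h = 1.0, 1.0

def total_radiation(T, n=100_000):
    nu = np.linspace(0.0, T / h, n)          # frequencies up to the cut-off T/h
    dnu = nu[1] - nu[0]
    return np.sum(gamma * T * nu**2) * dnu   # Riemann sum of gamma*T*nu^2

for T in (1.0, 2.0, 3.0):
    numeric = total_radiation(T)
    exact = gamma * T**4 / (3.0 * h**3)
    print(f"T = {T}: integral = {numeric:.4f},  gamma*T^4/(3 h^3) = {exact:.4f}")
```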

In reality the abrupt cut-off at the radiation maximum is replaced by a gradual decrease to zero over some frequency range, as a case-specific post-maximum part of the spectrum. A further case-specific element is non-perfect reflectivity above cut-off. Thermal equilibrium according to (2) is thus an ideal case.

In particular, different bodies at the same distance to the Sun can take on different temperatures in thermal equilibrium with the Sun. Here the high-frequency part of the spectrum comes in as well as the route from non-equilibrium to equilibrium. 

Why CMB can have a perfect blackbody spectrum is hidden in the intricacies of the sensing. It may well reflect man-made universality. 

Monday, April 22, 2024

Man-Made Universality of Blackbody Radiation 2

Man-made Universality of Shape

This is a clarification of the previous post on the perfect Planck blackbody spectrum of the Cosmic Microwave Background Radiation CMB as a 14 billion year afterglow of the Big Bang, the leading narrative of cosmology today. See also this recent post and this older illuminating post.

The Planck spectrum, as the spectrum of an ideal blackbody, takes the form
  • $E(\nu ,T) =\gamma T\nu^2\times C(\nu ,T)$                                         (1)
where $E (\nu ,T)$ is radiation intensity depending on frequency $\nu $ and temperature $T$, $\gamma$ a universal constant, and $C(\nu ,T)$ is a universal high frequency cut-off function of the specific form 
  • $C(\nu ,T)=\frac{x}{\exp(x)-1}$ with $x = \frac{\nu}{T}\times\alpha$       (2)
where $\alpha =\frac{h}{k}$ with $h$ Planck's constant and $k$ Boltzmann's constant as another universal constant, with the property that 
  • $C(\nu ,T)\approx 1$ for $x<<1$ and $C(\nu ,T)\approx 0$ for $x>>1$.  
We see that the radiation intensity, proportional to $T$, increases quadratically with $\nu$ in accordance with deterministic wave mechanics, and reaches a maximum shortly before a cut-off scaling with $T$ in accordance with statistics of energy quanta, which kicked off the idea of atom physics as quantum mechanics also based on statistics.
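A minimal numerical sketch of (1)-(2), with illustrative constants $\gamma =\alpha =1$: combining them gives $E\sim\nu^3/(\exp(\alpha\nu /T)-1)$, whose peak frequency scales linearly with $T$ (Wien displacement), the peak sitting at $\alpha\nu/T\approx 2.82$.

```python
import numpy as np

# Planck-type spectrum from (1)-(2): E(nu, T) = gamma*T*nu^2 * x/(exp(x)-1) with
# x = alpha*nu/T, which combines to gamma*alpha*nu^3/(exp(alpha*nu/T) - 1).
# The peak frequency scales linearly with T (Wien): alpha*nu_peak/T ~ 2.82.
# Constants gamma = alpha = 1 are illustrative only.
gamma, alpha = 1.0, 1.0

def planck(nu, T):
    x = alpha * nu / T
    return gamma * T * nu**2 * x / np.expm1(x)   # expm1 keeps the low-frequency limit clean

for T in (1.0, 2.0, 4.0):
    nu = np.linspace(1e-6, 20.0 * T, 200_000)
    nu_peak = nu[np.argmax(planck(nu, T))]
    print(f"T = {T}: nu_peak = {nu_peak:.3f},  alpha*nu_peak/T = {alpha * nu_peak / T:.3f}")
```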

Computational Blackbody Radiation offers a different version of high frequency cut-off, motivated by finite precision physics/computation instead of statistics of quanta, opening to a deterministic form of atom physics as real quantum mechanics. The underlying physics model in both cases is that of an atomic lattice capable of generating a continuous spectrum of vibrational frequencies.

The basic assumptions behind a Planck spectrum as an ideal are:
  1. Model: Atomic lattice.
  2. Equilibrium: All frequencies take on the same temperature.
  3. High-frequency universal cut-off: Statistics of energy quanta.  
Observations show that most real blackbody spectra deviate substantially from the Planck spectrum and so have their own signature, reflecting a specific atomic lattice, non-equilibrium and a specific high frequency cut-off lower than the ideal. Graphite is just about the only substance showing a Planck spectrum.

This was not welcomed by physicists in search of universality, and so the idea was born of determining the spectrum of a given material/body by putting it inside an empty box with graphite walls and measuring the resulting radiation peeping out from a little hole in the box, which not surprisingly turned out to be a graphite Planck blackbody spectrum.

Universality of radiation was then established in the same way as universality of shape can be attained by cutting everything into cubical shape, as was done by the brave men cutting paving stones out of the granite rocks of the West Coast of Sweden, which is nothing but man-made universality.

The line spectrum of a gas is even further away from a blackbody spectrum. The idea of CMB as an afterglow of a young Universe gas cloud with a perfect Planck blackbody spectrum, as measured by the FIRAS instrument on the COBE satellite, serves as a cornerstone of current Big Bang + Inflation cosmology.

It is not far-fetched to suspect that also the COBE spectrum is man-made, and then also Big Bang + Inflation.

Saturday, April 20, 2024

Can Cosmic Microwave Background Radiation be Measured, Really?

The Cosmic Microwave Background radiation CMB is supposed to be a 14 billion year after-glow, with perfect Planck blackbody spectrum at temperature $T=2.725$ Kelvin K, of a Universe at $T=3000$ K dating back to 380,000 years after the Big Bang. The apparent 1000-fold temperature drop from 3000 to 3 K is supposed to be the result of an expansion and not of cooling.

To get an idea of the magnitude of CMB let us recall that a Planck spectrum at temperature $T$ stretches over frequencies $\nu\sim T$ and reaches maximum radiation intensity $E\sim T^3$ near the end, with a high frequency cut-off over an interval $\frac{\nu}{T}\sim 1$ (notice the exponential scale):



 

The $10^3$-fold temperature drop thus corresponds to a $10^9$-fold decrease of maximum intensity and a $10^3$-fold decrease in spectrum width. Intensity over width decreases by a factor $10^6$, as a measure of precision in peak frequency.
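In numbers, under the Planck scaling just described (peak intensity $\sim T^3$, spectral width $\sim T$):

```python
# Ratios between a 3000 K spectrum and a 3 K spectrum under Planck scaling:
# peak intensity scales as T^3 and spectral width as T.
T_emission, T_cmb = 3000.0, 3.0

ratio = T_emission / T_cmb
print(f"temperature drop:     {ratio:.0e}")        # 1e+03
print(f"peak intensity drop:  {ratio**3:.0e}")     # 1e+09
print(f"spectral width drop:  {ratio:.0e}")        # 1e+03
print(f"intensity over width: {ratio**2:.0e}")     # 1e+06
```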

We understand that to draw conclusions concerning a 3000 K spectrum from a measured 3 K spectrum requires a very high precision, on the level of microKelvin or 0.000001 K. Is this really possible? Is it possible to reach the precision of 2.725 K from the intensity maximum?

Why is modern physics focussed on measuring quantities which cannot be measured, like ghosts?

CMB was first detected as noise, maybe from birds visiting the antenna, but the noise persisted even after the antenna was cleaned, and then the conclusion was drawn that CMB must be a left-over from the Big Bang 14 billion years ago and not from any birds of today. Big Bang is physics, while birds are ornithology.

Friday, April 19, 2024

The Ultra-Violet Catastrophe vs 2nd Law of Thermodynamics


Classical physics peaked in the late 19th century with Maxwell's equations aiming to describe all of electromagnetics as a form of continuum wave mechanics, but crumbled when confronted with the Ultra-Violet Catastrophe UVC of heat radiation from a body of temperature $T$ scaling like $T\nu^2$ with frequency $\nu$, threatening to turn everything into flames in the absence of an upper bound on frequencies, because wave mechanics did not seem to offer any escape from UVC.

Planck took on the role of saving physics from the looming catastrophe, but could not find a resolution within deterministic wave mechanics and so finally gave up and resorted to statistical mechanics with high frequencies less likely, in the spirit of Boltzmann's thermodynamics and 2nd Law with order less likely than disorder.

There is thus a close connection between UVC and the 2nd Law. Boltzmann would say that the reason we do not experience UVC is that high frequencies are not likely, but the physics of why is missing. Explaining that UVC is not likely would not explain why there is no observation of UVC whatsoever.

I have followed a different route replacing statistics by finite precision physics for UVC (and similarly for 2nd Law), where high frequencies with short wave length cannot be radiated because finite precision sets a limit on the frequencies an atomic lattice can carry as coordinated synchronised motion. In this setting UVC can never occur.

A basic mission of the 2nd Law is thus to prevent UVC. This gives the 2nd Law a deeper meaning as a necessary mechanism preventing too fine structures/high frequencies from appearing and causing havoc. The 2nd Law is thus not a failure to maintain order over time, but a necessary mechanism to avoid catastrophe from too much order.

Similarly, viscosity and friction appear as necessary mechanisms destroying fine structure/order in order to let the World continue, and so not only as defects of an ideal physics without viscosity and friction. This is the role of turbulence as described in Computational Turbulent Incompressible Flow and Computational Thermodynamics.

We can compare with the role of the interest rate in an economy, where the zero interest rate of an ideal economy leads to catastrophe over time. If there is no cost of getting access to capital, any crazy mega project can get funding and catastrophe will follow. This was the idea of 2008-2023, preceding the collapse predicted for 2025. Too little friction makes the wheels turn too fast. Too much idealism leads to ruin.

torsdag 18 april 2024

The Secret of Radiative Heat Transfer vs CMB and Big Bang

A main challenge to physicists at the turn to modernity 1900 was to explain radiative heat transfer as the process of emission, transfer and absorption of heat energy by electromagnetic waves described by Maxwell's equations. The challenge was to explain why real physics avoids an ultra-violet catastrophe with radiation intensity going to infinity with increasing frequency beyond the visible spectrum. 

More precisely, the challenge was to uncover the physics of a blackbody spectrum with radiation intensity scaling as $T\nu^2$ for temperature $T$ and frequencies $\nu\le\nu_{max}$, where $\nu_{max}$ is a cut-off frequency scaling with $T$, and with intensity quickly falling to zero above the cut-off. 

Planck, as the leading physicist of the German Empire, took on the challenge and after much struggle came up with an explanation based on statistics of energy, taking the above form as Planck's Law, which has served into our time as a cover-up of a failure to explain a basic phenomenon in physical terms. 

Computational Blackbody Radiation offers an explanation in terms of finite precision physics setting a cut-off (scaling with temperature) on the frequency of emission from coordinated oscillations of an atomic lattice, with uncoordinated atomic motion stored as heat energy.

In this analysis heat is transferred from a body of higher temperature  to a body of lower temperature through a resonance phenomenon analogous to the resonance between two tuning forks. The essence can be described in terms of a  forced acoustically weakly damped harmonic oscillator:

  • $\dot v(t)+\nu^2u(t)+\gamma v(t)=f(t)=\sin(\bar\nu t)$ for $t>0$     (1)
where $u(t)$ is displacement at time $t$, $v(t)=\dot u(t)$ is velocity, the dot represents derivative with respect to time $t$, $\nu$ is the frequency of the harmonic oscillator and $\bar\nu\approx\nu$ that of the forcing. For radiation the damping term takes the form $\gamma\ddot v(t)$. 

Mathematical analysis shows, assuming small damping with $\gamma \ll 1$, near resonance with $\nu\approx\bar\nu$, and integration over a period:
  • $Output = \gamma \int v^2(t)\,dt \approx \int f^2(t)\,dt = Input$     (2)
  • Velocity $v(t)$ is out of phase with $f(t)$.     (3)
Even if it looks innocent, (2) represents the essence of Planck's Law, with (3) expressing basic physics: out of phase means that the interaction between forcing and oscillator corresponds to a "pumping motion", with the forcing balanced mainly by the harmonic oscillator itself and not by the damping. In the acoustic case $T=\int v^2(t)dt$ and thus $Output =\gamma T$, which in the case of radiation takes the form $Output = \gamma T\nu^2$, or Planck's Law. 
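Here is a minimal numerical sketch of (1)-(3) (an illustration, not the analysis itself), where "near resonance" is taken as a detuning of order $\sqrt{\gamma}$, an assumption made only for this example; with that choice the dissipated output comes out close to the forcing input while the velocity stays nearly 90 degrees out of phase with the forcing:

```python
# Sketch of the forced, weakly damped oscillator (1):
#   dv/dt + nu^2*u + gamma*v = f(t) = sin(nubar*t),   v = du/dt.
# With gamma << 1 and a detuning of order sqrt(gamma), the dissipated
# Output = gamma*int v^2 dt stays close to Input = int f^2 dt, as in (2),
# while v(t) is nearly 90 degrees out of phase with f(t), as in (3).
import math

nu, gamma = 10.0, 0.01
nubar = 9.95                         # detuning ~ sqrt(gamma)*nu (assumption)
dt, t_end = 1.0e-3, 2000.0
u = v = 0.0
out_int = in_int = cross = 0.0

t = 0.0
while t < t_end:
    f = math.sin(nubar * t)
    v += dt * (f - nu**2 * u - gamma * v)   # semi-implicit Euler step for v
    u += dt * v                             # then update the displacement
    if t > t_end / 2:                       # skip the initial transient
        out_int += dt * gamma * v * v       # dissipated "Output"
        in_int  += dt * f * f               # forcing "Input"
        cross   += dt * f * v               # in-phase overlap of v with f
    t += dt

print("Output/Input  =", out_int / in_int)                              # close to 1
print("phase overlap =", cross / math.sqrt(out_int / gamma * in_int))   # ~0.1, nearly out of phase
```

Running this prints an Output/Input ratio close to 1 and a small phase overlap, consistent with (2) and (3).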

Sum up:
  • Radiative balance between two bodies of equal temperature is expressed by (2).
  • Heating of a body B1 of lower temperature by a body B2 of higher temperature comes from frequencies above the cut-off for B1.  
  • The high-frequency cut-off is an effect of finite precision physics, not of statistics.
  • A blackbody spectrum is continuous (all frequencies) and requires an atomic lattice. 
  • A gas has a line spectrum with selected frequencies, which is not a blackbody spectrum.
  • Cosmic Microwave Background radiation as a perfect blackbody spectrum of an after-glow of Big Bang without an atomic lattice appears very speculative, with Big Bang itself even more speculative, beyond experimental confirmation.  

    tisdag 16 april 2024

    Does a Photon have Temperature?

    The idea about the Cosmic Microwave Background CMB radiation is conveyed to the public by authoritative sources as follows starting at the creation of the Universe with a Big Bang:

    • After about 380,000 years when the Universe had cooled to around 3000 Kelvin,  photons were able to move unhindered through the Universe: it became transparent.
    • Over the intervening 14 billion years, the Universe has expanded and cooled greatly. Due to the expansion of space, the wavelengths of the photons have grown (they have been ‘redshifted’) to roughly 1 millimetre and thus their effective temperature has decreased to just 2.7 Kelvin. 
    • These photons fill the Universe today (there are roughly 400 in every cubic centimetre of space) and create a background glow that can be detected by far-infrared and radio telescopes.
    We meet the idea that photons are moving through space like some form of particles with effective temperature of 2.7 K filling the Universe as an after-glow of Big Bang. 

    But the concept of photon lacks real physics. Light does not consist of a stream of light particles named photons, but is an electromagnetic wave phenomenon and as such can have a frequency and an amplitude/intensity. An emitter of light like the Sun has a temperature, while the light emitted is characterised by its spectrum as intensity vs frequency. A spectrum can give information about the temperature of the emitter, with the Planck spectrum being the spectrum of an ideal blackbody at a certain temperature, in particular with a high-frequency cut-off scaling linearly with temperature. 

    Emitted light can be received by an antenna through resonance recording the frequency. It is also possible to record the temperature of an emitter by connecting the antenna to a form of radiation thermometer reading temperature from radiative equilibrium, in the same way as a common thermometer reads the temperature of a source by direct contact/equilibrium.  

    But it is more difficult to read a spectrum, since properties of emissivity, transmissivity and absorptivity, as well as view angles, enter. In the absence of information a Planck spectrum is often assumed, but most emitters do not have blackbody spectra.

    A Big Bang emitter at 3000 K is thus postulated, with an after-glow received as a blackbody spectrum at 3 K, with frequency reduced and wavelength increased by a factor of 1000 into the far-infrared. 
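For reference, a small check (a sketch of the arithmetic behind the claim as stated): dividing all frequencies by 1000 maps a 3000 K Planck spectrum exactly onto a 3 K Planck spectrum.

```python
# Sketch: a Planck spectrum emitted at 3000 K, with all frequencies divided by 1000
# (spectral radiance rescaled by 1000^3), coincides with a 3 K Planck spectrum.
import numpy as np

h, k, c = 6.62607015e-34, 1.380649e-23, 2.99792458e8

def planck(nu, T):
    """Planck spectral radiance B_nu(T) in W/(m^2 sr Hz)."""
    return (2 * h * nu**3 / c**2) / np.expm1(h * nu / (k * T))

a = 1000.0                                  # stretch factor
nu = np.logspace(9, 13, 5)                  # a few received frequencies [Hz]
received = planck(a * nu, 3000.0) / a**3    # emitted at 3000 K, stretched by a
direct = planck(nu, 3.0)                    # a 3 K blackbody observed directly
print(np.allclose(received, direct, rtol=1e-9, atol=0.0))   # True
```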

    What is effectively measured is a combination of temperature and intensity, which shows up as a perfect blackbody spectrum. The message is that this is an after-glow of Big Bang, thus giving evidence for Big Bang: if there is an after-glow, there must have been some glow to start with = Big Bang. More precisely, what is measured, as the antenna sweeps the sky, is variations, which then have to be given a physical meaning as some variability of the Early Universe. 

    The basic idea is thus that photons have been traveling through empty space for 14 billion years under a stretching by a factor of 1000 but no other influence, and that collecting these photons gives a picture of the Early Universe. This appears as a lofty speculation, cleverly designed so as to prevent inspection, because both theory and instrumentation are hidden in mist. Here is the main picture from The Music of the Big Bang by Amedeo Balbi: 


    The source has thus been gone for 14 billion years, while the after-glow still surrounds us and can be measured. This is mind boggling. 

    Let us compare with the picture presented in Computational Blackbody Radiation, where emitter and receiver establish contact by resonance of electromagnetic waves and so take on the same temperature by reaching radiative equilibrium, in the same way as two distant tuning forks can find an equilibrium.

    What about the time delay between emitter and receiver from the finite speed of light? If a light source is switched on, it will take some time before the light reaches a receiver. Is it the same when a light source is switched off? Do you still feel warmed for a while after the fire is dead? What about a solar eclipse? Does it take 8 minutes before we feel the cold? 

    In any case, the connection between Big Bang, which has been gone for 14 billion years, and a proclaimed after-glow, which we can enjoy today from the presence of about 400 photons in every cubic centimetre of space at 3 K, appears as science fiction, to me at least. 

    Radiation as electromagnetic waves needs a source to be sustained over time. If the Big Bang source of CMB disappeared 14 billion years ago, the electromagnetic waves would, so to speak, have a life of their own over a very long time, like a tsunami wave sweeping the Pacific long after the earthquake source has disappeared. Here the ocean acts as a physical medium carrying the energy, while a corresponding medium for electromagnetic waves as an aether has no physical presence. The energy is thus carried by the source, some of which is transmitted to the receiver in resonance. 


    måndag 15 april 2024

    Modern Physics vs Homeopathy

    Modern physics appears as a form of homeopathy in reverse. A main idea of homeopathy is to obtain a major health effect from a very, very diluted form of some substance, the smaller the better. The characteristic of modern physics, rather the opposite, is the identification of a very small effect from a very, very large cause, as in the following key examples with increasing presence in later years (with year of Nobel Prize in Physics):

    • Theoretical and experimental discovery of very small deviation from Newton's mechanics in Einstein's mechanics. (no Prize)
    • Theoretical discovery of the Pauli Exclusion Principle impossible to verify experimentally. (1945)
    • Theoretical discovery of statistical interpretation of wave function impossible to verify experimentally. (1954)
    • Experimental discovery of Microwave Background. (1978)
    • Experimental discovery of very small fluctuation in temperature of Cosmic Microwave Background Radiation from Big Bang. (2006)
    • Theoretical discovery of broken symmetry predicting quarks impossible to verify experimentally. (2008)
    • Experimental discovery of accelerating expansion of the Universe from very weak data. (2011)
    • Experimental discovery of Higgs particle as origin of mass from very weak data. (2013)
    • Experimental discovery of very weak gravitational waves from colliding black holes. (2017)
    • Theoretical discovery that black hole is a prediction of general relativity. (2020)
    • Theoretical discovery that global warming is a prediction of very little atmospheric CO2. (2021)
    • Theoretical discovery of string theory on very very small scales impossible to verify experimentally. (no Prize)
    It may seem that all notable effects have already been discovered, so that only very, very small ones remain to be discovered. The difficulty of connecting a very small effect to a very large cause (or vice versa) is that a very precise theory is needed in order to build a very precise instrument for experimental verification. Without theory, experiment has no objective. Finding a needle in a haystack may be simpler. In addition to experimental discoveries of some vanishingly small effect, we also see Prizes for discoveries of theories beyond experimental verification because the effects are so small.  

    When the Large Hadron Collider proves to be too small to find anything new of significance and public money for an even larger Super Collider cannot be harvested, physicists turn to using the whole Universe as a test bench to find ever smaller effects. There are many things yet to be discovered on scales allowing detection, but this draws little interest from leading physicists focussed on what is infinitely large or infinitesimally small.  

    We may compare with the evaluation by Napoleon of the work in his administration of the mathematician Laplace as expert on Infinitesimal Calculus: 
    • He wanted to bring the spirit of infinitesimals into administration.



    söndag 14 april 2024

    Cosmic Microwave Background Radiation vs Big Bang?

    This is a continuation of a previous post on the same topic. The European Space Agency ESA sends this message to the people of Europe and the World:

    • The Cosmic Microwave Background (CMB) is the cooled remnant of the first light that could ever travel freely throughout the Universe.
    • Scientists consider it as an echo or 'shockwave' of the Big Bang. Over time, this primeval light has cooled and weakened considerably; nowadays we detect it in the microwave domain.
    More precisely, CMB is reported to be measured by the Far Infrared Absolute Spectrophotometer (FIRAS) instrument on the COBE satellite as a very small temperature variation (18 $\mu K$) over a uniform background of a perfect blackbody spectrum at 2.725 $K$. The main difficulty is to isolate a very weak signal from very far away from more nearby signals, including signals from the Earth's atmosphere and oceans.  

    To understand the technology of the measurement, which is not easy, we take a look at the FIRAS instrument to see what it contains:


     What we see is in particular the following:
    • Sky Horn collecting input from the Sky.
    • Xcal reference blackbody used for calibration of Sky Horn input.
    • Ical reference blackbody for internal calibration.
    • Ical is equipped with two germanium resistance thermometers (GRT).
    • Xcal is monitored by three GRTs.
    • FIRAS = Far Infrared Absolute Spectrophotometer.
    The output of FIRAS consists of:
    • A very small temperature variation of size 0.00002 K over a background of 2.725 K.
    • The measured background spectrum is a perfect Planck blackbody spectrum. 
    The CMB spectrum is thus reported as a perfect Planck blackbody spectrum. But low frequencies in the far infrared spectrum are missing! 

    We see warning signs: 
    • Very high precision is reported!
    • Perfect Planck blackbody spectrum is reported. But far infrared is missing. 
    • Calibration to nearly perfect real blackbodies is made. 
    • Temperature of 3 K from very far reported.  
    • Spectrum as radiative flux is reported (spectrophotometer).
    More understanding comes from plotting the spectrum in terms of frequency:


    We here see that COBE-FIRAS (blue) measures intensity at a maximum around 200 GHz and a bit beyond, for higher frequencies in the cut-off region, while the more essential part of the spectrum in the far infrared is missing. The intensity maximum around 200 GHz according to Planck's law corresponds to a temperature of about 3 K, which however, since the essential part of the spectrum is missing, may as well correspond to a much higher temperature at much lower emissivity.
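A small sketch of the Wien-displacement arithmetic invoked here, relating the peak frequency of a Planck spectrum to a blackbody temperature via $\nu_{max}\approx 2.821\,kT/h$:

```python
# Sketch: Wien's displacement law in frequency form, nu_max ~ 2.821*k*T/h,
# so a peak near 200 GHz corresponds to a blackbody temperature of roughly 3 K.
h = 6.62607015e-34    # Planck constant [J s]
k = 1.380649e-23      # Boltzmann constant [J/K]

def peak_frequency(T):          # Hz, peak of B_nu for a blackbody at T
    return 2.821 * k * T / h

def temperature_from_peak(nu):  # K, inverse relation
    return h * nu / (2.821 * k)

print(peak_frequency(2.725) / 1e9, "GHz")   # ~160 GHz for 2.725 K
print(temperature_from_peak(200e9), "K")    # ~3.4 K for a 200 GHz peak
```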

    In previous posts we have recalled that measurement of temperature is possible by establishing radiative equilibrium between source and instrument/thermometer, but this requires disturbances between source and instrument to be small, which poses a challenge to directly measuring the temperature of CMB from very far away. 

    The alternative in this case is to report temperature from the spectrum. But directly measuring radiative flux/spectrum can be even more challenging. Typically this is done (using bolometers and pyrometers) by measuring temperature, and then computing radiative flux/spectrum using Planck's law under assumptions hard to verify. This makes assessing CMB a very daunting task, resting on a mix of measurement and computation of temperature and radiative flux.
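A minimal sketch of the kind of post-processing described (the emissivity values are chosen only for illustration): a radiative flux is computed from a temperature reading via the Stefan-Boltzmann form of Planck's law, so the result depends directly on the assumed emissivity.

```python
# Sketch: flux computed from a temperature reading and an assumed emissivity.
SIGMA = 5.670374419e-8      # Stefan-Boltzmann constant [W/(m^2 K^4)]

def radiative_flux(T, emissivity=1.0):
    """Hemispherical flux of a graybody at temperature T (emissivity assumed)."""
    return emissivity * SIGMA * T**4

print(radiative_flux(2.725))         # ~3.1e-6 W/m^2 for an ideal 2.725 K blackbody
print(radiative_flux(300.0, 0.9))    # ~413 W/m^2 for a 300 K surface at emissivity 0.9
```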

    The scenario is thus:
    • If a correct full spectrum is measured, a temperature can be determined from the frequency of maximal intensity. 
    • If only temperature is given, determining the spectrum as radiative flux intensity requires post-processing. 
    • A measured/computed temperature of 3K attributed to a very far away source may be misleading.
    • Robitaille suggests that the true origin of the 3 K CMB is the oceans of the Earth at 300 K.  
    To sum up, we have on the table: 
    1. Very speculative Big Bang BB.
    2. CMB with questionable credibility, maybe noise from the Earth's oceans.  
    The argument by mainstream physicists/cosmologists is now that since the main role of CMB is to serve as the main evidence of Big Bang, and CMB serves this role in such an excellent way, CMB gains credibility by being connected to something very big. BB thus supports CMB, which in turn supports BB. 

    One possibility is then that both BB and CMB are real phenomena. The other possibility is that both are free speculations by scientists in search of a mission. What is your impression? 

    PS Has COBE-FIRAS detected the same thing as WMAP and PLANCK further away from the Earth:


    Which picture is most credible? The more details, the more credible? What happens with small details over time according to the 2nd Law?