måndag 31 mars 2014

Planck's Constant = Human Convention Standard Frequency vs Electronvolt


The recent posts on the photoelectric effect exhibit Planck's constant $h$ as a conversion standard between the unit of light frequency $\nu$ in $Hz = 1/s$ as periods per second and the electronvolt ($eV$), expressed in Einstein's law of photoelectricity:
  • $h\times (\nu -\nu_0) = eU$, 
where $\nu_0$ is the smallest frequency producing a photoelectric current, $e$ is the charge of the electron and $U$ the stopping potential in Volts ($V$) for which the current is brought to zero for $\nu > \nu_0$. Referring to Lenard's 1902 experiment with $\nu -\nu_0 = 1.03\times 10^{15}\, Hz$, corresponding to the ultraviolet limit of the solar spectrum, and $U = 4.3\, V$, Einstein obtained
  • $h = 4.17\times 10^{-15} eVs$
to be compared with the reference value $4.135667516(91)\times 10^{-15}\, eVs$ used in Planck's radiation law. We see that here $h$ occurs as a conversion standard between Hertz ($Hz$) and electronvolt ($eV$) with
  • $1\, Hz  = 4.17\times 10^{-15}\, eV$ 
To connect to quantum mechanics, we recall that Schrödinger's equation is normalized with $h$ so that the first ionization energy of Hydrogen at frequency $\nu = 3.3\times 10^{15}\, Hz$ equals $13.6\, eV$, to be compared with $3.3\times 4.17 = 13.76\, eV$ corresponding to Lenard's photoelectric experiment. 
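The conversion arithmetic above can be checked in a few lines of Python, a sketch using only the numbers quoted in this post:

```python
# Planck's constant as a conversion standard between Hz and eV,
# computed from the numbers quoted above for Lenard's experiment.
delta_nu = 1.03e15       # nu - nu0 in Hz
U = 4.3                  # stopping potential in Volt
h_eVs = U / delta_nu     # Einstein's law h*(nu - nu0) = e*U gives h = U/(nu - nu0) in eVs
print(f"h = {h_eVs:.3e} eVs")    # about 4.17e-15 eVs

# Hydrogen ionization at nu = 3.3e15 Hz then converts to roughly 13.8 eV,
# to be compared with the calibrated value 13.6 eV.
nu_H = 3.3e15
print(f"E = {h_eVs * nu_H:.2f} eV")
```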

We understand that Planck's constant $h$ can be seen as a conversion standard between light energy measured by frequency and electron energy measured in electronvolts. The value of $h$ can then be determined by photoelectricity and thereafter calibrated into Schrödinger's equation to fit with ionization energies as well as into Planck's law as a parameter in the high-frequency cut-off (without a very precise value).  The universal character of $h$ as a smallest unit of action is then revealed to simply be a human convention standard without physical meaning. What a disappointment!

  • Planck's constant was introduced as a fundamental scale in the early history of quantum mechanics. We find a modern approach where Planck's constant is absent: it is unobservable except as a constant of human convention.
Finally: It is natural to view frequency $\nu$ as a measure of energy per wavelength, since radiance as energy per unit of time scales with $\nu\times\nu$ in accordance with Planck's law, which can be viewed as $\nu$ wavelengths, each of energy $\nu$, passing a specific location per unit of time. We thus expect to find a linear relation between frequency and electronvolt as two energy scales: If 1 € (Euro) is equal to 9 Skr (Swedish Crowns), then 10 € is equal to 90 Skr.

söndag 30 mars 2014

Photoelectricity: Millikan vs Einstein















The American physicist Robert Millikan received the Nobel Prize in 1923 for (i) experimental determination of the charge $e$ of the electron and (ii) experimental verification of Einstein's law of photoelectricity, which had been awarded the 1921 Prize.

Millikan started out his experiments on photoelectricity with the objective of disproving Einstein's law and in particular the underlying idea of light quanta. To his disappointment Millikan found that according to his experiments Einstein's law was in fact valid, but he resisted by questioning the conception of light-quanta even in his Nobel lecture:
  • In view of all these methods and experiments the general validity of Einstein’s equation is, I think, now universally conceded, and to that extent the reality of Einstein’s light-quanta may be considered as experimentally established. 
  • But the conception of localized light-quanta out of which Einstein got his equation must still be regarded as far from being established. 
  • Whether the mechanism of interaction between ether waves and electrons has its seat in the unknown conditions and laws existing within the atom, or is to be looked for primarily in the essentially corpuscular Thomson-Planck-Einstein conception as to the nature of radiant energy is the all-absorbing uncertainty upon the frontiers of modern Physics.
Millikan's experiments consisted in subjecting a metallic surface to light of different frequencies $\nu$ and measuring the resulting photoelectric current, determining the smallest frequency $\nu_0$ producing a current and the (negative) stopping potential $V$ required to bring the current to zero for frequencies $\nu >\nu_0$. Millikan thus measured $\nu_0$ and $V$ for different frequencies $\nu > \nu_0$ and found a linear relationship between $\nu -\nu_0$ and $V$, which he expressed as
  • $\frac{h}{e}(\nu -\nu_0)= V$,     
in terms of the charge $e$ of the electron, which he had already determined experimentally, and the constant $h$, which he determined to have the value $6.57\times 10^{-34}\, Js$. The observed linear relation between $\nu -\nu_0$ and $V$ could then be expressed as
  • $h\nu = h\nu_0 +eV$    
which Millikan had to admit was nothing but Einstein's law with $h$ representing Planck's constant. 
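Millikan's fitting procedure can be sketched in a few lines of Python. The $(\nu ,V)$ pairs below are illustrative, generated from the linear law itself with slope $\frac{h}{e} = 4.136\times 10^{-15}\, Vs$ and a hypothetical $\nu_0 = 10^{15}\, Hz$; they are not Millikan's actual measurements:

```python
# Hypothetical data generated from V = (h/e)*(nu - nu0), to sketch how a
# least-squares fit of V against nu recovers the slope h/e and thus h.
h_over_e = 4.136e-15   # Vs (assumed true slope)
nu0 = 1.0e15           # Hz (assumed threshold frequency)
nus = [1.5e15, 2.0e15, 2.5e15, 3.0e15]
Vs = [h_over_e * (nu - nu0) for nu in nus]

# least-squares slope of V against nu
n = len(nus)
nu_bar = sum(nus) / n
V_bar = sum(Vs) / n
slope = (sum((x - nu_bar) * (y - V_bar) for x, y in zip(nus, Vs))
         / sum((x - nu_bar)**2 for x in nus))

e = 1.602e-19          # charge of the electron in Coulomb
h = slope * e          # Planck's constant in Js
print(f"h = {h:.3e} Js")   # close to Millikan's 6.57e-34 Js
```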

But Millikan could argue that, after all, the only thing he had done was to establish a macroscopic linear relationship between $\nu -\nu_0$ and $V$, which in itself did not give undeniable evidence of the existence of microscopic light-quanta. What Millikan did was to measure the current for different potentials of the plus pole receiving the emitted electrons under different exposure to light, and thereby discover a linear relationship between frequency $\nu -\nu_0$ and stopping potential $V$, independent of the intensity of the light and the properties of the metallic surface.

By focussing on frequency and stopping potential Millikan could make his experiment independent of the intensity of incoming light and of the metallic surface, and thus capture a conversion between light energy and electron energy of general significance.  

But why then should stopping potential $V$ scale with frequency $\nu - \nu_0$, or $eV$ scale with frequency $h(\nu - \nu_0)$? Based on the analysis on Computational Blackbody Radiation the answer would be that $h\nu$ represents a threshold energy for emission of radiation in Planck's radiation law and $eV$ represents a threshold energy for emission of electrons, none of which would demand light quanta.

   

lördag 29 mars 2014

Einstein: Genius by Definition of Law of Photoelectricity


Einstein opened the door to the brave new world of modern physics in two articles in his 1905 annus mirabilis, one giving humanity a breath-taking entirely new view of space and time through the special theory of relativity, and the other, on photoelectricity, introducing light quanta carried by light particles later named photons, preparing the development of quantum mechanics.

Einstein's science is difficult to understand because it is never clear if the basic postulates of his theories are definitions without physics content, that is, tautologies which are true by semantic construction, or if they are statements about physics which may be true or false depending on reality.

The special theory of relativity is based on the postulate that the speed of light (in vacuum) is the same for all observers independent of motion with constant velocity. With the new definition of the light-second as the length scale to be used by all observers, the speed of light is for all observers equal to one light-second per second, and thus simply a definition or agreement between different observers.

Yet physicists by training firmly believe that the speed of light is constant as a physical fact behind the definition. For Einstein and all modern physicists following in his footsteps, definition and statement about physics come together into one postulate of relativity which can flip back and forth between definition and statement about physics, and thereby ruin any attempt to bring clarity into a scientific discussion. Einstein played this game masterfully by formulating special relativity as a prescription or definition or dictate that different observers are to coordinate observations by the Lorentz transformation. A dictate cannot be false. It can only be disastrous.

Let us now check if Einstein's law of photoelectricity, which gave him the 1921 Nobel Prize in Physics, is also a definition and thus empty of physics content. The law takes the form
  • $h(\nu -\nu_0) =eV$,  
which expresses an energy balance for one electron of charge $e$ being ejected from a certain metallic surface by incoming light of frequency $\nu$ with $\nu_0$ the smallest frequency for which any electrons are ejected and $V$ is the potential required to stop a current of electrons for $\nu > \nu_0$. The relation can be written 
  • $h\nu = h\nu_0 + eV$
expressing a balance of incoming energy $h\nu$ as release energy $h\nu_0$ and electron (kinetic) energy after ejection $eV$ measured by the stopping potential $V$.  

There is one more parameter in the energy balance and that is $h$, which is Planck's constant.
Measuring the stopping potential $V$ for light of different frequencies $\nu$, including determining $\nu_0$, and finding a linear relationship between $\nu -\nu_0$ and $V$, would then allow the determination of a value of $h$ making the law true. This turns out to work and is in fact a standard way of experimentally determining the value of Planck's constant $h$.

In this perspective Einstein's law of photoelectricity comes out as a definition through which the value of $h$ is determined, effectively corresponding to a conversion standard from the dimension Joule of $h\nu$ as light energy to the dimension electronvolt of $eV$ as electron energy, which says nothing about the existence of discrete packets of energy or light quanta.

The physics enters only in the assumed linear relation between $\nu$ and $V$. From the derivation of Planck's law on Computational Blackbody Radiation it is clear that $h\nu$ in the high-frequency cut-off factor $\frac{\alpha}{\exp(\alpha )-1}$ with $\alpha=\frac{h\nu}{kT}$ in Planck's law acts as a threshold value, that is, as a certain quantity $h\nu$ of energy per atomic energy $kT$ required for emission of radiation. This strongly suggests a linear relationship between $\nu$ and $V$, since $V$ also serves as a threshold.
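The threshold character of the cut-off factor can be tabulated directly (a minimal sketch): the factor is close to 1 for $\alpha \ll 1$ and decays sharply for $\alpha > 1$.

```python
import math

def cutoff(alpha):
    """Planck's high-frequency cut-off factor alpha/(exp(alpha)-1)."""
    return alpha / math.expm1(alpha)   # expm1 avoids cancellation for small alpha

# alpha = h*nu/(k*T): essentially no cut-off for h*nu << kT,
# sharp decay for h*nu >> kT -- the threshold behavior discussed above.
for alpha in (0.1, 1.0, 5.0, 10.0):
    print(f"alpha = {alpha:5.1f}  cutoff = {cutoff(alpha):.4f}")
```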

We thus conclude that the general form of Einstein's law of photoelectricity as a linear relationship in an energy balance for each electron between the frequency of incoming light $\nu$ and the stopping potential $V$, naturally comes out from the role of $h\nu$ as threshold value modulo $kT$. 

Once the linear relationship is postulated as physics, the value of $h$ to make the law fit with observation is a matter of definition as effectively determining energy conversion between light energy as $h\nu$ in Joule and electron energy as $eV$ in electronvolt. The quantity $h\nu$ is then a threshold value and not a discrete packet of energy and $\frac{h}{e}$ sets an exchange rate between two different currencies of frequency and stopping potential.

In other words, Einstein received the Nobel Prize for formulating a definition almost empty of physics content. It shows that the concept of a photon as a light particle carrying the discrete packet of energy $h\nu$ is also a definition empty of physics content. 

Another aspect emerging from the above analysis is an expected (and observed) temperature dependence of photoelectricity, which is not expressed in Einstein's law. The release energy is expected to depend on temperature and there is no reason to expect that the stopping potential should compensate so as to make determination of $h$ by photoelectricity independent of temperature.  What is needed is then an extension of Einstein's law to include dependence on temperature.

It remains to sort out the appearance of the parameter $h$ (determined by photoelectricity) in Planck's radiation law and in Schrödinger's equation, which has already been touched in a previous post, but will be addressed in more detail in an upcoming post.

The advantage of using definitions as postulates about physics is that you can be absolutely sure that your physics is correct (but empty). This aspect came out when Einstein confronted with an observation claimed to contradict special relativity, with absolute confidence could say that the observation was wrong:
  • If the facts don't fit the theory, change the facts.
  • Whether you can observe a thing or not depends on the theory which you use.
  • It is the theory which decides what we can observe.
  • What I'm really interested in is whether God could have made the world in a different way; that is, whether the necessity of logical simplicity leaves any freedom at all.
In this form of physics what you see depends on the glasses you put on and not on what you are looking at. In this form of physics the observer decides if Schrödinger's cat is dead or alive by the mere act of looking at the cat, and not the cat itself, even if it has nine lives.

PS1 To view $h\nu$ as a packet of energy carried by a photon is non-physical and confusing for several reasons, one being that radiation intensity as energy per unit of time scales as $\nu^2$, and thus the scaling as $\nu$ of photon energy has to be compensated by a flow of photons per unit of time scaling as $\nu$, with each photon occupying half a wavelength.

PS2 If now Einstein is a genius by definition, there is as little reason to question that as to question that there are 100 centimeters in a meter.


torsdag 27 mars 2014

How to Make Schrödinger's Equation Physically Meaningful + Computable


The derivation of Schrödinger's equation as the basic mathematical model of quantum mechanics is hidden in mystery: The idea is to start from a classical Hamiltonian $H(q,p)$ as the total energy equal to the sum of kinetic and potential energy:
  • $H(q,p)=\frac{p^2}{2m} + V(q)$,
where $q(t)$ is position and $p=m\dot q= m\frac{dq}{dt}$ momentum of a moving particle of mass $m$, and make the formal ad hoc substitution with $\bar h =\frac{h}{2\pi}$ and $h$ Planck's constant:
  • $p = -i\bar h\nabla$ with formally $\frac{p^2}{2m} = - \frac{\bar h^2}{2m}\nabla^2 = - \frac{\bar h^2} {2m}\Delta$, 
to get Schrödinger's equation in time dependent form 
  • $ i\bar h\frac{\partial\psi}{\partial t}=H\psi $,
with now $H$ a differential operator acting on a wave function $\psi (x,t)$ with $x$ a space coordinate and $t$ time, given by 
  • $H\psi \equiv -\frac{\bar h^2}{2m}\Delta \psi + V\psi$,
where now $V(x)$ acts as a given potential function. As a time independent eigenvalue problem Schrödinger's equation then takes the form:
  •  $-\frac{\bar h^2}{2m}\Delta \psi + V\psi = E\psi$,
with $E$ an eigenvalue, as a stationary value for the total energy
  • $K(\psi ) + W(\psi )\equiv\frac{\bar h^2}{2m}\int\vert\nabla\psi\vert^2\, dx +\int V\psi^2\, dx$, 
as the sum of kinetic energy $K(\psi )$ and potential energy $W(\psi )$, under the normalization $\int\psi^2\, dx = 1$. The ground state then corresponds to minimal total energy.

We see that the total energy $K(\psi ) + W(\psi)$ can be seen as a smoothed version of $H(q,p)$ with
  • $V(q)$ replaced by $\int V\psi^2\, dx$,
  • $\frac{p^2}{2m}=\frac{m\dot q^2}{2}$ replaced by  $\frac{\bar h^2}{2m}\int\vert\nabla\psi\vert^2\, dx$,
and Schrödinger's equation as expressing stationarity of the total energy, as an analog of the classical equations of motion expressing stationarity of the Hamiltonian $H(p,q)$ under variations of the path $q(t)$.

We conclude that Schrödinger's equation for a one electron system can be seen as a smoothed version of the equation of motion for a classical particle acted upon by a potential force, with Planck's constant serving as a smoothing parameter. 
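As an illustration of this reading of the equation, the ground state as minimizer of the total energy can be computed by discretizing the one-dimensional eigenvalue problem. The sketch below uses units with $\bar h = m = 1$ and the harmonic potential $V(x)=x^2/2$ (an assumption chosen for illustration, since its exact ground state energy, $0.5$, is known):

```python
import numpy as np

# Discretize H = -(1/2) d^2/dx^2 + V on a uniform grid (hbar = m = 1)
# and find the smallest eigenvalue: the minimizer of K(psi) + W(psi)
# under the normalization constraint.
n, L = 400, 10.0
x = np.linspace(-L/2, L/2, n)
dx = x[1] - x[0]

# -(1/2) * second-difference operator + diagonal potential V(x) = x^2/2
H = (np.diag(np.full(n, 1.0 / dx**2)) -
     np.diag(np.full(n - 1, 0.5 / dx**2), 1) -
     np.diag(np.full(n - 1, 0.5 / dx**2), -1) +
     np.diag(0.5 * x**2))

E = np.linalg.eigvalsh(H)
print(f"ground state energy = {E[0]:.4f}")   # exact value is 0.5
```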

Similarly it is natural to consider smoothed versions of classical many-particle systems as quantum mechanical models resembling Hartree variants of Schrödinger's equation for many-electrons systems, that is quantum mechanics as smoothed particle mechanics, thereby (maybe) reducing some of the mystery of Schrödinger's equation and opening to computable quantum mechanical models.

We see Schrödinger's equation arising from a Hamiltonian as total energy, kinetic energy + potential energy, rather than from a Lagrangian as kinetic energy - potential energy. The reason is a confusing terminology with $K(\psi )$ named kinetic energy even though it does not involve time differentiation, while it more naturally should occur in a Lagrangian as a form of potential energy, like elastic energy in classical mechanics.

onsdag 26 mars 2014

New Paradigm of Computational Quantum Mechanics vs ESS

ESS as European Spallation Source is a €3 billion projected research facility captured by clever Swedish politicians to be allocated to the plains outside the old university town of Lund in Southern Sweden, with start in 2025: Neutrons are excellent for probing materials on the molecular level – everything from motors and medicine, to plastics and proteins. ESS will provide around 30 times brighter neutron beams than existing facilities today. The difference between the current neutron sources and ESS is something like the difference between taking a picture in the glow of a candle, or doing it under flash lighting.

Quantum mechanics was invented in the 1920s under the limits of pen and paper computation, but allowing limitless theory thriving in Hilbert spaces populated by multidimensional wave functions described by fancy symbols on paper. Lofty theory and sparse computation were compensated by inflating the observer role of the physicist into a view that only physics observed by a physicist was real physics, with extra support from a conviction that the life or death of Schrödinger's cat depended more on the observer than on the cat, and that supercolliders are very expensive. The net result was (i) uncomputable limitless theory combined with (ii) unobservable practice as the essence of the Copenhagen Interpretation filling text books.

Today the computer opens to a change from impossibility to possibility, but this requires a fundamental change of the mathematical models, from uncomputable to computable non-linear systems of 3d Hartree-Schrödinger equations (HSE) or Density Functional Theory (DFT). This brings theory and computation together into a new paradigm of Computational Quantum Mechanics (CQM), shortly summarized as follows:
  1. Experimental inspection of microscopic physics difficult/impossible.
  2. HSE-DFT for many-particle systems are solvable computationally. 
  3. HSE-DFT simulation allows detailed inspection of microscopics.
  4. Assessment of HSE simulations can be made by comparing macroscopic outputs with observation. 
The linear multidimensional Schrödinger equation has no meaning in CQM and a new foundation is asking to be developed. The role of observation in the Copenhagen Interpretation is taken over by computation in CQM: Only computable physics is real physics, at least if physics is a form of analog computation, which may well be the case. The big difference is that anything computed can be inspected and observed, which opens to non-destructive testing with only limits set by computational power.

The Large Hadron Collider (LHC) and the projected neutron collider European Spallation Source (ESS) in Lund in Sweden represent the old paradigm of smashing to pieces the fragile structure under investigation and as such may well be doomed.

tisdag 25 mars 2014

Fluid Turbulence vs Quantum Electrodynamics

Horace Lamb (1849 - 1934), author of the classic text Hydrodynamics:
  • It is asserted that the velocity of a body not acted on by any force will be constant in magnitude and direction, whereas the only means of ascertaining whether a body is, or is not, free from the action of force is by observing whether its velocity is constant.

There is a famous quote by the British applied mathematician Horace Lamb summarizing the state of classical fluid mechanics and the new quantum mechanics in 1932 as follows:
  • I am an old man now, and when I die and go to heaven there are two matters on which I hope for enlightenment. One is quantum electrodynamics, and the other is the turbulent motion of fluids. And about the former I am rather optimistic.
Concerning the turbulent motion of fluids I am happy to report that this matter is now largely resolved by computation, as made clear in the article New Theory of Flight, soon to be delivered for publication in the Journal of Mathematical Fluid Mechanics, with lots of supplementary material on The Secret of Flight. This gives good hope that the other problem of quantum electrodynamics can likewise be unlocked by viewing The World as Computation:
  • In a time of turbulence and change, it is more true than ever that knowledge is power. (JFK)

Quantum Physics as Digital Continuum Physics


Quantum mechanics was born in 1900 in Planck's theoretical derivation of a modification of Rayleigh-Jeans law of blackbody radiation based on statistics of discrete "quanta of energy" of size $h\nu$, where $\nu$ is frequency and $h =6.626\times 10^{-34}\, Js$ is Planck's constant.

This was the result of a long fruitless struggle to explain the observed spectrum of radiating bodies using deterministic electromagnetic wave theory, which ended in Planck's complete surrender to statistics as the only way he could see to avoid the "ultraviolet catastrophe" of infinite radiation energies, in a return to the safe haven of his dissertation work of 1889-90 based on Boltzmann's statistical theory of heat.

Planck described the critical step in his analysis of a radiating blackbody as a discrete collection of resonators as follows:
  • We must now give the distribution of the energy over the separate resonators of each frequency, first of all the distribution of the energy $E$ over the $N$ resonators of frequency $\nu$. If $E$ is considered to be a continuously divisible quantity, this distribution is possible in infinitely many ways. 
  • We consider, however (this is the most essential point of the whole calculation) $E$ to be composed of a well-defined number of equal parts and use thereto the constant of nature $h = 6.55\times 10^{-27}\, erg\, sec$. This constant multiplied by the common frequency $\nu$ of the resonators gives us the energy element in $erg$, and dividing $E$ by $h\nu$ we get the number $P$ of energy elements which must be divided over the $N$ resonators. 
  • If the ratio thus calculated is not an integer, we take for $P$ an integer in the neighbourhood. It is clear that the distribution of $P$ energy elements over $N$ resonators can only take place in a finite, well-defined number of ways.
We here see Planck introducing a constant of nature $h$, later referred to as Planck's constant, with a corresponding smallest quanta of energy $h\nu$ for radiation (light) of frequency $\nu$. 

Then Einstein entered in 1905 with a law of photoelectricity with $h\nu$ viewed as the energy of a light quantum of frequency $\nu$, later named photon and crowned as an elementary particle.

Finally, in 1926 Schrödinger formulated a wave equation involving a formal momentum operator $-i\bar h\nabla$ with $\bar h =\frac{h}{2\pi}$ and $h$ Planck's constant, as the birth of quantum mechanics, the incarnation of modern physics, based on postulating that microscopic physics is
  1. "quantized" with smallest quanta of energy $h\nu$,
  2. indeterministic with discrete quantum jumps obeying laws of statistics.
However, microscopics based on statistics is contradictory, since it requires microscopics of microscopics in an endless regression, which has led modern physics into an impasse of ever increasing irrationality into many-worlds and string theory as expressions of scientific regression to microscopics of microscopics. The idea of "quantization" of the microscopic world goes back to the atomism of Democritus, a primitive scientific idea rejected already by Aristotle arguing for the continuum, which however, combined with modern statistics, has ruined physics.  

But there is another way of avoiding the ultraviolet catastrophe without statistics, which is presented on Computational Blackbody Radiation, with physics viewed as analog finite precision computation which can be modeled as digital computational simulation.

This is physics governed by deterministic wave equations with solutions evolving in analog computational processes, which can be simulated digitally. This is physics without microscopic games of roulette as rational deterministic classical physics subject only to natural limitations of finite precision computation.

This opens to a view of quantum physics as digital continuum physics which can bring rationality back to physics. It opens to explore an analog physical atomistic world as a digital simulated world where the digital simulation reconnects to analog microelectronics. It opens to explore physics by exploring the digital model, readily available for inspection and analysis in contrast to analog physics hidden to inspection.

The microprocessor world is "quantized" into discrete processing units, but it is a deterministic world with digital output.




måndag 24 mars 2014

Hollywood vs Principle of Least Action

The fictional character of the Principle of Least Action, viewed as serving a fundamental role in physics, can be understood by comparing with making movies:


The dimension of action as energy x time comes out very naturally in movie making as actor energy x length of the scene. However, outside Hollywood a quantity of dimension energy x time is questionable from a physical point of view, since there seems to be no natural movie camera which can record and store such a quantity.

söndag 23 mars 2014

Why the Same Universal Quantum of Action $h$ in Radiation, Photoelectricity and Quantum Mechanics?


Planck's constant $h$ as The Universal Quantum of Action was introduced by Planck in 1900 as a mathematical statistical trick to supply the classical Rayleigh-Jeans radiation law $I(\nu ,T)=\gamma T\nu^2$ with a high-frequency cut-off factor $\theta (\nu ,T)$ to make it fit with observations including Wien's displacement law, where
  • $\theta (\nu ,T) =\frac{\alpha}{\exp(\alpha )-1}$,
  • $\alpha =\frac{h\nu}{kT}$, 
$\nu$ is frequency, $T$ temperature in Kelvin ($K$), $k =1.38066\times 10^{-23}\, J/K$ is Boltzmann's constant and $\gamma =\frac{2k}{c^2}$ with $c\, m/s$ the speed of light in vacuum. Planck then determined $h$ from experimental radiation spectra to have the value $6.55\times 10^{-34}\, Js$, as well as Boltzmann's constant to be $1.346\times 10^{-23}\, J/K$, with $\frac{h}{k}= 4.87\times 10^{-11}\, Ks$ as the effective parameter in the cut-off.  
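With these values the cut-off factor can be checked against Wien's displacement law: the frequency of maximal radiance scales linearly with $T$, with slope about $2.821\, k/h \approx 5.88\times 10^{10}\, Hz/K$. A sketch (the constant $\gamma$ is dropped since it does not affect the location of the peak):

```python
import math

h = 6.626e-34   # Planck's constant in Js
k = 1.381e-23   # Boltzmann's constant in J/K

def radiance(nu, T):
    """Planck's law in the post's form T*nu^2 * alpha/(exp(alpha)-1),
    with the constant prefactor gamma dropped."""
    alpha = h * nu / (k * T)
    return T * nu**2 * alpha / math.expm1(alpha)

def peak_frequency(T):
    """Locate the frequency of maximal radiance by a coarse grid scan."""
    nus = [i * 1e11 for i in range(1, 10000)]
    return max(nus, key=lambda nu: radiance(nu, T))

# Wien's displacement law: nu_max/T is constant, about 2.821*k/h = 5.88e10 Hz/K
for T in (3000, 6000):
    print(f"T = {T} K  nu_max/T = {peak_frequency(T)/T:.3e} Hz/K")
```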

Planck viewed $h$ as a fictional mathematical quantity without real physical meaning, with $h\nu$ a fictional mathematical quantity as a smallest packet of energy of a wave of frequency $\nu$, but in 1905 the young ambitious Einstein suggested an energy balance for photoelectricity of the form 
  • $h\nu = W + E$,
with $W$ the energy required to release one electron from a metallic surface and $E$ the energy of a released electron, with $h\nu$ interpreted as the energy of a light photon of frequency $\nu$ as a discrete lump of energy. Since the left hand side $h\nu$ in this law of photoelectricity was determined by the value of $h$ in Planck's radiation law, a new energy measure for electrons, the electronvolt, was defined by the relation $W + E = h\nu$. As if by magic, the same Universal Quantum of Action $h$ then appeared to serve a fundamental role in both radiation and photoelectricity.

What a wonderful magical coincidence that the energy of a light photon of frequency $\nu$ showed up to be exactly $h\nu \, Joule$! In one shot Planck's fictional smallest quantum of energy $h\nu$ had, in the hands of the young ambitious Einstein, been turned into reality as the energy of a light photon of frequency $\nu$, and of course, because a photon carries a definite packet of energy, a photon must be real. Voila!

In 1926 Planck's constant $h$ showed up again in a new context, now in Schrödinger's equation
  • $-\frac{\bar h^2}{2m}\Delta\psi = E\psi$
 with the formal connection   
  • $p = -i\bar h \nabla$ with $\bar h =\frac{h}{2\pi}$,
  • $\frac{\vert p\vert^2}{2m} = E$, 
as a formal analog of the classical expression of kinetic energy $\frac{\vert p\vert ^2}{2m}$ with $p=mv$ momentum, $m$ mass and $v$ velocity.

Planck's constant $h$, originally determined to make theory fit with observations of radiation spectra and then by Planck in 1900 canonized as The Universal Quantum of Action, thus in 1905 served to attribute the energy $h\nu$ to the new fictional formal quantity of a photon of frequency $\nu$. In 1926 a similar formal connection was made in the formulation of Schrödinger's wave equation.  

The result is that the same Universal Quantum of Action $h$ is claimed by all modern physicists to play a fundamental role in (i) radiation, (ii) photoelectricity and (iii) quantum mechanics of the atom. This is taken as an expression of a deep mystical one-ness of physics which only physicists can grasp, while in fact it is a play with definitions without mystery, where $h$ appears as a parameter in a high-frequency cut-off factor in Planck's law, or rather in the combination $\hat h =\frac{h}{k}$, and is then transferred into (ii) and (iii) by definition. Universality can this way be created by human hands, by definition. The power of thinking has no limitations, or cut-off.

No wonder that Schrödinger had lifelong interest in the Vedanta philosophy of Hinduism "played out on one universal consciousness".

But Einstein's invention of the photon as light quanta in 1905 haunted him through life and approaching the end in 1954, he confessed:
  • All these fifty years of conscious brooding have brought me no nearer to the answer to the question, "What are light quanta?" Nowadays every Tom, Dick and Harry thinks he knows it, but he is mistaken. 
Real physics always shows up to be more interesting than fictional physics, cf. Dr Faustus of Modern Physics.

PS Planck's constant $h$ is usually measured by (ii) and is then transferred to (i) and (iii) by ad hoc definition.

The Torturer's Dilemma vs Uncertainty Principle vs Computational Simulation


Bohr expressed in Light and Life (1933) the Thanatological Principle stating that to check out the nature of something, one has to destroy that very nature, which we refer to as The Torturer's Dilemma:
  • We should doubtless kill an animal if we tried to carry the investigations of its organs so far that we could describe the role played by single atoms in vital functions. In every experiment on living organisms, there must remain an uncertainty as regards the physical conditions to which they are subjected…the existence of life must be considered as an elementary fact that cannot be explained, but must be taken as a starting point in biology, in a similar way as the quantum of action, which appears as an irrational element from the point of view of classical mechanics, taken together with the existence of the elementary particles, forms the foundation of atomic physics.
  • It has turned out, in fact, that all effects of light may be traced down to individual processes, in which a so-called light quantum is exchanged, the energy of which is equal to the product of the frequency of the electromagnetic oscillations and the universal quantum of action, or Planck's constant. The striking contrast between this atomicity of the light phenomenon and the continuity of the energy transfer according to the electromagnetic theory, places us before a dilemma of a character hitherto unknown in physics.
Bohr's starting point for his "Copenhagen" version of quantum mechanics still dominating text books, was:
  • Planck's discovery of the universal quantum of action which revealed a feature of wholeness in individual atomic processes defying causal description in space and time.
  • Planck's discovery of the universal quantum of action taught us that the wide applicability of the accustomed description of the behaviour of matter in bulk rests entirely on the circumstance that the action involved in phenomena on the ordinary scale is so large that the quantum can be completely neglected.  (The Connection Between the Sciences, 1960)
Bohr thus argued that the success of the notion of a universal quantum of action depends on the fact that it can be completely neglected.

The explosion of digital computation since Bohr's time offers a new way of resolving the impossibility of detailed inspection of microscopics, by allowing detailed non-invasive inspection of computational simulations of microscopics. In this perspective, efforts should be directed towards the development of computable models of microscopics, rather than smashing high-speed protons or neutrons into innocent atoms in order to find out their inner secrets, without getting reliable answers.




Saturday 22 March 2014

The True Meaning of Planck's Constant as Measure of Wavelength of Maximal Radiance and Small-Wavelength Cut-off.


The modern physics of quantum mechanics was born in 1900 when Max Planck, after many unsuccessful attempts, in an "act of despair" introduced a universal smallest quantum of action $h= 6.626\times 10^{-34}\, Js = 4.14\times 10^{-15}\, eVs$, named Planck's constant, in a theoretical justification of the spectrum of radiating bodies observed in experiments, based on statistics of packets of energy of size $h\nu$ with $\nu$ frequency.

Planck describes this monumental moment in the history of science in his 1918 Nobel Lecture as follows:
  • For many years, such an aim for me was to find the solution to the problem of the distribution of energy in the normal spectrum of radiating heat.
  • Nevertheless, the result meant no more than a preparatory step towards the initial onslaught on the particular problem which now towered with all its fearsome height even steeper before me. The first attempt upon it went wrong…
  • So there was nothing left for me but to tackle the problem from the opposite side, that of thermodynamics, in which field I felt, moreover, more confident. 
  • Since the whole problem concerned a universal law of Nature, and since at that time, as still today, I held the unshakeable opinion that the simpler the presentation of a particular law of Nature, the more general it is… 
  • For this reason, I busied myself, from then on, that is, from the day of its establishment, with the task of elucidating a true physical character for the formula, and this problem led me automatically to a consideration of the connection between entropy and probability, that is, Boltzmann's trend of ideas; until after some weeks of the most strenuous work of my life, light came into the darkness, and a new undreamed-of perspective opened up before me.
Planck thus finally succeeded in deriving Planck's radiation law as a modification of the Rayleigh-Jeans law with a high-frequency cut-off factor eliminating "the ultraviolet catastrophe", which had paralyzed physics shortly after the introduction of Maxwell's wave equations for electromagnetics as the culmination of classical physics.

Planck's constant $h$ enters Planck's law
  • $I(\nu ,T)=\gamma \theta (\nu , T)\nu^2 T$, where $\gamma =\frac{2k}{c^2}$,
where $I(\nu ,T)$ is normalized radiance, as a parameter in the multiplicative factor
  • $\theta (\nu ,T)=\frac{\alpha}{e^{\alpha} -1}$, 
  • $\alpha=\frac{h\nu}{kT}$,
where $\nu$ is frequency, $T$ temperature in Kelvin $K$, $k = 1.38\times 10^{-23}\, J/K = 8.62\times 10^{-5}\, eV/K$ is Boltzmann's constant and $c$ the speed of light in $m/s$.

We see that $\theta (\nu ,T)\approx 1$ for small $\alpha$ and enforces a high-frequency small-wavelength cut-off for $\alpha > 10$, that is, for   
  • $\nu > \nu_{max}\approx \frac{10T}{\hat h}$ where $\hat h =\frac{h}{k}=4.8\times 10^{-11}\, Ks$,
  • $\lambda < \lambda_{min}\approx \frac{c}{10T}\hat h$ where $\nu\lambda =c$,
with maximal radiance occurring for $\alpha = 2.821$ in accordance with Wien's displacement law.  With $T = 1000\, K$ the cut-off falls near the visible range, at $\nu\approx 2\times 10^{14}\, Hz$ and $\lambda\approx 10^{-6}\, m$. We see that the relation 
  • $\frac{c}{10T}\hat h =\lambda_{min}$,
gives $\hat h$ a physical meaning as measure of wave-length of maximal radiance and small-wavelength cut-off of atomic size scaling with $\frac{c}{T}$. 
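These scales are easy to reproduce numerically; the following is a minimal Python sketch (using the constants quoted above) evaluating the cut-off factor $\theta$ and the cut-off frequency and wavelength at $T=1000\, K$:

```python
import math

k = 1.3806e-23              # Boltzmann's constant, J/K
h = 6.626e-34               # Planck's constant, Js
h_hat = h / k               # scaled Planck constant, ~4.8e-11 Ks
c = 3.0e8                   # speed of light, m/s

def theta(nu, T):
    """Cut-off factor alpha/(e^alpha - 1) with alpha = h*nu/(k*T)."""
    alpha = h_hat * nu / T
    return alpha / (math.exp(alpha) - 1.0)

T = 1000.0
nu_cut = 10.0 * T / h_hat   # cut-off frequency at alpha = 10
lam_cut = c / nu_cut        # corresponding small-wavelength cut-off
print(theta(0.1 * T / h_hat, T))   # ~1: negligible damping for small alpha
print(theta(nu_cut, T))            # <1e-3: sharp damping at the cut-off
print(nu_cut, lam_cut)             # ~2e14 Hz and ~1.5e-6 m
```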

Modern physicists are trained to believe that Planck's constant $h$ as the universal quantum of action represents a smallest unit of a "quantized" world, with a corresponding Planck length $l_p= 1.62\times 10^{-35}\, m$ as a smallest unit of length, about 20 orders of magnitude smaller than the proton diameter.

We have seen that Planck's constant enters in Planck's radiation law in the form $\hat h =\frac{h}{k}$, and not as $h$, and that $\hat h$ has the role of setting a small-wavelength cut-off scaling with $\frac{c}{T}$.

Small-wavelength cut-off in the radiation from a body is possible to envision in wave mechanics as an expression of finite precision analog computation. In this perspective Planck's universal quantum of action emerges as unnecessary fiction about exceedingly small quantities beyond reason and reality.



Thursday 20 March 2014

Principle of Least Action vs Adam Smith's Invisible Hand

                                     Violation of the PLA of the capitalistic system in 1929.

The Principle of Least Action (PLA) expressing
  • Stationarity of the Action (the integral in time of the Lagrangian), 
with the Lagrangian the difference between kinetic and potential energies, is cherished by physicists as a deep truth about physics: Tell me the Lagrangian and I will tell you the physics. A dynamical system will (by reaction to local forces) evolve so as to keep the Action stationary, as if led by an invisible hand steering the system towards a final cause of least action.

PLA is similar to the invisible hand of Adam Smith, supposedly steering an economy towards a final cause of maximal efficiency or least action (maximal common happiness) by asking each member of the economy to seek to maximize individual profit (individual happiness). This is the essence of the capitalistic system. The idea is that a final cause of maximal efficiency can be reached without telling the members the meaning of the whole thing, just telling each one to seek to maximize his/her own individual profit (happiness).

Today the capitalistic system is shaking and nobody knows how to steer it towards a final cause of maximal efficiency. So the PLA of economy seems to be rather empty of content. It may be that the PLA of physics is similarly void of real physics. In particular, the idea of a smallest quantum of action as a basis of quantum mechanics may well be unphysical.

  

To Per-Anders Ivert, Editor of SMS-Bulletinen

I have sent the following contribution to Bulletinen, the newsletter of the Swedish Mathematical Society (Svenska Matematikersamfundet), prompted by editor Per-Anders Ivert's opening words in the February 2014 issue.

To SMS-Bulletinen

Editor Per-Anders Ivert opens the February issue of Bulletinen with: "Speaking of reactions: they rarely come, but I was made aware of an amusing reaction to something I wrote a few issues ago about whether school mathematics is needed. A bloke ('jeppe') from Chalmers, a person I do not know and believe I have never been in contact with, wrote on his blog":
  • The October issue of the Bulletin of the Swedish Mathematical Society takes up the question of whether school mathematics is "needed".
  • Chairman Per-Anders Ivert opens with: I myself cannot answer what is needed and not needed. It depends on what one means by "needed" and also on what school mathematics looks like.
  • Ulf Persson follows up with a reflection that begins: It seems to be a fact that a large part of the population detests mathematics and finds school mathematics painful.
  • Ivert and Persson express the disorientation, and resulting anguish, that characterizes the mathematician's view of the role of his subject in today's school: The professional mathematician no longer knows whether school mathematics is "needed", and then neither the school mathematics teacher nor the pupil knows it either.
Ivert continues with:
  • "When I saw this I was rather surprised. I thought my quoted words were completely uncontroversial, and I did not quite understand what motivated the sarcasm 'chairman'. This Chalmers player ('Chalmerslirare') probably did not think I was chairman of the Society; it is presumably meant as some allusion to East Asian political structures."
  • "On closer reading I saw, however, that Ulf Persson had criticized this blogger in his text, which apparently had caused a mental short-circuit in the blogger, with associations starting to run criss-cross. If one wants to ponder my 'disorientation and anguish', I offer some material in this issue."
Ivert's remarks about a "bloke at Chalmers" and "Chalmers player" should be seen against the background of the open letter to the Swedish Mathematical Society and the National Committee for Mathematics that I published on my blog on 22 Dec 2013, in which I asked what responsibility the Society and the Committee take for mathematics education in the country, including school mathematics and the ongoing Matematiklyftet.

Despite several reminders I have received no answer, neither from the Society (chairman Pär Kurlberg), nor from the Committee (Torbjörn Lundh), nor from KVA-Matematik (Nils Dencker), and I now put this question once more directly to you, Per-Anders Ivert: if you and the Society have not been struck by any "disorientation and anguish", then you must be able to give an answer and publish it together with this contribution of mine in the next issue of Bulletinen.

Concerning Ulf Persson's contribution under Ordet är mitt (The Word Is Mine), one may say that what counts as regards knowledge is difference in knowledge: what everybody knows is of little interest. A school that primarily aims at giving everybody a common base of knowledge, whatever that may be, has difficulty motivating its pupils and is devastating both for the many who do not reach the common goals and for the somewhat fewer who could perform much better. As long as Euclidean geometry and Latin were reserved for a small part of the pupils, motivation could be created and study goals attained, fairly independently of the intellectual capacity and social background of pupils (and teachers). Matematiklyftet, which is supposed to lift everybody, is an empty swing in the air at great cost.

The epithets applied to my person in Bulletinen have now been extended from "the Johnson gang" ("Johnsonligan") to "bloke at Chalmers" and "Chalmers player", the latter perhaps no longer so current since I moved to KTH 7 years ago. Per-Anders deplores linguistic degradation, but that apparently does not include "jeppe", "lirare" and "mental short-circuit".

Claes Johnson
professor emeritus of applied mathematics, KTH

Wednesday 19 March 2014

Lagrange's Biggest Mistake: Least Action Principle Not Physics!

The reader will find no figures in this work. The methods which I set forth do not require either constructions or geometrical or mechanical reasonings: but only algebraic operations, subject to a regular and uniform rule of procedure. (Preface to Mécanique Analytique)

The Principle of Least Action, formulated by Lagrange in his monumental treatise Mécanique Analytique (1811) collecting 50 years of work, is viewed as the crown jewel of the Calculus of Newton and Leibniz as the mathematical basis of the scientific revolution:
  • The equations of motion of a dynamical system are the same equations that express that the action, the integral over time of the difference between kinetic and potential energies, is stationary, that is, does not change under small variations.
The basic idea goes back to Leibniz:
  • In change of motion, the quantity of action takes on a Maximum or Minimum. 
And to Maupertuis (1746):
  • Whenever any action occurs in nature, the quantity of action employed in this change is the least possible.
In mathematical terms, the Principle of Least Action expresses that the trajectory $u(t)$ followed by a dynamical system over a given time interval $I$ with time coordinate $t$, is determined by the condition of stationarity of the action:
  • $\frac{d}{d\epsilon}\int_I(T(u(t)+\epsilon v(t)) - V(u(t)+\epsilon v(t)))\, dt =0$ for $\epsilon =0$,  
where $T(u(t))$ is kinetic energy and $V(u(t))$ is potential energy of $u(t)$ at time $t$, and $v(t)$ is an arbitrary perturbation of $u(t)$, combined with an initial condition. In the basic case of a harmonic oscillator:
  • $T(u(t))=\frac{1}{2}\dot u^2(t)$ with $\dot u=\frac{du}{dt}$,
  • $V(u(t))=\frac{1}{2}u^2(t)$
  • stationarity is expressed as force balance according to Newton's 2nd law: $\ddot u (t) +u(t) = 0$.  
The Principle of Least Action is viewed as a constructive way of deriving the equations of motion expressing force balance according to Newton's 2nd law, in situations with specific choices of coordinates for which direct establishment of the equations is tricky. 
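For the harmonic oscillator this derivation can be checked symbolically; a minimal sketch using SymPy's euler_equations (an assumption here is that SymPy is available):

```python
import sympy as sp
from sympy.calculus.euler import euler_equations

t = sp.symbols('t')
u = sp.Function('u')(t)
# Lagrangian = kinetic energy - potential energy of the harmonic oscillator
L = sp.Rational(1, 2)*u.diff(t)**2 - sp.Rational(1, 2)*u**2
# Stationarity of the action yields the Euler-Lagrange equation,
# which here is Newton's 2nd law:  u'' + u = 0
eqs = euler_equations(L, [u], [t])
print(eqs)
```

The equation returned is satisfied by $u(t)=\sin t$, as expected for the harmonic oscillator.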

From its success in this respect, the Principle of Least Action has been elevated from mathematical trick to physical principle, asking Nature to arrange itself so as to keep the action stationary, as if Nature could compare the action integral for different trajectories and choose the trajectory with least action towards a teleological final cause, while in fact Nature can only respond to forces as expressed in equations of motion.

But if Nature does not have the capability of evaluating and comparing action integrals, it can be misleading to think this way. In the worst case it leads to invention of physics without real meaning, which is acknowledged by Lagrange in the Preface to Mecanique Analytique.

The ultimate example is the very foundation of quantum physics as the pillar of modern physics, based on a concept of elementary (smallest) quantum of action denoted by $h$ and named Planck's constant, with dimension $energy\times time$. Physicists are trained to view the elementary quantum of action as representing a "quantization" of reality, expressed as follows on Wikipedia:
  • In physics, a quantum (plural: quanta) is the minimum amount of any physical entity involved in an interaction. 
  • Behind this, one finds the fundamental notion that a physical property may be "quantized," referred to as "the hypothesis of quantization". This means that the magnitude can take on only certain discrete values.
  • A photon is a single quantum of light, and is referred to as a "light quantum".
In the quantum world light consists of a stream of discrete light quanta named photons. Although Einstein in his 1905 article on the photoelectric effect found it useful as a heuristic idea to speak about light quanta, he later changed his mind:
  • The quanta really are a hopeless mess. (to Pauli)
  • All these fifty years of conscious brooding have brought me no nearer to the answer to the question, 'What are light quanta?' Nowadays every Tom, Dick and Harry thinks he knows it, but he is mistaken. (1954)
But nobody listened to Einstein, and there we are today with an elementary quantum of action which is viewed as the basis of modern physics but has no physical reality. Schrödinger, supported by Einstein, said:
  • There are no particles or quanta. All is waves.
Connecting to the previous post, note that to compute a solution according to the Principle of Least Action, one typically uses an iterative method based on relaxation of the equations of motion, which has a physical meaning as response to imbalance of forces. This shows the strong connection between computational mathematics as iterative time-stepping and analog physics as motion in time subject to forces, which can be seen as a mindless evolution towards a hidden final cause, as if directed by an invisible hand of a mind understanding that final cause.
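Such a relaxation iteration can be sketched in a few lines; the following is a minimal NumPy example (the interval, endpoint values and step size are illustrative choices) where gradient descent on the discrete action of a harmonic oscillator converges to a trajectory satisfying Newton's 2nd law:

```python
import numpy as np

# Relaxation of the discrete action of a harmonic oscillator on [0, 1]
# with fixed endpoints u(0)=0, u(1)=1: each step responds to the local
# force imbalance (the gradient of the action), and the iteration
# converges to a trajectory satisfying Newton's 2nd law  u'' + u = 0.
N = 10
dt = 1.0 / N
t = np.linspace(0.0, 1.0, N + 1)
u = t.copy()                       # initial guess: straight line

for _ in range(5000):
    # gradient of  sum_i (0.5*((u[i+1]-u[i])/dt)**2 - 0.5*u[i]**2) * dt
    g = -(u[2:] - 2.0*u[1:-1] + u[:-2]) / dt - dt * u[1:-1]
    u[1:-1] -= 0.4 * dt * g        # relaxation step (endpoints stay fixed)

# the relaxed trajectory satisfies the discrete equation of motion
residual = (u[2:] - 2.0*u[1:-1] + u[:-2]) / dt**2 + u[1:-1]
print(np.abs(residual).max())                    # ~0: u'' + u = 0 holds
print(np.abs(u - np.sin(t)/np.sin(1.0)).max())   # small discretization error
```

The iteration never "compares action integrals"; it only responds to the local force imbalance, which is the point made above.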

Physics as Analog Computation instead of Physics as Observation

Bohr plotting the Copenhagen Interpretation of quantum mechanics together with Heisenberg and Pauli (left) and Bohr wondering what he did 30 years later (right).

To view physics as a form of analog computation, which can be simulated by digital computation, offers resolutions of the following main unsolved problems of modern microscopic and classical macroscopic physics:
  1. Interaction between subject (experimental apparatus) and object under observation.
  2. Meaning of smallest quantum of action named Planck's constant $h$.
  3. Contradiction between particle and wave qualities. Particle-wave duality.
  4. Meaning of the 2nd law of thermodynamics and direction of time.
  5. Meaning of Heisenberg's Uncertainty Principle.
  6. Loss of cause-effect relations by resort to microscopic statistics.
  7. Statistical interpretation of Schrödinger's multidimensional wave function.
  8. Meaning of Bohr's Complementarity Principle. 
  9. Meaning of Least Action Principle. 
This view is explored on The World as Computation and Computational Blackbody Radiation suggesting the following answers to these basic problems:
  1. Observation by digital simulation is possible without interference with physical object.
  2. Planck's constant $h$ can be viewed as a computational mesh size parameter.
  3. All is wave. There are no particles. No particle-wave duality.
  4. Dissipation as an effect of finite precision computation gives a 2nd law and direction of time.
  5. Uncertainty Principle as effect of finite precision computation.
  6. Statistics replaced by finite precision computation.
  7. Schrödinger's wave equation as system in 3d without statistical interpretation.
  8. No contradiction between complementary properties. No need of Complementarity Principle.
  9. Least Action Principle as computational mathematical principle without physical reality. 
The textbook physics harboring the unsolved problems is well summarized by Bohr:
  • There is no quantum world. There is only an abstract quantum physical description. It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature…
  • Everything we call real is made of things that cannot be regarded as real. If quantum mechanics hasn't profoundly shocked you, you haven't understood it yet.
  • We are all agreed that your theory is crazy. The question which divides us is whether it is crazy enough to have a chance of being correct. My own feeling is that it is not crazy enough. 
  • We must be clear that when it comes to atoms, language can be used only as in poetry. The poet, too, is not nearly so concerned with describing facts as with creating images and establishing mental connections.

Tuesday 18 March 2014

Blackbody as Linear High Gain Amplifier

                                      A blackbody acts as a high gain linear (black) amplifier.

The analysis on Computational Blackbody Radiation (with book) shows that a radiating body can be seen as a linear high gain amplifier with a high-frequency cut-off scaling with noise temperature, modeled by a wave equation with small damping, which after Fourier decomposition in space takes the form of a damped linear oscillator for each wave frequency $\nu$:
  • $\ddot u_\nu +\nu^2u_\nu - \gamma\dddot u_\nu = f_\nu$,
where $u_\nu(t)$ is oscillator amplitude and $f_\nu (t)$ signal amplitude of wave frequency $\nu$, with $t$ time and the dot indicating differentiation with respect to $t$, $\gamma$ is a small constant satisfying $\gamma\nu^2 << 1$, and the frequency is subject to a cut-off of the form $\nu < \frac{T_\nu}{h}$, where 
  • $T_\nu =\overline{\dot u_\nu^2}\equiv\int_I \dot u_\nu^2(t)\, dt$, 
is the (noise) temperature of frequency $\nu$, $I$ a unit time interval and $h$ a constant representing a level of finite precision.  

The analysis shows, under an assumption of near resonance, the following basic relation in stationary state:
  •   $\gamma\overline{\ddot u_\nu^2} \approx \overline{f_\nu^2}$,
as a consequence of small damping guiding $u_\nu (t)$ so that  $\dot u_\nu(t)$ is out of phase with $f_\nu(t)$ and thus "pumps" the system little. The result is that the signal $f_\nu (t)$ is balanced to major part by the oscillator
  • $\ddot u_\nu +\nu^2u_\nu$, 
 and to minor part by the damping
  • $ - \gamma\dddot u_\nu$,
because 
  • $\gamma^2\overline{\dddot u_\nu^2} \approx \gamma\nu^2 \gamma\overline{\ddot u_\nu^2}\approx\gamma\nu^2\overline{f_\nu^2} <<\overline{f_\nu^2}$. 
This means that the blackbody can be viewed as an amplifier radiating the signal $f_\nu$ under the small input $-\gamma \dddot u_\nu$, thus with high gain. The high-frequency cut-off then gives a requirement on the temperature $T_\nu$, referred to as noise temperature, to achieve high gain.   

Quantum Mechanics from Blackbody Radiation as "Act of Despair"


Max Planck: The whole procedure was an act of despair because a theoretical interpretation (of black-body radiation) had to be found at any price, no matter how high that might be… I was ready to sacrifice any of my previous convictions about physics… For this reason, on the very first day when I formulated this law, I began to devote myself to the task of investing it with true physical meaning.

The textbook history of modern physics tells us that quantum mechanics was born from Planck's proof of the universal law of blackbody radiation, based on statistics of discrete lumps of energy, or energy quanta $h\nu$, where $h$ is Planck's constant and $\nu$ frequency. The textbook definition of a blackbody is a body which absorbs all, reflects none and re-emits all of incident radiation:
  • A black body is an idealized physical body that absorbs all incident electromagnetic radiation, regardless of frequency or angle of incidence. (Wikipedia)
  • "Blackbody radiation" or "cavity radiation" refers to an object or system which absorbs all radiation incident upon it and re-radiates energy which is characteristic of this radiating system only, not dependent upon the type of radiation which is incident upon it. (Hyperphysics)
  • Theoretical surface that absorbs all radiant energy that falls on it, and radiates electromagnetic energy at all frequencies, from radio waves to gamma rays, with an intensity distribution dependent on its temperature. (Merriam-Webster)
  • An ideal object that is a perfect absorber of light (hence the name since it would appear completely black if it were cold), and also a perfect emitter of light. (Astro Virginia)
  • A black body is a theoretical object that absorbs 100% of the radiation that hits it. Therefore it reflects no radiation and appears perfectly black. (Egglescliff)
  • A hypothetic body that completely absorbs all wavelengths of thermal radiation incident on it. (Eric Weisstein's World of Physics)
But there is something more to a blackbody, and that is the high-frequency cut-off, expressed in Wien's displacement law, of the principal form 
  • $\nu < \frac{T}{\hat h}$,
where $\nu$ is frequency, $T$ temperature and $\hat h$ a Planck constant, stating that only frequencies below the cut-off $\frac{T}{\hat h}$ are re-emitted. Absorbed frequencies above the cut-off will instead be stored as internal energy in the body under increasing temperature.

Bodies which absorb all incident radiation but are made of different materials will have different high-frequency cut-offs, and an (ideal) blackbody should then be characterized as having maximal cut-off, that is, smallest Planck constant $\hat h$, with the maximum taken over all real bodies. 

A cavity with graphite walls is used as a reference blackbody defined by the following properties: 
  1. absorption of all incident radiation
  2. maximal cut-off - smallest Planck constant $\hat h\approx 4.8\times 10^{-11}\, Ks$,     
and $\hat h =\frac{h}{k}$ is Planck's constant $h$ scaled by Boltzmann's constant $k$. 
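As a quick numerical check (constants as quoted in these posts), the scaled Planck constant and the resulting cut-off frequencies at a few illustrative temperatures:

```python
h = 6.626e-34          # Planck's constant, Js
k = 1.3806e-23         # Boltzmann's constant, J/K
h_hat = h / k          # scaled Planck constant, Ks
print(h_hat)           # ~4.8e-11 Ks, as quoted above
for T in (300.0, 1000.0, 5778.0):   # room temperature, glowing body, Sun
    print(T, T / h_hat)             # cut-off frequency T/h_hat in Hz
```

At room temperature the cut-off lies in the infrared, which is why bodies around us do not glow visibly.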

Planck viewed the high-frequency cut-off defined by the Planck constant $\hat h$ as inexplicable within Maxwell's classical electromagnetic wave theory. In an "act of despair" to save physics from collapse in an "ultraviolet catastrophe", a role which Planck had taken on, Planck resorted to statistics of discrete energy quanta $h\nu$, which in the 1920s resurfaced as a basic element of quantum mechanics. 

But a high-frequency cut-off in wave mechanics is not inexplicable; it is a well-known phenomenon in all forms of waves, including elastic, acoustic and electromagnetic waves, and can be modeled as a dissipative loss effect where high-frequency wave motion is broken down into chaotic motion stored as internal heat energy. For details, see Computational Blackbody Radiation.

It is a mystery why this was not understood by Planck. Science created in an "act of despair" runs the risk of being irrational and flat wrong, and that is, if anything, the trademark of quantum mechanics based on discrete quanta. 

Quantum mechanics as deterministic wave mechanics may be rational and understandable. Quantum mechanics as statistics of quanta is irrational and confusing. All the troubles and mysteries of quantum mechanics emanate from the idea of discrete quanta. Schrödinger had the solution:
  • I insist upon the view that all is waves.
  • If all this damned quantum jumping were really here to stay, I should be sorry I ever got involved with quantum theory.
But Schrödinger was overpowered by Bohr and Heisenberg, who have twisted the brains of modern physicists with devastating consequences... 

Monday 17 March 2014

Unphysical Combination of Complementary Experiments


Let us take a look at how Bohr in his famous 1927 Como Lecture describes complementarity as a fundamental aspect of his Copenhagen Interpretation, still dominating textbook presentations of quantum mechanics:
  • The quantum theory is characterised by the acknowledgment of a fundamental limitation in the classical physical ideas when applied to atomic phenomena. The situation thus created is of a peculiar nature, since our interpretation of the experimental material rests essentially upon the classical concepts.
  • Notwithstanding the difficulties which hence are involved in the formulation of the quantum theory, it seems, as we shall see, that its essence may be expressed in the so-called quantum postulate, which attributes to any atomic process an essential discontinuity, or rather individuality, completely foreign to the classical theories and symbolised by Planck's quantum of action.
OK, we learn that quantum theory is based on a quantum postulate about an essential discontinuity, symbolised by Planck's constant $h=6.626\times 10^{-34}\, Js$ as a quantum of action. Next we read about the necessary interaction between the phenomena under observation and the observer:   
  • Now the quantum postulate implies that any observation of atomic phenomena will involve an interaction with the agency of observation not to be neglected. 
  • Accordingly, an independent reality in the ordinary physical sense can neither be ascribed to the phenomena nor to the agencies of observation.
  • The circumstance, however, that in interpreting observations use has always to be made of theoretical notions, entails that for every particular case it is a question of convenience at what point the concept of observation involving the quantum postulate with its inherent 'irrationality' is brought in.
Next, Bohr emphasizes the contrast between the quantum of action and classical concepts:
  • The fundamental contrast between the quantum of action and the classical concepts is immediately apparent from the simple formulas which form the common foundation of the theory of light quanta and of the wave theory of material particles. If Planck's constant be denoted by $h$, as is well known: $E\tau = I \lambda = h$, where $E$ and $I$ are energy and momentum respectively, $\tau$ and $\lambda$ the corresponding period of vibration and wave-length. 
  • In these formulae the two notions of light and also of matter enter in sharp contrast. 
  • While energy and momentum are associated with the concept of particles, and hence may be characterised according to the classical point of view by definite space-time co-ordinates, the period of vibration and wave-length refer to a plane harmonic wave train of unlimited extent in space and time.
  • Just this situation brings out most strikingly the complementary character of the description of atomic phenomena which appears as an inevitable consequence of the contrast between the quantum postulate and the distinction between object and agency of measurement, inherent in our very idea of observation.
Bohr clearly brings out the unphysical aspects of the basic action formula
  • $E\tau = I \lambda = h$,  
where energy $E$ and momentum $I$ related to particle are combined with period $\tau$ and wave-length $\lambda$ related to wave.
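Bohr's action formula is easy to check numerically for a photon, where $E=h\nu$ and momentum $I=E/c$ (particle concepts) combine with period $\tau =1/\nu$ and wavelength $\lambda =c/\nu$ (wave concepts); the frequency below is an arbitrarily chosen visible-light value:

```python
h = 6.626e-34          # Planck's constant, Js
c = 3.0e8              # speed of light, m/s
nu = 5.0e14            # an arbitrarily chosen visible-light frequency, Hz

E = h * nu             # photon energy   (particle concept)
I = E / c              # photon momentum (particle concept)
tau = 1.0 / nu         # period          (wave concept)
lam = c / nu           # wavelength      (wave concept)
print(E * tau, I * lam)   # both equal h:  E*tau = I*lam = h
```

The identity holds for any $\nu$, which is exactly why the formula mixes particle and wave quantities in the "sharp contrast" Bohr points to.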

Bohr then seeks to resolve the contradiction by naming it complementarity as an effect of interaction between instrument and object:
  • Consequently, evidence obtained under different experimental conditions cannot be comprehended within a single picture, but must be regarded as complementary in the sense that only the totality of the phenomena exhausts the possible information about the objects. 
  • In quantum mechanics, however, evidence about atomic objects obtained by different experimental arrangements exhibits a novel kind of complementary relationship. 
  • … the notion of complementarity simply characterizes the answers we can receive by such inquiry, whenever the interaction between the measuring instruments and the objects form an integral part of the phenomena. 
Bohr's complementarity principle has been questioned by many over the years:
  • Bohr’s interpretation of quantum mechanics has been criticized as incoherent and opportunistic, and based on doubtful philosophical premises. (Simon Saunders)
  • Despite the expenditure of much effort, I have been unable to obtain a clear understanding of Bohr’s principle of complementarity (Einstein).
Of course an object may have complementary qualities, such as e.g. color and weight, which can be measured in different experiments, but it is meaningless to form a new concept as color times weight, or colorweight, and then desperately seek to give it a meaning. 

In the New View presented on Computational Blackbody Radiation, the concept of action as e.g. position times velocity has a meaning in a threshold condition for dissipation, but is not a measure of a quantity which is carried by a physical object, such as mass or energy.

The ruling Copenhagen Interpretation was developed by Bohr, contributing a complementarity principle, and Heisenberg, contributing a related uncertainty principle based on position times momentum (or velocity) as Bohr's unphysical complementary combination. The uncertainty principle is often expressed as a lower bound on the product of weighted norms of a function and its Fourier transform, and is then interpreted as a combat between localization in space and in frequency, or between particle and wave. In this form of the uncertainty principle, the unphysical aspect of a product of position and frequency is hidden by mathematics.

The Copenhagen Interpretation was completed by Born's suggestion to view (the square of the modulus of) Schrödinger's wave function as a probability distribution for particle configuration, which in the absence of something better became the accepted way to handle the apparent wave-particle contradiction, by viewing it as a combination of probability wave with particle distribution.     


New Uncertainty Principle as Wien's Displacement Law


The recent series of posts based on Computational Blackbody Radiation suggests that Heisenberg's Uncertainty Principle can be understood as a consequence of Wien's Displacement Law, expressing a high-frequency cut-off in blackbody radiation scaling with temperature according to Planck's radiation law:
  • $B_\nu (T)=\gamma\nu^2T\times \theta(\nu ,T)$,
where $B_\nu (T)$ is radiated energy per unit frequency, surface area, viewing angle and second, $\gamma =\frac{2k}{c^2}$, where $k = 1.3806488\times 10^{-23}\, J/K$ is Boltzmann's constant and $c$ the speed of light in $m/s$, and $T$ is temperature in Kelvin $K$,
  • $\theta (\nu ,T)=\frac{\alpha}{e^\alpha -1}$, 
  • $\alpha=\frac{h\nu}{kT}$,
where $\theta (\nu ,T)\approx 1$ for $\alpha < 1$ and $\theta (\nu ,T)\approx 0$ for $\alpha > 10$ as high-frequency cut-off, with $h=6.626\times 10^{-34}\, Js$ Planck's constant. More precisely, maximal radiance for a given temperature $T$ occurs for $\alpha \approx 2.821$ with corresponding frequency
  • $\nu_{max} = 2.821\frac{T}{\hat h}$ where $\hat h=\frac{h}{k}=4.8\times 10^{-11}\, Ks$, 
with a rapid drop for $\nu >\nu_{max}$.
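As a numerical sanity check (not part of the original derivation), the constants quoted above can be combined in a few lines of Python. The scan below recovers $\alpha\approx 2.821$ by maximizing $\nu^2\theta(\nu ,T)$ at fixed $T$; the temperature $T = 1500\, K$ is an illustrative choice:

```python
import math

k = 1.3806488e-23   # Boltzmann's constant in J/K
h = 6.626e-34       # Planck's constant in Js
h_hat = h / k       # scaled Planck's constant h/k in Ks, ~4.8e-11

def theta(alpha):
    """Cut-off factor alpha/(e^alpha - 1): ~1 for alpha < 1, ~0 for alpha > 10."""
    return alpha / math.expm1(alpha)

# At fixed T, B_nu ~ nu^2 * theta with alpha = h*nu/(k*T) proportional to nu,
# so radiance is maximal where alpha^2 * theta(alpha) is maximal; scan for it:
alpha_max = max((i * 1e-3 for i in range(1, 20001)),
                key=lambda a: a**2 * theta(a))

T = 1500.0                      # illustrative temperature in K
nu_max = alpha_max * T / h_hat  # frequency of maximal radiance

print(f"h_hat     = {h_hat:.3e} Ks")
print(f"alpha_max = {alpha_max:.3f}")   # ~2.821
print(f"nu_max    = {nu_max:.3e} Hz")
```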

The proof of Planck's Law in Computational Blackbody Radiation explains the high-frequency cut-off as a consequence of finite precision computation introducing a dissipative effect damping high frequencies.

A connection to Heisenberg's Uncertainty Principle can be made by noting that a high-frequency cut-off condition of the form
  • $\nu < \frac{T}{\hat h}$,
can be rephrased in the following form connecting to Heisenberg's Uncertainty Principle:
  • $u_\nu\dot u_\nu > \hat h$                  (New Uncertainty Principle)
where $u_\nu$ is position amplitude, $\dot u_\nu =\nu u_\nu$ is velocity amplitude of a wave of frequency $\nu$ with $\dot u_\nu^2 =T$.
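The rephrasing is a one-line computation from the definitions just given, $\dot u_\nu = \nu u_\nu$ and $\dot u_\nu^2 = T$:

```latex
% Using \dot u_\nu = \nu u_\nu and \dot u_\nu^2 = T:
\nu < \frac{T}{\hat h}
\;\Longleftrightarrow\;
\hat h\,\nu < \dot u_\nu^2
\;\Longleftrightarrow\;
\hat h < \frac{\dot u_\nu}{\nu}\,\dot u_\nu = u_\nu\,\dot u_\nu .
```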

The New Uncertainty Principle expresses that observation/detection of a wave, that is, of its amplitude $u$ and frequency $\nu =\frac{\dot u}{u}$, requires
  • $u\dot u>\hat h$.
The New Uncertainty Principle thus concerns observation/detection of amplitude and frequency as physical aspects of wave motion, and not, as Heisenberg's Uncertainty Principle, particle position and wave frequency as unphysical complementary aspects.

söndag 16 mars 2014

Uncertainty Principle, Whispering and Looking at a Faint Star


The recent series of posts on Heisenberg's Uncertainty Principle based on Computational Blackbody Radiation suggests the following alternative equivalent formulations of the principle:
  1. $\nu < \frac{T}{\hat h}$,
  2. $u_\nu\dot u_\nu > \hat h$,
where $u_\nu$ is position amplitude, $\dot u_\nu =\nu u_\nu$ is velocity amplitude of a wave of frequency $\nu$ with $\dot u_\nu^2 =T$, and $\hat h =4.8\times 10^{-11}Ks$ is Planck's constant scaled with Boltzmann's constant.

Here, 1 represents Wien's displacement law stating that the radiation from a body is subject to a frequency limit scaling with temperature $T$ with the factor $\frac{1}{\hat h}$.

2 is superficially similar to Heisenberg's Uncertainty Principle as an expression of the following physics: in order to detect a wave of amplitude $u$, it is necessary that the frequency $\nu$ of the wave satisfies $\nu u^2>\hat h$. In particular, if the amplitude $u$ is small, then the frequency $\nu$ must be large.

This connects to (i) communication by whispering and (ii) viewing a distant star, both being based on the possibility of detecting small amplitude high-frequency waves.
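Under this reading, the smallest detectable frequency for a given amplitude is $\nu_{min}=\hat h/u^2$. A minimal Python sketch of the trend, with purely hypothetical amplitude values chosen only for illustration:

```python
# New Uncertainty Principle nu * u^2 > h_hat, solved for the smallest
# detectable frequency at amplitude u; the amplitudes below are hypothetical.
h_hat = 4.8e-11  # scaled Planck's constant h/k in Ks

def nu_min(u):
    """Smallest frequency at which a wave of amplitude u can be detected."""
    return h_hat / u**2

for u in (1e-2, 1e-4, 1e-6):
    print(f"u = {u:.0e}  ->  nu_min = {nu_min(u):.2e}")
# Halving the amplitude quadruples the required frequency: faint signals
# (a whisper, light from a distant star) must be high-frequency to be detected.
```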

The standard presentation of Heisenberg's Uncertainty Principle is loaded with contradictions:
  • The uncertainty principle is certainly one of the most famous and important aspects of quantum mechanics. 
  • But what is the exact meaning of this principle, and indeed, is it really a principle of quantum mechanics? And, in particular, what does it mean to say that a quantity is determined only up to some uncertainty? 
  • So the question may be asked what alternative views of the uncertainty relations are still viable.
  • Of course, this problem is intimately connected with that of the interpretation of the wave function, and hence of quantum mechanics as a whole. 
  • Since there is no consensus about the latter, one cannot expect consensus about the interpretation of the uncertainty relations either. 
In other words, today there is no consensus on the meaning of Heisenberg's Uncertainty Principle. The reason may be that it has no meaning, while a meaningful alternative exists.

Notice in particular that the product of two complementary or conjugate variables such as position and momentum is questionable if viewed as representing a physical quantity, while as a threshold it can make sense.

fredag 14 mars 2014

DN Debatt: The Principle of Public Access Is Eroding through Prostration Paragraphs

Nils Funcke observes on DN Debatt, under the headline The principle of public access is about to erode:
  • The Swedish principle of public access is slowly but surely being worn down.
  • ..outright prostration paragraphs (plattläggningsparagrafer) are accepted…
  • At the EU accession in 1995 Sweden issued a declaration: The principle of public access, in particular the right of access to official documents, and the constitutional protection of the freedom to communicate information, are and remain fundamental principles forming part of Sweden's constitutional, political and cultural heritage.
An example of such a prostration paragraph is the new precedent of the Supreme Administrative Court (Högsta Förvaltningsdomstolen):
  • For a document to be finalized, and thereby drawn up, and thereby an official document, some measure must be taken showing that the document is finalized.
With this new rule of law the court lays the citizen flat on the ground before the authority, which can now itself decide whether and when the measure that according to the authority is required for finalization has, or has not, been taken by the authority.

torsdag 13 mars 2014

Against Measurement Against Copenhagen: For Rationality and Reality by Computation


John Bell's Against Measurement is a direct attack on the heart of quantum mechanics as expressed in the Copenhagen Interpretation according to Bohr:
  • It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature…
Bell poses the following questions:
  • What exactly qualifies some physical systems to play the role of "measurer"?
  • Was the wavefunction of the world waiting to jump for thousands of millions of years until a single-celled living creature appeared?
  • Or did it have to wait a little longer, for some better qualified system…with a Ph D? 
Physicists of today have no answers, with far-reaching consequences for all of science: if there is no rationality and reality in physics, the most rational and real of all sciences, then there can be no rationality and reality anywhere…If real physics is not about what is, then real physics is irrational and unreal…and then…any bubble can inflate to any size...

The story is well described by 1969 Nobel Laureate Murray Gell-Mann:
  • Niels Bohr brainwashed a whole generation of theorists into thinking that the job of interpreting quantum theory was done 50 years ago.
But there is hope today, in digital simulation which offers observation without interference. Solving Schrödinger's equation by computation gives information about physical states without touching the physics. It opens a road to bring physics back to the rationality of 19th century physics in the quantum nano-world of today…without quantum computing...  

Increasing Uncertainty about Heisenberg's Uncertainty Principle + Resolution

                My mind was formed by studying philosophy, Plato and that sort of thing….The reality we can put into words is never reality itself…The atoms or elementary particles themselves are not real; they form a world of potentialities or possibilities rather than one of things or facts...If we omitted all that is unclear, we would probably be left completely uninteresting and trivial tautologies...

The 2012 article Violation of Heisenberg’s Measurement-Disturbance Relationship by Weak Measurements by Lee A. Rozema et al. informs us:
  • The Heisenberg Uncertainty Principle is one of the cornerstones of quantum mechanics. 
  • In his original paper on the subject, Heisenberg wrote “At the instant of time when the position is determined, that is, at the instant when the photon is scattered by the electron, the electron undergoes a discontinuous change in momentum. This change is the greater the smaller the wavelength of the light employed, i.e., the more exact the determination of the position”. 
  • The modern version of the uncertainty principle proved in our textbooks today, however, deals not with the precision of a measurement and the disturbance it introduces, but with the intrinsic uncertainty any quantum state must possess, regardless of what measurement (if any) is performed. 
  • It has been shown that the original formulation is in fact mathematically incorrect.
OK, so we learn that Heisenberg's Uncertainty Principle (presumably in its original formulation) is a cornerstone of quantum physics which is, however, mathematically incorrect, and that there is a modern version concerned not with measurement but with an intrinsic uncertainty of a quantum state regardless of measurement. In other words, a cornerstone of quantum mechanics has been moved.

  • The uncertainty principle (UP) occupies a peculiar position in physics. On the one hand, it is often regarded as the hallmark of quantum mechanics. 
  • On the other hand, there is still a great deal of discussion about what it actually says. 
  • A physicist will have much more difficulty in giving a precise formulation than in stating e.g. the principle of relativity (which is itself not easy). 
  • Moreover, the formulations given by various physicists differ greatly, not only in their wording but also in their meaning.
We learn that the uncertainty of the uncertainty principle has been steadily increasing ever since it was formulated by Heisenberg in 1927. 

In a recent series of posts based on Computational Blackbody Radiation I have suggested a new approach to the uncertainty principle as a high-frequency cut-off condition of the form
  • $\nu < \frac{T}{\hat h}$,  
where $\nu$ is frequency, $T$ temperature in Kelvin $K$ and $\hat h=4.8\times 10^{-11}\, Ks$ is a scaled Planck's constant. The significance of the cut-off is that a body of temperature $T\, K$ cannot emit frequencies larger than $\frac{T}{\hat h}$, because the wave synchronization required for emission is destroyed by internal friction damping these frequencies. The cut-off condition thus expresses Wien's displacement law.
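The cut-off condition is easy to evaluate numerically; the temperatures in the following Python sketch are illustrative choices, not values from the text:

```python
h_hat = 4.8e-11  # scaled Planck's constant h/k in Ks

def cutoff_frequency(T):
    """Highest frequency (Hz) emittable by a body of temperature T (K): nu = T/h_hat."""
    return T / h_hat

# Illustrative temperatures: room temperature, a glowing body, the solar surface.
for T in (300.0, 1500.0, 5778.0):
    print(f"T = {T:6.0f} K  ->  cut-off nu = {cutoff_frequency(T):.2e} Hz")
```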

The cut-off condition can alternatively be expressed as 
  • $u_\nu\dot u_\nu > \hat h$ 
where $u_\nu$ is position amplitude and $\dot u_\nu =\frac{du_\nu}{dt}$ velocity amplitude of a wave of frequency $\nu$ with $\dot u_\nu^2 =T$ and $\dot u_\nu =\nu u_\nu$. We see that the cut-off condition superficially has a form similar to Heisenberg's uncertainty principle, but that the meaning is entirely different and in fact familiar as Wien's displacement law.

We thus find that Heisenberg's uncertainty principle can be replaced by Wien's displacement law, which can be seen as an effect of internal friction preventing synchronization and thus emission of frequencies  $\nu > \frac{T}{\hat h}$.

The high-frequency cut-off condition with its dependence on temperature is similar to high-frequency damping in a loudspeaker, which may depend on the sound level.

onsdag 12 mars 2014

Blackbody Radiation as Collective Vibration Synchronized by Resonance



There are two descriptions of the basic phenomenon of radiation from a heated body (blackbody or greybody radiation): one starting from light as a stream of light particles named photons, and one from light as electromagnetic waves.

That the particle description of light is both primitive and unphysical was well understood before Einstein in 1905 suggested an explanation of the photoelectric effect based on light as a stream of particles later named photons, stimulated by Planck's derivation of Planck's law in 1900 based on radiation emitted in discrete quanta. However, with the development of quantum mechanics as a description of atomistic physics in the 1920s, the primitive and unphysical idea of light as a stream of particles was turned into a trademark of modern physics of highest insight.

The standpoint today is that light is both particle and wave, and the physicist is free to choose the description which best serves a given problem. In particular, the particle description is supposed to serve well to explain the physics of both blackbody radiation and photoelectricity. But since the particle description is primitive and unphysical, there must be something fishy about the idea that emission of radiation from a heated body results from emission of individual photons from individual atoms together forming a stream of photons leaving the body. We will return to the primitivism of this view after a study of the more educated idea of light as an (electromagnetic) wave phenomenon.

This more educated view is presented on Computational Blackbody Radiation with the following basic message:
  1. Radiation is a collective phenomenon generated from in-phase oscillations of atoms in a structured web of atoms synchronized by resonance.
  2. A radiating web of atoms acts like a system of tuning forks which tend to vibrate in phase as a result of resonance by acoustic waves, or like a swarm of cicadas singing in phase. 
  3. A radiating body has a high-frequency cut-off scaling with temperature, allowing emission only for $\nu < \frac{T}{\hat h}$ with $\hat h = 4.8 \times 10^{-11}\, Ks$, where $\nu$ is frequency and $T$ temperature in Kelvin $K$. This translates to a smallest wave-length $\lambda = \hat h\frac{c}{T}\, m$ as smallest correlation length for synchronization, where $c\, m/s$ is the speed of light. For $T = 1500\, K$ we get $\lambda \approx 10^{-5}\, m$, which is about 20 times the wave length of visible light.   
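The numbers in point 3 can be checked directly; a short Python sketch using the constants above, with the typical visible wave-length an assumed value:

```python
c = 2.998e8      # speed of light in m/s
h_hat = 4.8e-11  # scaled Planck's constant h/k in Ks

def cutoff_wavelength(T):
    """Smallest emitted wave-length lambda = h_hat*c/T in m, for temperature T in K."""
    return h_hat * c / T

lam = cutoff_wavelength(1500.0)
visible = 5e-7   # typical wave-length of visible light in m (assumed value)
print(f"lambda = {lam:.1e} m")          # ~1e-5 m
print(f"ratio  = {lam / visible:.1f}")  # roughly 20 x visible light
```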
We can now understand that the particle view is primitive because it is unable to explain that the outgoing radiation consists of electromagnetic waves which are in phase. If single atoms were emitting single photons, there would be no mechanism ensuring that the corresponding particles/waves are in phase, and so a most essential element would be missing.

The analysis of Computational Blackbody Radiation shows that an ideal blackbody is characterized as a body which (i) is non-reflecting and (ii) has a maximal high-frequency cut-off. It is observed that the emission from a hole in a cavity with graphite walls is a realization of a blackbody. This fact can be understood as an effect of the regular surface structure of graphite supporting collective atom oscillations synchronized by resonance on an atomic surface web of smallest mesh size $\sim 10^{-9}\, m$.