tisdag 25 mars 2014

Quantum Physics as Digital Continuum Physics

Quantum mechanics was born in 1900 with Planck's theoretical derivation of a modification of the Rayleigh-Jeans law of blackbody radiation, based on statistics of discrete "quanta of energy" of size $h\nu$, where $\nu$ is frequency and $h = 6.626\times 10^{-34}\, Js$ is Planck's constant.
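The contrast between the classical and the quantized formula can be made concrete with the standard textbook expressions (a sketch of my own, not from the source): the Rayleigh-Jeans spectral radiance $2\nu^2 kT/c^2$ grows without bound with frequency, while Planck's $\frac{2h\nu^3}{c^2}\frac{1}{e^{h\nu/kT}-1}$ is cut off exponentially, which is the "ultraviolet catastrophe" and its resolution in numbers:

```python
import math

h = 6.626e-34   # Planck's constant, J s
k = 1.381e-23   # Boltzmann's constant, J/K
c = 2.998e8     # speed of light, m/s

def rayleigh_jeans(nu, T):
    # Classical spectral radiance: grows like nu^2 without bound
    return 2.0 * nu**2 * k * T / c**2

def planck(nu, T):
    # Planck's spectral radiance: exponential cutoff at high frequency
    return (2.0 * h * nu**3 / c**2) / math.expm1(h * nu / (k * T))

T = 5800.0  # roughly the Sun's surface temperature, K
for nu in (1e13, 1e14, 1e15, 1e16):
    print(f"nu = {nu:.0e} Hz: RJ = {rayleigh_jeans(nu, T):.3e}, "
          f"Planck = {planck(nu, T):.3e}")
```

At low frequency the two formulas agree closely; at $\nu = 10^{16}$ Hz the classical value keeps growing while Planck's has collapsed toward zero.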

This was the result of a long fruitless struggle to explain the observed spectrum of radiating bodies using deterministic electromagnetic wave theory. It ended in Planck's complete surrender to statistics as the only way he could see to avoid the "ultraviolet catastrophe" of infinite radiation energies, in a return to the safe haven of his dissertation work in 1889-90 based on Boltzmann's statistical theory of heat.

Planck described the critical step in his analysis of a radiating blackbody as a discrete collection of resonators as follows:
  • We must now give the distribution of the energy over the separate resonators of each frequency, first of all the distribution of the energy $E$ over the $N$ resonators of frequency $\nu$. If $E$ is considered to be a continuously divisible quantity, this distribution is possible in infinitely many ways.
  • We consider, however (this is the most essential point of the whole calculation), $E$ to be composed of a well-defined number of equal parts and use thereto the constant of nature $h = 6.55\times 10^{-27}\, erg\, sec$. This constant multiplied by the common frequency $\nu$ of the resonators gives us the energy element in $erg$, and dividing $E$ by $h\nu$ we get the number $P$ of energy elements which must be divided over the $N$ resonators.
  • If the ratio thus calculated is not an integer, we take for $P$ an integer in the neighbourhood. It is clear that the distribution of $P$ energy elements over $N$ resonators can only take place in a finite, well-defined number of ways.
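Planck's "finite, well-defined number of ways" is the stars-and-bars count $\binom{N+P-1}{P}$ of distributions of $P$ indistinguishable energy elements over $N$ resonators. A minimal illustration (the numbers below are chosen for illustration, they are not Planck's):

```python
from math import comb

def planck_complexions(N, P):
    # Number of ways to distribute P indistinguishable energy elements
    # over N resonators (stars and bars): C(N + P - 1, P)
    return comb(N + P - 1, P)

print(planck_complexions(3, 4))   # 4 quanta over 3 resonators: 15 ways
```

The count is finite precisely because $E$ is chopped into whole energy elements; for continuously divisible $E$ the number of distributions is infinite, which is why the quantization step mattered to the calculation.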
We here see Planck introducing a constant of nature $h$, later referred to as Planck's constant, with a corresponding smallest quantum of energy $h\nu$ for radiation (light) of frequency $\nu$.

Then Einstein entered in 1905 with a law of photoelectricity in which $h\nu$ was viewed as the energy of a light quantum of frequency $\nu$, later named the photon and crowned as an elementary particle.

Finally, in 1926 Schrödinger formulated a wave equation involving a formal momentum operator $-i\hbar\nabla$ (with $\hbar = h/2\pi$) including Planck's constant $h$, marking the birth of quantum mechanics as the incarnation of modern physics, based on postulating that microscopic physics is
  1. "quantized" with smallest quanta of energy $h\nu$,
  2. indeterministic with discrete quantum jumps obeying laws of statistics.
However, microscopics based on statistics is contradictory, since it requires microscopics of microscopics in an endless regression. This has led modern physics into an impasse of ever increasing irrationality, into many-worlds and string theory as expressions of scientific regression to microscopics of microscopics. The idea of "quantization" of the microscopic world goes back to the atomism of Democritus, a primitive scientific idea rejected already by Aristotle arguing for the continuum, which however, combined with modern statistics, has ruined physics.

But there is another way of avoiding the ultraviolet catastrophe without statistics, which is presented on Computational Blackbody Radiation, with physics viewed as analog finite precision computation which can be modeled as digital computational simulation.

This is physics governed by deterministic wave equations with solutions evolving in analog computational processes, which can be simulated digitally. This is physics without microscopic games of roulette: rational, deterministic classical physics subject only to the natural limitations of finite precision computation.
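As an illustration of such deterministic digital simulation (a sketch under assumptions of my own, not the specific model of Computational Blackbody Radiation): a 1D wave equation $u_{tt} = u_{xx}$ on a periodic domain, stepped by a standard leapfrog scheme, where the only deviation from the exact analog solution comes from discretization and finite-precision rounding, not from any statistics:

```python
import math

# Deterministic simulation of u_tt = u_xx on a periodic grid [0, 2*pi)
# with exact traveling-wave solution u(x, t) = sin(x - t).
N = 200                  # grid points
dx = 2 * math.pi / N
dt = 0.5 * dx            # CFL-stable time step
r2 = (dt / dx) ** 2

u_prev = [math.sin(i * dx + dt) for i in range(N)]  # u(x, -dt)
u = [math.sin(i * dx) for i in range(N)]            # u(x, 0)

steps = 400
for _ in range(steps):
    # Leapfrog: u_next = 2u - u_prev + r^2 * discrete Laplacian
    u_next = [2 * u[i] - u_prev[i]
              + r2 * (u[(i + 1) % N] - 2 * u[i] + u[(i - 1) % N])
              for i in range(N)]
    u_prev, u = u, u_next

t = steps * dt
err = max(abs(u[i] - math.sin(i * dx - t)) for i in range(N))
print(f"max error after {steps} steps: {err:.2e}")
```

Running the same program twice gives bit-identical output: the evolution is fully deterministic, and the small error against the exact solution reflects finite resolution rather than randomness.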

This opens to a view of quantum physics as digital continuum physics, which can bring rationality back to physics. It opens to exploring the analog physical atomistic world as a digitally simulated world, where the digital simulation reconnects to analog microelectronics. It opens to exploring physics by exploring the digital model, readily available for inspection and analysis, in contrast to analog physics hidden from inspection.

The microprocessor world is "quantized" into discrete processing units, but it is a deterministic world with digital output.

1 comment:

  1. "since it requires microscopics of microscopics in an endless regression"

    I don't really see what this has to do with quantum mechanics. On a microscopic scale, a quantum mechanical state evolves deterministically. There are theories of a mechanism that generates the probabilistic behavior (decoherence).

    I get the feeling that you have misinterpreted the basic theory here.