
Thursday, November 30, 2023

Uncertainty Principle vs Real Quantum Mechanics

Heisenberg's Uncertainty Principle (UP) is a cornerstone of quantum mechanics, stating that there is a limit to the precision with which both the position and the velocity of a particle, like an electron, can be determined. The standard hand-waving argument is that precise measurement of position changes velocity and vice versa, so that measurement of both to arbitrary precision is impossible. This makes quantum mechanics fundamentally different from classical mechanics, where in principle there is no such limit to measurement precision.

The other cornerstone is Schrödinger's wave function, which does not contain UP even though it is supposed to tell everything there is to say about the system. UP is thus an add-on to standard Quantum Mechanics (stdQM) based on Schrödinger's equation, and is then connected to measurement.

Is there any UP in RealQM? We recall that RealQM describes an atomic system as a collection of non-overlapping extended electronic charge densities which do not have particle character. For an extended body (on macroscale or microscale) there is no unique point position and velocity describing the state of the body, and so there is a certain fuzziness or uncertainty, depending on size, if only one point, such as the center of gravity, is to be used. RealQM thus, just like classical continuum mechanics for extended elastic bodies, comes with a form of UP depending on size, but this is perfectly normal and no mystery.

In RealQM there is no built-in limit to possible measurement precision. The charge densities of an atom in ground state are stationary in space and shift around under radiation, and so come with the usual fuzziness of extended bodies. On microscale this fuzziness can of course be very real, making it very difficult to determine the electron charge density in an atom point-wise, but there is no mystery like that of Heisenberg.

Monday, May 12, 2014

Why Feynman Said: Nobody Understands Quantum Mechanics


We have always had a great deal of difficulty understanding the world view that quantum mechanics represents. At least I do, because I'm an old enough man that I haven't got to the point that this stuff is obvious to me. Okay, I still get nervous with it.... I cannot define the real problem, therefore I suspect there's no real problem, but I'm not sure there's no real problem.

The (trivial) commutator relation  
  • $xp - px = ih$, 
where $x$ is the position (operator) and $p=\frac{h}{i}\frac{\partial}{\partial x}$ is the momentum (operator), is supposed to play a fundamental role in quantum mechanics, in particular as the origin of Heisenberg's Uncertainty Principle:
  • $\sigma_x\sigma_p\ge \frac{h}{2}$,
where $\sigma_x$ is the standard deviation in measurements of position $x$, and $\sigma_p$ that of momentum. 
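
For the record, the commutator relation can be checked by letting both sides act on a smooth function $f(x)$:
  • $(xp-px)f = x\frac{h}{i}f^\prime - \frac{h}{i}(xf)^\prime = -\frac{h}{i}f = ihf$,
using $(xf)^\prime = f + xf^\prime$ and $-\frac{1}{i}=i$.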

We see that both the commutator relation and Heisenberg's Uncertainty Principle concern the product of position and momentum. But such a product lacks physical meaning. Momentum $p$ has physical meaning and so does position $x$, but their product has no physical meaning.  

Momentum multiplied by velocity has a physical meaning as (twice the) kinetic energy, but momentum multiplied by position does not. Force multiplied by distance has a meaning as work.
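
Dimensional bookkeeping underlines the point: momentum times velocity has dimension $kg\frac{m}{s}\cdot\frac{m}{s}=J$ (energy), force times distance has dimension $N\cdot m=J$ (work), while momentum times position has dimension $kg\frac{m}{s}\cdot m=J\,s$, the dimension of action carried by Planck's constant $h$.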

Quantum mechanics is, however, obsessed with the product of momentum and position, with the message that because of the commutator relation they cannot both be determined at the same time and place. The message is that this makes quantum mechanics fundamentally different from classical mechanics, where supposedly momentum and position can both be determined.

There are two approaches to physics:
  1. Make it as simple and understandable as possible. 
  2. Make it as complicated and mysterious as possible.     
Quantum mechanics has developed according to 2 as evidenced by Richard Feynman:
  • I think I can safely say that nobody understands quantum mechanics.
One reason is that the product of momentum and position is given a fundamental role, in contradiction to the fact that it has no physical meaning. 

Sunday, March 23, 2014

The Torturer's Dilemma vs Uncertainty Principle vs Computational Simulation


Bohr expressed in Light and Life (1933) the Thanatological Principle, stating that to check out the nature of something one has to destroy that very nature, which we refer to as The Torturer's Dilemma:
  • We should doubtless kill an animal if we tried to carry the investigations of its organs so far that we could describe the role played by single atoms in vital functions. In every experiment on living organisms, there must remain an uncertainty as regards the physical conditions to which they are subjected…the existence of life must be considered as an elementary fact that cannot be explained, but must be taken as a starting point in biology, in a similar way as the quantum of action, which appears as an irrational element from the point of view of classical mechanics, taken together with the existence of the elementary particles, forms the foundation of atomic physics.
  • It has turned out, in fact, that all effects of light may be traced down to individual processes, in which a so-called light quantum is exchanged, the energy of which is equal to the product of the frequency of the electromagnetic oscillations and the universal quantum of action, or Planck's constant. The striking contrast between this atomicity of the light phenomenon and the continuity of the energy transfer according to the electromagnetic theory, places us before a dilemma of a character hitherto unknown in physics.
Bohr's starting point for his "Copenhagen" version of quantum mechanics still dominating text books, was:
  • Planck's discovery of the universal quantum of action which revealed a feature of wholeness in individual atomic processes defying causal description in space and time.
  • Planck's discovery of the universal quantum of action taught us that the wide applicability of the accustomed description of the behaviour of matter in bulk rests entirely on the circumstance that the action involved in phenomena on the ordinary scale is so large that the quantum can be completely neglected.  (The Connection Between the Sciences, 1960)
Bohr thus argued that the success of the notion of a universal quantum of action depends on the fact that it can be completely neglected. 

The explosion of digital computation since Bohr's time offers a new way of resolving the impossibility of detailed inspection of microscopics, by allowing detailed non-invasive inspection of computational simulations of microscopics. With this perspective, efforts should be directed toward the development of computable models of microscopics, rather than smashing high-speed protons or neutrons into innocent atoms in order to find out their inner secrets, without getting reliable answers. 




Monday, March 17, 2014

Unphysical Combination of Complementary Experiments


Let us take a look at how Bohr in his famous 1927 Como Lecture describes complementarity as a fundamental aspect of his Copenhagen Interpretation, still dominating textbook presentations of quantum mechanics:
  • The quantum theory is characterised by the acknowledgment of a fundamental limitation in the classical physical ideas when applied to atomic phenomena. The situation thus created is of a peculiar nature, since our interpretation of the experimental material rests essentially upon the classical concepts.
  • Notwithstanding the difficulties which hence are involved in the formulation of the quantum theory, it seems, as we shall see, that its essence may be expressed in the so-called quantum postulate, which attributes to any atomic process an essential discontinuity, or rather individuality, completely foreign to the classical theories and symbolised by Planck's quantum of action.
OK, we learn that quantum theory is based on a quantum postulate about an essential discontinuity symbolised by Planck's constant $h=6.626\times 10^{-34}\, Js$ as a quantum of action. Next we read about the necessary interaction between the phenomena under observation and the observer:   
  • Now the quantum postulate implies that any observation of atomic phenomena will involve an interaction with the agency of observation not to be neglected. 
  • Accordingly, an independent reality in the ordinary physical sense can neither be ascribed to the phenomena nor to the agencies of observation.
  • The circumstance, however, that in interpreting observations use has always to be made of theoretical notions, entails that for every particular case it is a question of convenience at what point the concept of observation involving the quantum postulate with its inherent 'irrationality' is brought in.
Next, Bohr emphasizes the contrast between the quantum of action and classical concepts:
  • The fundamental contrast between the quantum of action and the classical concepts is immediately apparent from the simple formulas which form the common foundation of the theory of light quanta and of the wave theory of material particles. If Planck's constant be denoted by $h$, as is well known: $E\tau = I\lambda = h$, where $E$ and $I$ are energy and momentum respectively, $\tau$ and $\lambda$ the corresponding period of vibration and wave-length. 
  • In these formulae the two notions of light and also of matter enter in sharp contrast. 
  • While energy and momentum are associated with the concept of particles, and hence may be characterised according to the classical point of view by definite space-time co-ordinates, the period of vibration and wave-length refer to a plane harmonic wave train of unlimited extent in space and time.
  • Just this situation brings out most strikingly the complementary character of the description of atomic phenomena which appears as an inevitable consequence of the contrast between the quantum postulate and the distinction between object and agency of measurement, inherent in our very idea of observation.
Bohr clearly brings out the unphysical aspects of the basic action formula
  • $E\tau = I \lambda = h$,  
where energy $E$ and momentum $I$ related to particle are combined with period $\tau$ and wave-length $\lambda$ related to wave.
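
Note that both products indeed carry the dimension of action: for a light quantum $E=h\nu =\frac{h}{\tau}$ gives $E\tau =h$, and correspondingly $I=\frac{h}{\lambda}$ gives $I\lambda =h$, dimensionally $[E\tau ]=J\,s$ and $[I\lambda ]=\frac{kg\,m}{s}\cdot m=J\,s$.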

Bohr then seeks to resolve the contradiction by naming it complementarity as an effect of interaction between instrument and object:
  • Consequently, evidence obtained under different experimental conditions cannot be comprehended within a single picture, but must be regarded as complementary in the sense that only the totality of the phenomena exhausts the possible information about the objects. 
  • In quantum mechanics, however, evidence about atomic objects obtained by different experimental arrangements exhibits a novel kind of complementary relationship. 
  • … the notion of complementarity simply characterizes the answers we can receive by such inquiry, whenever the interaction between the measuring instruments and the objects form an integral part of the phenomena. 
Bohr's complementarity principle has been questioned by many over the years:
  • Bohr’s interpretation of quantum mechanics has been criticized as incoherent and opportunistic, and based on doubtful philosophical premises. (Simon Saunders)
  • Despite the expenditure of much effort, I have been unable to obtain a clear understanding of Bohr’s principle of complementarity (Einstein).
Of course an object may have complementary qualities, such as color and weight, which can be measured in different experiments, but it is meaningless to form a new concept as color times weight, or colorweight, and then desperately seek to give it a meaning. 

In the New View presented on Computational Blackbody Radiation, the concept of action as e.g. position times velocity has a meaning in a threshold condition for dissipation, but is not a measure of a quantity which is carried by a physical object, such as mass or energy. 

The ruling Copenhagen Interpretation was developed by Bohr, contributing a complementarity principle, and Heisenberg, contributing a related uncertainty principle based on position times momentum (or velocity) as Bohr's unphysical complementary combination. The uncertainty principle is often expressed as a lower bound on the product of weighted norms of a function and its Fourier transform, and is then interpreted as a combat between localization in space and in frequency, or between particle and wave. In this form of the uncertainty principle, the unphysical aspect of a product of position and frequency is hidden by mathematics.

The Copenhagen Interpretation was completed by Born's suggestion to view (the square of the modulus of) Schrödinger's wave function as a probability distribution for particle configuration, which in the absence of something better became the accepted way to handle the apparent wave-particle contradiction, by viewing it as a combination of probability wave and particle distribution.     


New Uncertainty Principle as Wien's Displacement Law


The recent series of posts based on Computational Blackbody Radiation suggests that Heisenberg's Uncertainty Principle can be understood as a consequence of Wien's Displacement Law, expressing a high-frequency cut-off in blackbody radiation scaling with temperature according to Planck's radiation law:
  • $B_\nu (T)=\gamma\nu^2T\times \theta(\nu ,T)$,
where $B_\nu (T)$ is radiated energy per unit frequency, surface area, viewing angle and time, $\gamma =\frac{2k}{c^2}$ with $k = 1.3806488\times 10^{-23}\, J/K$ Boltzmann's constant and $c$ the speed of light in $m/s$, and $T$ is temperature in Kelvin $K$,
  • $\theta (\nu ,T)=\frac{\alpha}{e^\alpha -1}$, 
  • $\alpha=\frac{h\nu}{kT}$,
where $\theta (\nu ,T)\approx 1$ for $\alpha < 1$ and $\theta (\nu ,T)\approx 0$ for $\alpha > 10$ as high-frequency cut-off, with $h=6.626\times 10^{-34}\, Js$ Planck's constant. More precisely, maximal radiance for a given temperature $T$ occurs for $\alpha \approx 2.821$ with corresponding frequency
  • $\nu_{max} = 2.821\frac{T}{\hat h}$ where $\hat h=\frac{h}{k}=4.8\times 10^{-11}\, Ks$, 
with a rapid drop for $\nu >\nu_{max}$.
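
For readers who want to check the numbers, here is a minimal Python sketch (my own, not from the post; the constant values and grid choices are assumptions) verifying that the factored form above agrees with the standard Planck law and that the radiance peak falls at $\alpha\approx 2.821$:

```python
import numpy as np

# SI constants (values as quoted in the post)
h = 6.626e-34       # Planck's constant, J s
k = 1.3806488e-23   # Boltzmann's constant, J/K
c = 2.998e8         # speed of light, m/s

def B_standard(nu, T):
    # Standard Planck law: 2 h nu^3 / (c^2 (exp(h nu / k T) - 1))
    return 2 * h * nu**3 / (c**2 * np.expm1(h * nu / (k * T)))

def B_factored(nu, T):
    # The post's factored form: gamma nu^2 T theta(nu, T)
    gamma = 2 * k / c**2
    alpha = h * nu / (k * T)
    theta = alpha / np.expm1(alpha)   # ~1 for alpha < 1, ~0 for alpha > 10
    return gamma * nu**2 * T * theta

T = 300.0                                  # temperature in Kelvin
nu = np.linspace(1e10, 2e14, 1_000_000)    # frequency grid, Hz

# The two forms are algebraically identical
assert np.allclose(B_standard(nu, T), B_factored(nu, T), rtol=1e-9, atol=0.0)

nu_max = nu[np.argmax(B_factored(nu, T))]  # peak frequency on the grid
print(nu_max * h / (k * T))                # ~2.821, i.e. nu_max = 2.821 T / h_hat
```

Running it prints a value close to $2.821$, recovering $\nu_{max}=2.821\frac{T}{\hat h}$.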

The proof of Planck's Law in Computational Blackbody Radiation explains the high-frequency cut-off as a consequence of finite precision computation introducing a dissipative effect damping high frequencies.

A connection to Heisenberg's Uncertainty Principle can be made by noting that a high-frequency cut-off condition of the form
  • $\nu < \frac{T}{\hat h}$,
can be rephrased in the following form:
  • $u_\nu\dot u_\nu > \hat h$                  (New Uncertainty Principle)
where $u_\nu$ is position amplitude, $\dot u_\nu =\nu u_\nu$ is velocity amplitude of a wave of frequency $\nu$ with $\dot u_\nu^2 =T$.
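
Spelling out the algebra: since $\dot u_\nu =\nu u_\nu$ and $\dot u_\nu^2 =T$,
  • $u_\nu\dot u_\nu =\frac{\dot u_\nu}{\nu}\cdot\dot u_\nu =\frac{\dot u_\nu^2}{\nu}=\frac{T}{\nu}$,
so that $\nu <\frac{T}{\hat h}$ holds precisely when $u_\nu\dot u_\nu >\hat h$.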

The New Uncertainty Principle expresses that observation/detection of a wave, that is, observation/detection of the amplitude $u$ and frequency $\nu =\frac{\dot u}{u}$ of a wave, requires
  • $u\dot u>\hat h$.
The New Uncertainty Principle thus concerns observation/detection of amplitude and frequency as physical aspects of wave motion, and not, as in Heisenberg's Uncertainty Principle, particle position and wave frequency as unphysical complementary aspects.

Sunday, March 16, 2014

Uncertainty Principle, Whispering and Looking at a Faint Star


The recent series of posts on Heisenberg's Uncertainty Principle based on Computational Blackbody Radiation suggests the following alternative equivalent formulations of the principle:
  1. $\nu < \frac{T}{\hat h}$,
  2. $u_\nu\dot u_\nu > \hat h$,
where $u_\nu$ is position amplitude, $\dot u_\nu =\nu u_\nu$ is velocity amplitude of a wave of frequency $\nu$ with $\dot u_\nu^2 =T$, and $\hat h =4.8\times 10^{-11}Ks$ is Planck's constant scaled with Boltzmann's constant.

Here, 1 represents Wien's displacement law stating that the radiation from a body is subject to a frequency limit scaling with temperature $T$ with the factor $\frac{1}{\hat h}$.

2 is superficially similar to Heisenberg's Uncertainty Principle, as an expression of the following physics: in order to detect a wave of amplitude $u$, it is necessary that the frequency $\nu$ of the wave satisfies $\nu u^2>\hat h$. In particular, if the amplitude $u$ is small, then the frequency $\nu$ must be large.

This connects to (i) communication by whispering and (ii) viewing a distant star, both being based on the possibility of detecting small amplitude high-frequency waves.

The standard presentation of Heisenberg's Uncertainty Principle is loaded with contradictions:
  • The uncertainty principle is certainly one of the most famous and important aspects of quantum mechanics. 
  • But what is the exact meaning of this principle, and indeed, is it really a principle of quantum mechanics? And, in particular, what does it mean to say that a quantity is determined only up to some uncertainty? 
  • So the question may be asked what alternative views of the uncertainty relations are still viable.
  • Of course, this problem is intimately connected with that of the interpretation of the wave function, and hence of quantum mechanics as a whole. 
  • Since there is no consensus about the latter, one cannot expect consensus about the interpretation of the uncertainty relations either. 
In other words, there is today no consensus on the meaning of Heisenberg's Uncertainty Principle. The reason may be that it has no meaning, but that there is an alternative which is meaningful.

Notice in particular that the product of two complementary or conjugate variables, such as position and momentum, is questionable if viewed as representing a physical quantity, while as a threshold it can make sense.

Thursday, March 13, 2014

Against Measurement Against Copenhagen: For Rationality and Reality by Computation


John Bell's Against Measurement is a direct attack on the heart of quantum mechanics as expressed in the Copenhagen Interpretation according to Bohr:
  • It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature…
Bell poses the following questions:
  • What exactly qualifies some physical systems to play the role of "measurer"?
  • Was the wavefunction of the world waiting to jump for thousands of millions of years until a single-celled living creature appeared?
  • Or did it have to wait a little longer, for some better qualified system…with a Ph D? 
Physicists of today have no answers, with far-reaching consequences for all of science: If there is no rationality and reality in physics, as the most rational and real of all sciences, then there can be no rationality and reality anywhere…If real physics is not about what is, then real physics is irrational and unreal…and then…any bubble can inflate to any size...

The story is well described by 1969 Nobel Laureate Murray Gell-Mann:
  • Niels Bohr brainwashed a whole generation of theorists into thinking that the job of interpreting quantum theory was done 50 years ago.
But there is hope today, in digital simulation which offers observation without interference. Solving Schrödinger's equation by computation gives information about physical states without touching the physics. It opens a road to bring physics back to the rationality of 19th century physics in the quantum nano-world of today…without quantum computing...  

Increasing Uncertainty about Heisenberg's Uncertainty Principle + Resolution

                My mind was formed by studying philosophy, Plato and that sort of thing….The reality we can put into words is never reality itself…The atoms or elementary particles themselves are not real; they form a world of potentialities or possibilities rather than one of things or facts...If we omitted all that is unclear, we would probably be left with completely uninteresting and trivial tautologies...

The 2012 article Violation of Heisenberg's Measurement-Disturbance Relationship by Weak Measurements by Lee A. Rozema et al. informs us: 
  • The Heisenberg Uncertainty Principle is one of the cornerstones of quantum mechanics. 
  • In his original paper on the subject, Heisenberg wrote “At the instant of time when the position is determined, that is, at the instant when the photon is scattered by the electron, the electron undergoes a discontinuous change in momentum. This change is the greater the smaller the wavelength of the light employed, i.e., the more exact the determination of the position”. 
  • The modern version of the uncertainty principle proved in our textbooks today, however, deals not with the precision of a measurement and the disturbance it introduces, but with the intrinsic uncertainty any quantum state must possess, regardless of what measurement (if any) is performed. 
  • It has been shown that the original formulation is in fact mathematically incorrect.
OK, so we learn that Heisenberg's Uncertainty Principle (in its original formulation presumably) is a cornerstone of quantum physics, which however is mathematically incorrect, and that there is a modern version concerned not with measurement but with an intrinsic uncertainty of a quantum state regardless of measurement. In other words, a cornerstone of quantum mechanics has been moved. 

  • The uncertainty principle (UP) occupies a peculiar position in physics. On the one hand, it is often regarded as the hallmark of quantum mechanics. 
  • On the other hand, there is still a great deal of discussion about what it actually says. 
  • A physicist will have much more difficulty in giving a precise  formulation than in stating e.g. the principle of relativity (which is itself not easy). 
  • Moreover, the formulation given by various physicists will differ greatly not only in their wording but also in their meaning.  
We learn that the uncertainty of the uncertainty principle has been steadily increasing ever since it was formulated by Heisenberg in 1927. 

In a recent series of posts based on Computational Blackbody Radiation I have suggested a new approach to the uncertainty principle as a high-frequency cut-off condition of the form
  • $\nu < \frac{T}{\hat h}$,  
where $\nu$ is frequency, $T$ temperature in Kelvin $K$ and $\hat h=4.8\times 10^{-11}Ks$ is a scaled Planck's constant. The significance of the cut-off is that a body of temperature $T$ cannot emit frequencies larger than $\frac{T}{\hat h}$, because the wave synchronization required for emission is destroyed by internal friction damping these frequencies. The cut-off condition thus expresses Wien's displacement law. 

The cut-off condition can alternatively be expressed as 
  • $u_\nu\dot u_\nu > \hat h$ 
where $u_\nu$ is position amplitude and $\dot u_\nu =\frac{du_\nu}{dt}$ velocity amplitude of a wave of frequency $\nu$ with $\dot u_\nu^2 =T$ and $\dot u_\nu =\nu u_\nu$. We see that the cut-off condition superficially has a form similar to Heisenberg's uncertainty principle, but that the meaning is entirely different and in fact familiar as Wien's displacement law. 

We thus find that Heisenberg's uncertainty principle can be replaced by Wien's displacement law, which can be seen as an effect of internal friction preventing synchronization and thus emission of frequencies  $\nu > \frac{T}{\hat h}$.

The high-frequency cut-off condition, with its dependence on temperature, is similar to the high-frequency damping of a loudspeaker, which can depend on the level of the sound. 

Tuesday, March 11, 2014

Wikileaks: Heisenberg's Uncertainty Principle Fundamentally Misleading!


Wikipedia gives the following information about Heisenberg's uncertainty principle:
  • In quantum mechanics, the uncertainty principle is any of a variety of mathematical inequalities asserting a fundamental limit to the precision with which certain pairs of physical properties of a particle known as complementary variables, such as position x and momentum p, can be known simultaneously.
  • Though widely repeated in textbooks, this physical argument is now known to be fundamentally misleading. While the act of measurement does lead to uncertainty, the loss of precision is less than that predicted by Heisenberg's argument; the formal mathematical result remains valid, however.
  • Thus, the uncertainty principle actually states a fundamental property of quantum systems, and is not a statement about the observational success of current technology. 
  • It must be emphasized that measurement does not mean only a process in which a physicist-observer takes part, but rather any interaction between classical and quantum objects regardless of any observer.
The uncertainty principle, which in current textbooks on quantum mechanics serves a fundamental role, is in fact fundamentally misleading. Wow! This must be a leak from a true Wikileaks whistleblower. Imagine what will happen if this message is understood by the scientific community.

The leak opens the way to a fresh look at the uncertainty principle, as in Computational Blackbody Radiation, suggesting the following fundamental property of atomistic quantum systems: finite precision computation introduces a high-frequency cut-off as expressed in Planck's law:
  • $\nu < \frac{T}{\hat h}$ where $\hat h =\frac{h}{k}$, 
with $\nu$ frequency, $h$ Planck's constant, $k$ Boltzmann's constant and $\hat h =4.8\times 10^{-11}\, Ks$. The scaled Planck constant $\hat h$ is then determined by the reference blackbody as the blackbody with maximal cut-off frequency (smallest $\hat h$) = peep-hole of an empty box with graphite walls.

The high frequency cut-off can alternatively be expressed as a restriction on wave length $\lambda$ of the form
  • $\lambda > \frac{c}{T}\hat h$,
which can be seen as a smallest coordination length required for emission of radiation from atomistic oscillation subject to finite precision.
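
As a concrete illustration (numbers my own): for $T=300\, K$ the cut-off frequency is $\frac{T}{\hat h}=\frac{300}{4.8\times 10^{-11}}\approx 6.3\times 10^{12}\, Hz$, with corresponding smallest wave-length $\frac{c}{T}\hat h =\frac{3\times 10^8\times 4.8\times 10^{-11}}{300}\approx 4.8\times 10^{-5}\, m$, or about $50\,\mu m$ in the far infrared.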

To see the connection to Heisenberg's uncertainty principle, consider a wave of frequency $\nu$ and amplitude $u_\nu$ with $\dot u_\nu \equiv\frac{du_\nu}{dt}=\nu u_\nu$ and $T =\dot u_\nu^2$, for which the high-frequency cut-off condition $\nu < \frac{T}{\hat h}$ can be expressed as
  • $\dot u_\nu u_\nu > \hat h$.
We see that high-frequency cut-off from finite precision computation can be seen as a substitute for an uncertainty principle which today is viewed as fundamentally misleading. 

We note that the idea of viewing the uncertainty principle as a relation between a function and its Fourier transform also seems to be fundamentally misleading. 

Monday, March 10, 2014

New Objective View of Heisenberg's Uncertainty Principle

Heisenberg's Uncertainty Principle states a lower bound on the accuracy with which position $x$ and momentum (velocity) $p$ can be observed.

Computational Blackbody Radiation gives a new view on Planck's constant $h$ as effectively a high-frequency cut-off: Only frequencies $\nu$ such that
  • $\nu < \frac{T}{\hat h}$,
will be radiated, where $T$ is temperature in Kelvin $K$, and $\hat h =\frac{h}{k} \approx 4.8\times 10^{-11}\, Ks$ where $k$ is Boltzmann's constant. The cut-off condition can alternatively be expressed as 
  1. $u_\nu\dot u_\nu > \hat h$
where $u_\nu$ is the amplitude of a wave of frequency $\nu$ and $\dot u_\nu=\frac{du_\nu}{dt}=\nu u_\nu$ and $\dot u_\nu^2 =T$. 

This relation 1 is similar to Heisenberg's Uncertainty Principle as a lower bound on the product of position (amplitude) and velocity, but with a different physical meaning. Whereas Heisenberg's Uncertainty Principle concerns the product of errors in position $\Delta x$ and momentum/velocity $\Delta p$ vs Planck's constant $h$, relation 1 concerns the product of amplitude and velocity vs the scaled Planck constant $\hat h$.

The relation expresses that radiation of a certain frequency $\nu$ requires either a sufficiently large amplitude $u_\nu$ or velocity $\dot u_\nu$, as a requirement for coordinated oscillation under finite precision computation.

We thus replace uncertainty in observation by finite precision in actual physics, which reduces the subjective observer aspect of Heisenberg's Uncertainty Principle. There is a connection to observation in the high-frequency cut-off of finite precision computation, in the sense that only frequencies which effectively are emitted can be observed. It is here not the observer who sets limits of observational accuracy by interacting with the observed object, but rather the object itself.

We recall that the central idea is to view physics as analog finite precision computation, which can be simulated by digital computation, allowing observation without interference and thus eliminating a basic difficulty in quantum mechanics.      

The Real Physical Meaning of Planck's Constant

The mystery of discrete lumps or quanta, and the strange energy relation $E=h\nu$ with $h$ of strange dimension energy $\times$ time. 

Planck's constant $h = 6.626 \times 10^{-34} Js$ is supposed to represent a fundamental property of the Universe we happen to live in.

Planck introduced $h$ as a fictional mysterious quantity in his proof of Planck's radiation law in 1900 based on statistics, with $h$ representing a smallest "quantum of action". Today, 114 years later, the fiction and mystery remain, and it is time to pass on to reality. In the recent series of posts we have seen that $h$ enters into Planck's law in a high-frequency cut-off condition of the form
  • $\nu > \frac{kT}{h}=\frac{T}{\hat h}$ where $\hat h =\frac{h}{k}$,
where $\nu$ is frequency, $T$ is temperature in Kelvin $K$ and $k = 1.3806488\times 10^{-23}\, J/K$ is Boltzmann's constant, frequencies above the threshold being cut off. Here  
  • $\hat h =4.8\times 10^{-11}\, Ks$,
shows up as the real effective Planck constant in Planck's law. 

We see that $\frac{T}{\hat h}$ acts as a threshold value for frequency $\nu$, or equivalently $\hat h\frac{c}{T}$ acts as a threshold value for wave-length $\lambda =\frac{c}{\nu}$ in $m$, with $c$ the speed of light in $m/s$. The real effective Planck constant $\hat h$ thus has the form of a material parameter for a blackbody as a web of oscillators, with a characteristic high-frequency cut-off $\frac{T}{\hat h}$ or wave-length cut-off $\hat h\frac{c}{T}$, which expresses Wien's displacement law.

The mystery of $h$ as a "smallest quantum of action" can thus be deconstructed, and its real meaning as $\hat h$ readily understood, all following Einstein's dictum to make everything as simple as possible, but not simpler. 

For the full story, see Computational Blackbody Radiation.