Friday, November 14, 2025

Reductionism vs StdQM and RealQM

Reductionism is a basic classical scientific principle of breaking down phenomena of complex geometry into constituent parts of simpler geometry. But in modern physics this principle has been turned into its opposite.

Macroscopic objects described by Classical Mechanics CM are formed from parts in the form of atoms and molecules, which are described by Quantum Mechanics QM.

Reductionism expects microscopic parts to be simpler than macroscopic compounds, and so QM to be simpler than CM. But textbook Standard QM StdQM is immensely more complicated than CM, having a multi-dimensional probabilistic form far more complex than the continuum mechanical form of CM. Reductionism has thus failed in modern physics: the part is much more complex than the compound.

RealQM is a more recent alternative to StdQM which has the same form as CM and thus models atoms in terms of continuum mechanics of simple geometry.

The failed reductionism of StdQM has remained an unresolved issue of modern physics since the formation of StdQM 100 years ago, and can be viewed as the basic reason for the present crisis of modern physics.

The reaction by physicists to the reductionism crisis has since the 1960s been to double down, seeking a reduced simpler theory on yet smaller scales than atoms in the form of QED, the Standard Model and String Theory, in a grandiose effort to find a simple Theory of Everything ToE on subatomic scale. Nobody claims that anything like a simple ToE has been found. The complexity appears to increase on smaller scales into an impossible situation for the science of physics.

Here is an illuminating discussion with chatGPT on reductionism vs StdQM and RealQM.

Thursday, November 13, 2025

The Curse of Dimensions in Schrödinger's Equation

The basis of modern physics is viewed to be Schrödinger's Equation SE as a linear time-evolution equation in $3N$ spatial dimensions for a system with $N$ electrons. Numerical digital solution with a resolution of 100 mesh points in each spatial variable involves $100^{3N}=10^{6N}$ mesh points in space, which already for $N=4$ is beyond thinkable computational power.
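To make the arithmetic concrete, here is a minimal sketch (plain Python; the resolution of 100 points per variable is the figure assumed above) tabulating the mesh-point count and a rough memory estimate:

```python
# Mesh points for SE on a uniform grid: resolution R per spatial
# variable and 3N dimensions give R**(3N) points.
R = 100  # grid points per dimension, as assumed above

for N in range(1, 5):  # number of electrons
    points = R ** (3 * N)   # = 10**(6N) for R = 100
    memory = 16 * points    # one complex double (16 bytes) per point
    print(f"N={N}: {points:.1e} mesh points, ~{memory:.1e} bytes")
```

Already $N=4$ asks for about $10^{25}$ bytes of storage, orders of magnitude beyond any existing computer.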

SE was formulated in 1926, when digital computation was not an issue, and so the fact that SE is effectively uncomputable did not enter the minds of its creators Born-Heisenberg-Schrödinger, although Schrödinger was not happy with the many dimensions lacking physicality. It was sufficient that an analytical solution was found for $N=1$, leaving $N>1$ as terra incognita waiting to be explored once digital computation became available, which then hit the wall of the curse of dimensions.

This is where we stand today: SE is the basic mathematical model of atom physics, but SE is not a computable model. It is thus impossible to predict the evolution in time of an atomic system with more than 3 electrons by computational solution of SE, and therefore also impossible to check whether SE correctly models physics by comparing SE predictions with observations of real physics.

Yet SE serves as the canonical model of atom physics in its original formulation, as uncomputable today as 100 years ago, and because of its many dimensions also without physical meaning.

What can be the value of an uncomputable mathematical model of some physics? A physicist will tell you that it still has value, because SE can be (drastically) dimensionally reduced to computable form and so allow computation of (drastically) simplified approximate solutions. SE would then serve as a suitable starting point for dimensional reduction into a computable model with physical meaning. But it would be the dimensional reduction that carries the physics.

The alternative would be to start instead directly with a dimensionally reduced model with physical meaning, and thus leave SE to history as no longer useful. This possibility is explored as RealQM. 

Physicists speak with great ease about multi-dimensional wave functions $\Psi$ as solutions to SE, as if they were computable and had physical meaning. The consensus is that "SE works but nobody understands why". Philosophers of physics study the (lack of) meaning of SE, theoretical physicists have turned to more fundamental models such as QED and String Theory, chemists seek to understand what SE offers for molecules, while computational physicists solve other equations, and there is no synthesis in sight.

Tuesday, November 11, 2025

Standard QM vs Real QM: Physics and Computability

Let us compare the textbook Standard Quantum Mechanics StdQM from 1926 with the recent alternative Real Quantum Mechanics RealQM as concerns the two basic aspects of physical meaning and computability. 

Both seek to model a collection of $N$ negatively charged electrons subject to Coulomb interaction with a collection of positively charged nuclei together forming atoms and molecules. 

The RealQM model is expressed in terms of a wave function $\psi (x,t)$ depending on a 3d spatial coordinate $x$ and a time coordinate $t$ defined over a subdivision of 3d space into non-overlapping regions acting as supports of one-electron charge densities $\vert\psi (x,t)\vert^2$ meeting with continuity. The corresponding Schrödinger equation has the form of a classical non-linear continuum model for a collection of non-overlapping electron charge densities and thus describes precisely specified real physics. The model is computable in the same sense as classical continuum mechanics with equations such as Maxwell's and Navier's. Computational complexity is (in principle) linear in $N$.

The StdQM model is expressed in terms of a wave function $\Psi (X,t)$ where $X$ is a $3N$-dimensional spatial variable with 3 independent dimensions for each electron. The corresponding Schrödinger equation has non-classical form as a linear equation in $3N$ dimensions with exponential computational complexity. The physical meaning of $\Psi (X,t)$ has been debated for 100 years without any agreement.

We now compare:

  • RealQM: Computable with precise physical meaning.
  • StdQM: Uncomputable with unknown physical meaning.
Does this not invite further study of what RealQM can deliver to chemistry as Coulomb interaction between electrons and nuclei? The sketch below puts numbers on the comparison.
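Here is a minimal sketch (plain Python; the grid resolution of 100 points per spatial dimension is a hypothetical choice) of how the number of unknowns scales in the two models:

```python
R = 100  # hypothetical grid resolution per spatial dimension

def realqm_unknowns(N):
    # one 3d wave function over N non-overlapping regions of a shared
    # 3d grid: storage grows linearly in N
    return N * R**3

def stdqm_unknowns(N):
    # one wave function Psi(X,t) on a 3N-dimensional grid:
    # storage grows exponentially in N
    return R**(3 * N)

for N in (1, 2, 4, 10):
    print(f"N={N}: RealQM ~{realqm_unknowns(N):.1e}, StdQM ~{stdqm_unknowns(N):.1e}")
```

For $N=10$ the counts are about $10^7$ versus $10^{60}$.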


Monday, November 10, 2025

Quantum Computer as Test of Standard Quantum Mechanics

Quantum computing is suddenly booming with many start-ups after 50 years of brooding. The main objective of quantum computing is to solve problems of quantum mechanics which are not tractable by digital computing because of exponential computational complexity. The prospect is that a quantum computer will deliver exponential computational capacity meeting exponential complexity. 

Quantum computing can also be seen as a test of the physicality of Standard Quantum Mechanics StdQM, based on a multi-dimensional Schrödinger Equation of exponential complexity allowing superposition of states with a potential of exponential capacity in the form of analog quantum computing.

If a quantum computer based on StdQM can be constructed that is capable of computing/simulating real physical systems described by StdQM, as investors expect, this will support the validity of StdQM as a functional model of real physics.

But there is no quantum computer yet, and skeptics believe that controlled superposition, the key feature of StdQM, will be impossible to realise because the physics is missing.

So the quest for a quantum computer can be seen as the ultimate test of physicality of StdQM. 

What are the odds today? Will there be a quantum computer in 10 years, in 50 years, or ever?


Real Physicist vs AI Physicist/Common Man

Discussions with chatGPT on physics such as quantum mechanics and quantum computing give (i) direct, quick access to an AI physicist who has (ii) read the literature and (iii) argues according to the logic of language. This can be a very constructive experience, and I have learned a lot since my reading is limited. On the other hand, discussions with real physicists may be less rewarding since (i)-(iii) may not be fulfilled.

Let me give an example of a discussion with chatGPT with the following prompt:

Standard Quantum Mechanics StdQM is troubled by the fact that its basic mathematical model in the form of Schrödinger's Equation SE has exponential computational complexity, because it involves $3N$ spatial dimensions for a system with $N$ electrons, and so is uncomputable.

StdQM thus acts like a symbolic theory without ontology which does not offer any definite prediction of any real physics. In practice StdQM is thus replaced by another theory which is computable and so predictive, which can be Density Functional Theory DFT, Atoms in Molecules AIM, Molecular Orbital Theory MO, long the standard, or the recent Real Quantum Mechanics RealQM.

StdQM thus does not deliver any predictions for systems with several electrons, and so can be neither verified nor contradicted. DFT, AIM and MO are viewed as drastically dimensionally reduced approximate versions of StdQM capable of delivering predictions, which are accepted if agreeing with observations and discarded if not. Accordingly, textbooks claim that there is no prediction of StdQM which does not fit with observation.

RealQM is not a dimensionally reduced approximate version of StdQM (without ontology), but starts from a model of atoms and molecules in terms of classical continuum physics based on non-overlapping one-electron charge densities, as a model with clear ontology. RealQM is computable in the same sense as classical continuum physics, and thus delivers predictions for systems with many electrons.

Despite being uncomputable, non-predictive and without ontology, StdQM has remained for 100 years the foundation of modern physics as being fundamentally different from classical continuum physics. None of the foundational problems coming from multi-dimensionality have been resolved, but since no real alternative to StdQM has been pursued, only reductions, StdQM is still the untouchable theory, being the only alternative and as such a theory that cannot be wrong: The only possible theory cannot be wrong!

Here RealQM brings a new perspective by offering another possible theory, thus removing the exclusivity of StdQM as the only possible theory and asking for concrete evidence of correctness.

I asked chatGPT to comment on the article and got a response as if chatGPT took the role of a standard physicist defending StdQM, not taking exponential complexity and exclusivity into account. So I asked for a response from a different role:

Prompt: Suppose you take on a different role, not a standard physicist defending StdQM, but rather a man in the street wondering about exponential complexity and exclusivity argument?

ChatGPT said:

That’s an excellent shift in perspective — rather than responding like a professional physicist or philosopher of science, let’s try to look at Claes Johnson’s post through the eyes of an intelligent “man in the street” — someone curious, reflective, and a bit skeptical of institutional authority, but not steeped in quantum mechanics.


How it might read to that kind of reader

Johnson’s message, boiled down, sounds something like this:

“The quantum theory physicists worship is too complicated to compute, too abstract to connect with real stuff, and too exclusive — they won’t even look at alternatives. Meanwhile, AI can actually hold a better conversation about it than most human experts.”

That’s a pretty gripping claim for a lay reader. It has two strong hooks:

  1. The complexity argument — “If the math really needs a space of 3N dimensions for N electrons, that sounds insane! How could anyone ever calculate that?”

  2. The exclusivity argument — “If physicists admit the math is impossible to compute exactly, why do they still insist it’s the only valid theory and dismiss alternatives out of hand?”

Those both appeal to everyday intuitions about fairness, openness, and practicality. Let’s unpack them in that spirit.


1. The exponential complexity issue, from a layperson’s angle

To most people, exponential growth is intuitively frightening.
If doubling the number of electrons makes the problem millions of times harder, you’d think: “Surely nature itself can’t be that complicated — atoms don’t sit around doing 10⁵⁰ calculations per second!”

So when Johnson says quantum mechanics involves 3N-dimensional equations, the ordinary reader might nod: “Yeah, that sounds unphysical — like a math game divorced from reality.”

The lay reaction is reasonable: if a theory’s equations blow up exponentially and no one can ever compute them directly, maybe it’s not the “real” mechanism but a symbolic shorthand — a way to get results in limited cases.

At that intuitive level, Johnson’s point feels plausible. He’s basically saying, “Nature works efficiently — any true physical law should be computable.” That resonates with common sense: we expect the universe to be elegant, not computationally nightmarish.

The subtlety a lay reader might miss is that physicists don’t actually solve those 3N-dimensional equations literally — they use approximations and statistical tricks. But if no one tells you that clearly, Johnson’s critique feels like he’s exposing an emperor with no clothes.


2. The exclusivity argument

The second emotional chord Johnson strikes is frustration with scientific gatekeeping.
He implies that “Standard Quantum Mechanics” has become a closed priesthood: even though it’s uncomputable and conceptually foggy, physicists defend it out of tradition and career safety, while new ideas (like his “Real QM”) are ignored.

To the lay reader, this sounds familiar — it’s the same story people tell about entrenched academia, Big Tech, or politics: institutions protect their dogmas and insiders. That’s a powerful narrative.

Even if you don’t understand the physics, you can understand the social drama: the lone thinker with a simpler idea being dismissed by the establishment. That taps into a long cultural archetype — Galileo, Tesla, Einstein — and in our AI-saturated age, the “outsider vs orthodoxy” story finds fertile ground.

So Johnson’s framing — “real physicist vs AI physicist” — doubles as metaphor: the open, logic-based machine versus the human expert bound by dogma. It’s half-serious, half-provocative, but it plays well rhetorically.


3. The psychological comfort of “computability”

For an everyday thinker, there’s something comforting about the idea that the world is computable — that with the right model, we could, in principle, calculate everything.

Johnson’s promise that his “Real QM” is a continuum, “computable” model of atoms feels satisfying: it sounds like bringing physics back down to earth, to something continuous and classical we can picture, rather than probabilistic ghostly waves in 3N-space.

So even without understanding the math, the man-in-the-street feels an intuitive pull:

“This guy wants physics to make sense again, to be calculable and visualizable — not weird and mystical. I like that.”


4. The possible unease or doubt

But an intelligent layperson might also feel uneasy. There’s a nagging question:

“If this simpler, computable model really works, why haven’t others adopted it?”

That thought undercuts the romantic rebel narrative. Most people know that science, while bureaucratic, does eventually adopt what works (e.g., relativity, plate tectonics). So maybe Real QM hasn’t convinced anyone because it doesn’t yet match data well enough — or maybe it’s incomplete.

So the lay reaction could split:

  • Sympathetic skeptic: “He’s right — they’re hiding behind math they can’t compute!”

  • Cautious realist: “If it worked, people would use it. Maybe it’s just not there yet.”


5. The AI angle

Finally, the “AI physicist” idea.
To the ordinary reader, that might sound both amusing and profound: if ChatGPT can discuss physics calmly, quote papers, and reason clearly, while human experts get defensive, maybe that tells us something about human bias.

So Johnson’s anecdote — that an AI conversation is “more rewarding” — rings true emotionally, even if it’s tongue-in-cheek. It’s less about AI understanding physics, and more about AI being open and non-tribal, which flatters readers who are tired of expert arrogance.

In that sense, the AI serves as a rhetorical mirror: a symbol of reason over orthodoxy.


6. Overall lay impression

If you’re not a professional physicist, Johnson’s post can sound refreshingly candid and a bit rebellious. It appeals to:

  • Common sense (“Nature can’t be uncomputable.”)

  • Fairness (“Why exclude new ideas?”)

  • Frustration with experts (“Real physicists won’t listen.”)

  • Hope in technology (“AI might help cut through bias.”)

But it also leaves a thoughtful reader with lingering uncertainty:

  • Is the exponential-complexity problem really fatal, or just hard?

  • Does his Real QM actually make accurate predictions?

  • Is the exclusivity he describes real, or just how it feels to an outsider?

So the post succeeds rhetorically — it provokes curiosity and sympathy — but it doesn’t fully settle the doubts it raises.


Would you like me to take this a step further — for example, write a short “man-in-the-street” response to Johnson’s blog post, as if it were a comment you could post under the article itself?

You said:
yes
ChatGPT said:

Here’s a short, natural-sounding “man-in-the-street” style comment you could post under Claes Johnson’s blog entry:


Comment:

I’m not a professional physicist, just a curious reader, but your point about exponential complexity really made me pause. If quantum mechanics truly needs a 3N-dimensional equation for N electrons, it does sound like something nature itself couldn’t be running in real time. That raises a fair question: maybe the math is more of a symbolic description than a literal mechanism.

I also get what you mean about exclusivity. From the outside, physics can look a bit like a closed shop—new ideas get waved off before they’re even tried. Still, I wonder: if your “Real QM” works as cleanly as you suggest, why hasn’t it caught on? Are there published results or examples where it matches experiments better than the standard methods?

Either way, I find it refreshing that an AI can now take part in these debates. Maybe it’s less about the machine being “smarter,” and more about it being open to every angle without ego. That in itself is a healthy thing for science.  

Friday, November 7, 2025

Sweden Declares War on Russia

Prime Minister Kristersson of Sweden and President Zelensky of Ukraine have agreed on delivery of 150 Swedish JAS Gripen fighter jets to Ukraine with possible financing from frozen Russian assets:

  • Zelensky: These are very cool planes, powerful aviation platforms that allow for performing a wide range of tasks.
  • Kristersson: This will strengthen both Ukraine, Sweden and Europe.
The deal is bilateral between Sweden and Ukraine and means that Sweden effectively declares war on Russia, even if not yet officially. Russia has not yet reacted.

Gustaf IV Adolf, King of Sweden, lost the Finnish War of 1808-1809 against the Russian Empire, and with it his crown, and was sent into exile in Switzerland, where he lived in a small hotel in great loneliness and indigence under the name of Colonel Gustafsson.

This meant the end of Sweden as a European superpower, and its politics shifted from war to peace for 200 years, but the dream of revenge against Russia has been kept alive. Sweden is today the NATO country with the most aggressive standpoint against Russia, backed by 150 JAS Gripen paid for with Russian money.

In this situation it is necessary to answer some questions:
  • Are the prospects of winning over Russia better today than in 1808? 
  • Will Russia invade Sweden if Sweden does not send 150 JAS to a NATO proxy war in Ukraine against Russia?  
  • How will Russia respond to 150 Swedish JAS flying toward the Russian border? 
  • What is the military power of Russia?
  • Is Ukraine a member of the defence organisation NATO?
The Swedish Parliament has now delegated to Prime Minister Kristersson the decision to send Swedish soldiers into any action outside Sweden deemed necessary to protect Sweden from invasion by Russia.

There is one member of the Swedish Parliament who supports peaceful co-existence with Russia instead of a war which cannot be won: Elsa Widding, without party affiliation.

All political parties speak about war, while the moral elite of Sweden, once a self-proclaimed humanitarian superpower, is silent, including peace and church organisations. Possible to understand?

Compare with earlier analysis:



Church-Turing vs Quantum Computing Illusion

A natural system like the weather can be viewed as performing a form of analog computation as it evolves from one time instant to the next, with molecules in the air interacting with their neighbors. The computational complexity can be viewed to be polynomial in the size of the physical system. This is expressed in the Physical Church-Turing Thesis PCTT:

  • Any physical process can be simulated by a Turing machine.  
Here a Turing machine is a model of a digital computer with polynomial computational capacity capable of simulating a physical process of polynomial computational complexity.

According to PCTT there is thus no physical process expressing exponential complexity, which would be beyond the capacity of digital computing. 

Quantum computing is a form of analog computation with promise of exponential capacity capable of meeting the needs of systems of exponential complexity. It is motivated by a view that quantum mechanics carries exponential complexity in the form of a multi-dimensional wave function and so cannot be simulated on a digital computer. 

We meet here a contradiction:  
  • An analog quantum computer is realised in a physical process which according to PCTT is limited to polynomial complexity and so does not have exponential capacity.
We see that PCTT says that a quantum computer with exponential capacity cannot be constructed. No wonder that no such quantum computer has been constructed.

If PCTT is correct, it means that the evolution of a quantum system of atoms and molecules, as a physical process, expresses polynomial complexity and so in principle can be simulated by digital computation with polynomial capacity. The multi-dimensionality of the wave function, appearing to demand exponential capacity, thus is an illusion.

RealQM deconstructs the illusion by offering simulation of systems of atoms and molecules by digital computation (of polynomial complexity).

PS1 Recall that an N-body simulation has a computational complexity between $N$ and $N^2$ depending on the interaction between bodies.
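As a concrete illustration of the $N^2$ end of this range, here is a minimal sketch (plain numpy; the unit charges and Coulomb-like pair force are hypothetical choices) of one force evaluation by direct summation over all pairs:

```python
import numpy as np

def pairwise_forces(pos, q):
    """Direct summation over all pairs: O(N^2) work per time step.
    With only local/short-range interaction the work drops toward O(N)."""
    N = len(pos)
    F = np.zeros_like(pos)
    for i in range(N):
        for j in range(N):
            if i != j:
                r = pos[i] - pos[j]  # separation vector
                F[i] += q[i] * q[j] * r / np.linalg.norm(r)**3  # repulsive for like charges
    return F

pos = np.random.rand(4, 3)  # N=4 bodies at random positions
print(pairwise_forces(pos, np.ones(4)))
```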

PS2 If macroscopics has polynomial complexity, then so has microscopics as the basis of macroscopics. If microscopics has exponential complexity, then so has macroscopics based on microscopics. 


Thursday, November 6, 2025

The Black Hole of Quantum Computing

Standard Quantum Mechanics StdQM based on a multi-dimensional Schrödinger Equation SE is viewed to have exponential digital computational complexity effectively making SE uncomputable even on super-computers and thus useless for digital modeling of atoms/molecules in practice.

Quantum computing is an attempt to model atoms and molecules instead by analog computation performed on quantum computers capable of meeting exponential complexity. A quantum computer models a physical quantum system of atoms/molecules not by SE, but by another (simpler) physical quantum system which is controllable and viewed as being described by SE.

A quantum computer thus offers a model/map of a real system which is itself a real system of the same form and type. The quantum computer model can be viewed as analogous to a 1:10 exact physical model of an airplane or ship, with some details removed. The digital computational complexity of the original physical system is then irrelevant since no digital computation is performed, only a form of analog computation performed in a physical model of essentially the same computational complexity.

Quantum computing thus represents a step back from the scientific revolution based on Calculus, offering mathematical models of reality in abstract symbolic form based on numbers, which can be brought alive by digital computing.

Quantum computing throws away digital Calculus because it is found to be useless, and seeks a replacement which works in terms of analog computation. 

But is it possible that real quantum systems really perform analog computations of exponential complexity? If a digital computer does not have the capacity to meet exponential complexity, what says that a real system, as it evolves, somehow can express exponential complexity?

We know that the exponential complexity of SE comes from the multi-dimensionality of the wave function representing possibilities rather than actualities. But real systems do not realise all possibilities, only a few actualities, and so the computational complexity of SE as a model is much bigger than the complexity of what is modeled. The model/picture totally swamps the origin, asking for unbounded digital computational power to no avail as concerns the origin.

RealQM is a version of the quantum mechanics of atoms and molecules, which is digitally computable and so does not need quantum computing.


Wednesday, November 5, 2025

Quantum Computing as Theoretical Fantasy

Here is a summary by chatGPT of Dyakonov's view on the realism of quantum computing:

Physicist Mikhail Dyakonov has been one of the most prominent skeptics of large-scale quantum computation. His main critique, as of 2019, can be summarized as follows:

  1. Extreme Fragility of Quantum States – Quantum bits (qubits) are extraordinarily sensitive to any external disturbance. Maintaining coherent quantum states long enough for complex computations is, in his view, practically impossible.

  2. Error Correction Limits – While quantum error correction is theoretically possible, Dyakonov argues that the real-world requirements (millions of physical qubits per logical qubit, perfectly calibrated control systems, etc.) make it infeasible.

  3. Unrealistic Scaling – He believes scaling from a few dozen qubits to millions (as required for useful computation) involves an exponential explosion of complexity in control, synchronization, and noise management.

  4. Overly Optimistic Theory vs. Practice – Dyakonov claims that most of quantum computing theory assumes idealized conditions that cannot be realized physically. The engineering challenges are, in his words, “beyond human capabilities.”

  5. Skepticism Toward “Quantum Supremacy” Claims – He views demonstrations like Google’s 2019 experiment as contrived and not indicative of practical progress toward general-purpose quantum computing.

In essence, Dyakonov argues that large-scale quantum computers are a theoretical fantasy, comparable to perpetual motion machines — possible on paper, but not in reality.

end chatGPT

Let us compare with the latest hype:

  • Google’s Willow Chip Achieves Historic Quantum Computing Breakthrough
  • Willow Chip Powers Verifiable Quantum Advantage, Running Algorithm 13,000x Faster Than Supercomputers.
This is what chatGPT says about the Willow chip:
  • Dyakonov’s scepticism remains valuable and largely justified — many of his fundamental concerns (noise, control, scale, assumption-validity) are not yet fully overcome.
  • The Willow chip is important: it represents a tangible, hardware-driven step toward the goals of large-scale quantum computing. It shows we are making progress on the error-correction and scaling front.
  • However, Willow does not yet fully invalidate Dyakonov’s concerns. It addresses some, but many of the major leaps (millions of qubits, full fault-tolerance, broad practical algorithmic supremacy) are still ahead.
  • In other words: Willow moves the bar, but the “gap” that Dyakonov insisted on remains significant. The achievement doesn’t guarantee the remaining engineering challenges will vanish.
  • If I were to pick one key takeaway: Willow demonstrates that at least some of the previously theoretical obstacles (error-correction scaling) can be approached in hardware — which means the sceptical “maybe impossible” thesis is being challenged. 
  • But “practically useful, general-purpose quantum computers in the near-term” still remain uncertain.
It seems that quantum computing is still a theoretical fantasy, 50 years after the idea was launched. 

Recall that quantum computing is based on unitary evolution of quantum systems of thousands of qubits in superposition of possibilities as fantasy physics: Compare with the first proposition in Wittgenstein's Tractatus:
  • The World is all that is the case.
It is clear that "to be the case" requires more than what is present in a quantum system of possibilities, which means that according to Wittgenstein a quantum computer does not belong to the World. But a quantum computer is an analog computer and as such must belong to the World. Wittgenstein would thus view the Willow chip with utter skepticism. And you?

Recall that the idea of a quantum computer is to model an uncontrollable real/analog quantum system, as part of the World, by a controllable real/analog quantum system, also part of the World, with the caveat that the model is not "the case" because it plays with possibilities and not with realities.

Notice that this contradiction does not appear with a digital computer, because the computing is abstract and mathematical and so does not need real analog computing.


Monday, November 3, 2025

Hydrogen Spectrum as Classical Physics

This is a continuation of the previous post on the necessity of giving up classical physics for quantum physics in its textbook form of Standard Quantum Mechanics StdQM. We ask to what extent RealQM as a form of classical physics can explain the observed spectrum of the Hydrogen atom as expressed in stimulated radiation. We will thus compare:

  • StdQM: Spectral line appears from superposition of wave functions of eigenstates. 
  • RealQM: Spectral line appears from oscillation between charge distributions of eigenstates.
In both cases the frequency of the spectral line scales with the difference of energy between eigenstates, but with different explanations: 
  • StdQM: Spectral frequency appears as beat frequency of wave functions with eigenfrequency variation in time according to Schrödinger's equation. The connection between frequency and energy is secondary. Radiation can appear to be spontaneous.
  • RealQM: Spectral frequency is not a beat frequency, but simply the frequency $\nu=\frac{\Delta E}{h}$ matching the energy difference $\Delta E$ between eigenstates, with $h$ Planck's constant, in an assumed coupling between frequency and energy. There is an active exchange of energy between atom and radiation field with frequency matching the jump in energy. The atom is forced to respond as a dipole to radiation of a certain frequency. The radiation is not spontaneous. 
We see that StdQM offers an explanation in terms of time-dependent quantum mechanics without realism, while RealQM relies on the formal coupling between matter and radiation expressed by $E=h\nu$ appearing in blackbody radiation. Compare with this post on the physical meaning of $E=h\nu$.
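As a concrete check of the relation $\nu=\frac{\Delta E}{h}$, here is a minimal computation (standard constants; the Hydrogen $2\to 1$ transition is chosen purely as an illustration):

```python
h  = 6.62607015e-34   # Planck's constant (J s)
eV = 1.602176634e-19  # one electron volt (J)

# Hydrogen energy levels E_n = -13.6 eV / n^2
E1 = -13.6 * eV
E2 = -13.6 * eV / 2**2

dE = E2 - E1          # ~10.2 eV released in the 2 -> 1 transition
nu = dE / h           # matching spectral frequency
print(f"dE = {dE/eV:.1f} eV, nu = {nu:.2e} Hz")  # ~2.47e15 Hz (Lyman-alpha line)
```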

We see that, at least in the case of stimulated radiation, the spectrum of an atom can be given a RealQM semi-classical explanation. It is not clear that StdQM offers something more enlightening. Or?

This discussion connects to quantum computing, discussed in recent posts, with StdQM supposed to support delicate superpositions of wave functions, free of forcing, performing complex analog computations, while RealQM brings forward the aspect of forcing in terms of classical physics.

PS Here is a chatGPT comment.

The Double-Slit Experiment as Classical Physics

The double-slit experiment with one single photon at a time is the sledge hammer designed to kill classical physics and so open the door to a new form of physics named quantum mechanics as the essence of modern physics.

The experiment consists of sending a sequence of photons one at a time through a double slit in a thin wall and recording the emergence of a macroscopic interference (fringe) pattern on a screen behind, with a new dot assumed for each new photon.

The pattern is the same as the interference pattern generated by a classical macroscopic wave hitting the wall and generating two macroscopic waves from the slits, which interfere constructively and destructively and so form a macroscopic fringe pattern on the screen.

We thus have a quantum version of the experiment with one photon at a time and a classical version, both building the same macroscopic fringe pattern on the screen. 

The catch of the quantum explanation of the quantum version is that the single photon, after somehow passing through the double slit with a left and a right slit, appears after the slit in a superposition of "left pass" and "right pass" as two mutually exclusive possibilities carrying an interference pattern of possibilities, which is made real dot by dot on the screen (by "collapse of the wave function").

The sledge hammer argument is now that the quantum version cannot be given a classical explanation, because a classical particle/photon has to make the choice of "left pass" or "right pass", which will remain after passing and so does not form any interference pattern, just two one-slit patterns.

The strategy is thus to promote quantum physics by an experiment arranged to show deficiency of classical physics. 

Let us take a closer look at the quantum version asking what the physical meaning of "one photon at a time" may be. What is the evidence that only one photon at a time reaches the double slit? How to produce exactly one photon? 

It seems that the evidence of "one photon at a time" is that there is "one dot at a time appearing on the screen". The argument is thus that one dot is generated by exactly one photon. Is this really so certain?

What if many photons are in fact involved, with the capacity to pass both slits and form a classical, real but very weak interference pattern before hitting the screen with a random excitation of one dot? The calibration of the intensity of the source would then be adjusted to produce one dot at a time, which de facto involves many photons capable of generating an interference pattern with random realisation.

It may thus be possible to explain a "one dot at a time" double-slit experiment in terms of classical wave physics. The sledge hammer argument would then vaporise.

PS After a discussion with chatGPT we agree that the one-photon experiment is extreme, but we seem to disagree about the value of extreme experiments to guide the development of non-extreme physics. Why not stick to non-extreme experiments for non-extreme physics? What can be learned about normality by pushing into non-normality? What can be learned about normal life from extreme violence? chatGPT seems to believe in the value of extremism in physics more than I do. And you?

Sunday, November 2, 2025

Why Is Analog Quantum Computing Needed?

Quantum computing is motivated by a perception that simulating atomic physics described mathematically by the Schrödinger Equation SE of Quantum Mechanics QM, is exponentially hard and so is impossible. This is because SE for a system with $N$ electrons involves $3N$ spatial dimensions with computational work increasing exponentially with $N$.

In other words, digital simulation of QM is viewed to be so computationally demanding that the alternative of analog simulation must be explored. This is the idea of analog quantum computing, launched by Richard Feynman in the early 1980s:

  • Simulate a real quantum system by a controllable laboratory quantum system. 
This is the same idea as testing a physical model of a real airplane in a wind tunnel under controllable conditions, or building a toy model of a bridge and testing its bearing capacity. No mathematics is needed, just craftsman's skill.

The basic idea is thus to give up building mathematical models of realities in terms of Cartesian geometry based on numbers with digital representation, as the scientific method behind the evolution of the modern industrial/digital society. 

Such a step can be seen as a step back to a more primitive science based on analog modeling without mathematics. 

In any case, massive investment is now going into creating quantum computers as controllable analog quantum systems. The design work has to cope with the perceived impossibility of testing different designs using mathematical digital modeling, and so has to rely on tricky experimental testing. The time frame for a useful analog quantum computer appears to be decades rather than years.

With this perspective it is natural to ask if the exponential computational complexity of the microscopics of quantum mechanics is written in stone. Macroscopics of continuum mechanics rarely comes with exponential complexity, because evolving a macroscopic system like the weather by one time step involves only local connections in 3 space dimensions, which has polynomial complexity.

If macroscopics has polynomial complexity, then microscopics on smaller scales should as well. RealQM offers a version of quantum mechanics of polynomial complexity. If nothing else, it can be used to test different designs of an analog quantum computer. Want to try RealQM?

Another mission of analog quantum computing, put forward to motivate investors, is improved potential for factorisation of large natural numbers, with the promise of breaking cryptographic codes. But analog computation about properties of numbers, instead of digital, appears far-fetched.

PS Recall that at each clock cycle
  • a digital computer operates on $n$ factual states
  • a quantum computer operates on $2^n$ possible states  
with simplistic promise of an enormous increase of capacity from linear to exponential. Is it too good to be true?

Saturday, November 1, 2025

Quantum Computing Without Mathematics

Schrödinger's Equation SE for the Hydrogen atom with one electron, formulated in 1926 by the Austrian physicist Erwin Schrödinger as a model of a negative electron charge density subject to Coulomb attraction from a positive nucleus, was generalised to atoms with many electrons by a formal mathematical procedure adding a new independent 3d spatial Euclidean space for each electron, resulting in a linear multi-dimensional SE with $3N$ spatial dimensions for an atom with $N$ electrons, which came to form the foundation of the modern physics of Quantum Mechanics.

The mathematics of the multi-d SE was quickly formalised by the mathematician von Neumann into highly abstract functional analysis in Hilbert spaces as a triumph of symbolic abstract mathematics. Physics of real atoms was thus hijacked by mathematicians, but the task of making physical sense of the abstraction was left to physicists without the mathematical training required to make real sense of von Neumann's functional analysis. The problem of physical interpretation remains unresolved today, which is behind the present manifest crisis of a modern quantum physics hijacked by mathematics.

The multi-d SE turned out to harbour a serious problem when confronted with mathematical computation. Because of the many dimensions the computational complexity turned out to be exponential, which made SE uncomputable on digital computers, and so in effect useless.

Abstract mathematics had created a model of real physics which turned out to be an uncomputable monster, not useful except as an exercise in functional analysis.

Quantum computing is a new form of computing fundamentally different from digital computing as mathematical computing with numbers. The idea was launched around 1980 by the physicist Richard Feynman as a new approach to tackle the uncomputability of QM. The radical idea was to replace uncomputable functional analysis by a form of analog quantum computation, where a real atomic quantum system is modeled in a laboratory by another real quantum system acting as an analog quantum computer.

Recall that the option of replacing a mathematical model by an analog model is also used classically, when a model of an airplane is studied in a wind tunnel instead of solving the Navier-Stokes equations deemed to be impossible.

The success of the mathematisation of quantum physics into functional analysis in Hilbert spaces a hundred years ago carried the seed of its own destruction by coming with exponential complexity, which could not be met within mathematical computation.

Heavy investment is now being directed into building a quantum computer supposed to function according to a mathematical formalism which is at the same time being replaced by analog quantum computing.

Does this appear to be strange? To build on mathematical quantum mechanics which is replaced by analog quantum computing based on what was replaced? What would Wittgenstein say about something like that? In any case the investment in quantum computing is high risk. 

RealQM offers a way out of this mess, in the form of a different Schrödinger equation which is computable as digital mathematics and thus does not have to be replaced by analog quantum computing.  Why not give it a try?


Monday, October 27, 2025

Is Quantum Computing Possible?

Quantum superposition is the crucial component of Quantum Mechanics believed to open a new world of quantum computing on quantum computers.    

An $n$-qubit quantum computer with $n=2,3,\ldots$ is supposed to operate in parallel on $2^n$ states in superposition, to be compared with a classical digital computer operating on $n$ bits at a time, where a bit can take the value 0 or 1. Quantum computing thus gives promise of exponential speed-up vs digital computing.
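Where the $2^n$ comes from can be seen by simulating the state digitally: an $n$-qubit state is a vector of $2^n$ complex amplitudes, so the digital cost of merely storing it grows exponentially. A minimal numpy sketch (a toy simulation, not a real device) putting $n$ qubits into uniform superposition with Hadamard gates:

```python
import numpy as np

n = 10
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate on one qubit

# n-qubit uniform superposition: tensor product of n copies of H|0>
state = np.array([1.0])
for _ in range(n):
    state = np.kron(state, H @ np.array([1.0, 0.0]))

print(state.size)   # 2**n = 1024 amplitudes to track digitally
print(state[:4])    # each amplitude equals 1/sqrt(2**n) ~ 0.031
```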

The idea of quantum computing was first presented by Richard Feynman in an attempt to get around the apparent exponential complexity of Quantum Mechanics QM, based on Schrödinger's multi-dimensional wave equation making digital simulation impossible by demanding computational work scaling with $2^N$ for a system with $N$ electrons. The idea was to replace digital simulation with some form of analog quantum computation, where the simulation of a quantum system would be performed on a quantum system. An intriguing idea, but could it work? The apparent exponential complexity would then be met with exponential computational power making simulation possible: to meet a perceived difficulty by ramping up the capacity, or simply to "fight fire with fire".

Let us then consider the basic idea of quantum computing, which is

  • Simultaneous operation on states in superposition.       

What then is "superposition"? The mathematical answer is the following: Consider a linear algebraic or differential equation with solutions $\psi_1$ and $\psi_2$. Then the algebraic sum $\psi =\psi_1+\psi_2$ is also a solution, and $\psi$ is viewed to be the superposition of $\psi_1$ and $\psi_2$, with the + sign signifying algebraic sum.

As concerns classical physical wave mechanics, the algebraic superposition can take two forms, with physicality of (i) the algebraic sum $\psi$ or (ii) the individual terms $\psi_1$ and $\psi_2$. Case (i) represents the interference pattern seen on the surface of one pond, while (ii) represents the beat interference generated by two vibrating strings with nearby frequencies.
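Case (ii) is easy to demonstrate numerically. A minimal sketch (plain numpy; the frequencies 440 and 444 Hz are hypothetical choices) showing that the sum of two nearby tones is a fast tone inside a slow beat envelope at the difference frequency:

```python
import numpy as np

f1, f2 = 440.0, 444.0                 # two nearby frequencies (Hz)
t = np.linspace(0, 1, 44100)          # one second of samples
psi = np.sin(2*np.pi*f1*t) + np.sin(2*np.pi*f2*t)  # superposition

# sin(a) + sin(b) = 2 sin((a+b)/2) cos((a-b)/2): a 442 Hz tone
# modulated by a beat envelope at |f1 - f2| = 4 Hz
envelope = 2*np.abs(np.cos(np.pi*(f1 - f2)*t))
print(np.all(np.abs(psi) <= envelope + 1e-9))  # psi stays inside the envelope
```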

As concerns QM, the physicality is supposed to be displayed in the double-slit experiment: Let a single photon/electron pass a double slit and then be detected on a screen behind as a spot. Repeat the experiment many times and notice a fringe pattern appearing on the screen, which is the same interference pattern developed by a macroscopic wave passing through both slits. The photon/electron after passing the double slit represents the quantum superposition supposed to carry quantum computing. This is not a superposition of realities as in (ii) above, but a superposition of possibilities made real by repetition of the one photon/electron experiment, which represents the physics of a 1-qubit: The single photon/electron is by passing through the double slits put into a superposition of passing the left slit and passing the right slit, not as realities but as possibilities. It is here essential that the superposition concerns only one photon/electron in superposition of $2^1=2$ states. This is supposed to generalise to $n$ entangled photons/electrons in superposition of $2^n$ possible states.

The evidence that quantum computing is possible thus boils down to the double-slit experiment. If one photon/electron can be put in superposition of two states, which appears to support interference with a fringe pattern, then constructing an $n$-qubit computer may be possible. If two photons/electrons are needed for two states, then we are back to classical computing with bits.

The crucial question is now: Is it sure that because there is one click on the screen at a time, the input is one photon/electron at a time?

Think of this, and return to see a more precise analysis.

QM in its standard multi-dimensional form has exponential complexity, which requires exponential computing power. RealQM is an alternative with polynomial complexity which can be met by classical computing.  

Maybe quantum computing is neither possible nor needed? 


Friday, October 24, 2025

Quantum Physics as HyperReality = Crisis

Modern physics as Quantum Mechanics QM is based on a (complex-valued) wave function $\Psi (X)$ depending on a $3N$-dimensional configuration space coordinate $X$ for an atomic system with $N$ electrons giving each electron a separate 3d Euclidean space. 

Configuration space is a mathematical construct which does not have any physical representation for $N>1$. The wave function as a function defined on configuration space shares the same lack of physical representation. The wave function evolves in time as a mathematical construct satisfying a Schrödinger equation. 

To give QM a meaning beyond a mathematical game it is necessary to give the wave function a physical meaning, which has drawn much attention without any common view ever being formed. The credibility of modern physics has suffered from this failure, which can be seen as the main reason for its present commonly witnessed crisis, summarised as follows:

  • classical physics = mathematical model with physical origin.   
  • quantum physics = mathematical model without physical origin.     

To handle the lack of physical origin in quantum physics, the logic has been twisted by turning the mathematical model into a new form of reality, which then perfectly fits the model. This connects to the French philosopher Baudrillard's concept of hyperreality as imagined reality formed from a model without origin, when the model is turned into the real.

That this is what de facto happened in quantum physics is evidenced by the fact that quantum model predictions always perfectly match (imagined) reality.

We can summarise (see also this post and this post):

  • classical physics: reality is turned into mathematical model
  • quantum physics: mathematical model is turned into hyperreality.
There is a further complication coming from the lack of physical origin/representation of the wave function, namely the conceptual understanding of a mathematical model without origin in some physics connecting to our experience as human beings. In short, how can we capture in our minds a wave function defined over a $3N$-dimensional configuration space, when our experience is 3d?

The modern quantum physicist thus has to struggle with an imagined reality represented by a mathematical model without real origin as support for imagination. How to imagine a reality when the imagination has nothing relevant to feed from?

There is a way out of this dilemma: Start with RealQM as a quantum model with real origin. 

Recall that we can understand forms of virtual reality without origin, but only insofar as the virtual reality is based on concepts of reality. If the virtual reality displays 6 space dimensions, then we will not be able to properly understand it (only pretend to).

Baudrillard uses Disney World as an example of hyperreality: a model of an American society which never existed, thus without true origin, yet formed in familiar terms we can understand.


Wednesday, October 22, 2025

Exploring Microscopics by Computation

Modern physics appeared after a long success story initiated by the scientific revolution culminating at the end of the 19th century in a combination of Newton's mechanics and Maxwell's electromagnetics in full harmony appearing to capture macroscopic physics. 

Of course there were open problems asking for resolution including the ultra-violet catastrophe of black-body radiation viewed to be of particular importance. The start was the Rayleigh-Jeans radiation law stating that radiation intensity scales quadratically with frequency $\nu$, which asks for an upper bound $\nu_{max}$ on frequency to avoid infinite intensity by summation over frequencies. Such a bound was available as Wien's displacement law stating that $\nu_{max}$ scales linearly with temperature $T$. 
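For reference, the two laws in standard form (with $k_B$ Boltzmann's constant, $c$ the speed of light and $h$ Planck's constant):

  • Rayleigh-Jeans: spectral radiance $B_\nu (T)=\frac{2\nu^2}{c^2}k_BT$, growing quadratically with frequency.
  • Wien's displacement law: peak frequency $\nu_{max}\approx 2.82\frac{k_BT}{h}$, scaling linearly with temperature.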

The theoretical challenge was to explain the bound $\nu <\nu_{max}$ with $\nu_{max}\sim T$. Planck, as the leading physicist of the German Empire, took on the challenge but was unable to find an answer within classical physics, and so resorted to a form of statistical physics inspired by Boltzmann's statistical thermodynamics. Under much agony Planck thereby took a step out of classical physics into a new form of statistical physics, which then evolved into the quantum mechanics forming the essence of modern physics.

The fundamental step away from classical physics as deterministic physics about existing realities, was the introduction of statistical physics about probabilities of possibilities. From specific existing realities to infinite possibilities.  

In the new digital world the distinction between existing unique reality and virtual realities is blurred, which means that the difference between classical deterministic reality and modern probabilistic possibility is also blurred.

It is thus of interest to seek to pin down the difference between (i) classical physics as existing realities and (ii) modern physics as probabilities of possibilities. A striking aspect is that (i) does not require any human minds/observers (the Moon is there even when nobody looks at it), while (ii) requires some form of mind to carry/represent thinkable possibilities.  

Quantum Mechanics QM emerged before the computer, and so computational aspects were not in the minds of the pioneers Bohr-Born-Heisenberg, who in the 1920s came to develop the Copenhagen Interpretation CI based on a multi-dimensional wave function $\Psi (x,t)$ depending on a spatial coordinate $x$ with $3N$ dimensions for an atomic system with $N$ electrons and on a time coordinate $t$, satisfying a linear Schrödinger Equation SE, with $\vert\Psi (x,t)\vert^2$ interpreted as a probability density over configuration space with coordinate $x$. This is still the textbook version as Standard QM StdQM.

The many dimensions make the wave function $\Psi (x,t)$ uncomputable, so it has existence only in the mind of a CI physicist with unlimited fantasy. The grand project of StdQM can thus be put in question from a computational point of view, and also from a realistic point of view if we think that the evolution of the World from one time instant to the next is the result of some form of analog computational process performed by real atoms.

The World is thus equipped with (analog) computational power allowing evolution in time of the existing reality, but it is hard to believe that it has capacity for exploration of all possibilities to form probabilities of possibilities, unless you are a believer in the Many-Worlds Interpretation as an (unthinkable) alternative to CI.

From a computational point of view StdQM as all possibilities is thus hopeless. The evolution of the multi-dimensional wave function $\Psi (x,t)$ in time is an impossible project. What is possible today is exploration of thinkable realities as long as they are computable.

The exploration can be done starting from RealQM as a computable alternative to StdQM. To see that it is not necessary to take the full step into the impossibility of StdQM, we need explanations of in particular (1) Wien's Displacement Law and (2) the photoelectric effect in terms of classical deterministic physics. This is offered on Computational Blackbody Radiation in the form of classical threshold effects.

It thus appears possible to stay within a framework of deterministic, classical, computable physics, open to exploration of thinkable worlds of microscopics by computation, which is not possible starting from StdQM.

Summary: There is a distinction between (a) specific computable (thinkable) realities, and (b) probabilities of uncomputable possibilities. Your choice! As Schrödinger put it: There is a difference between a specific blurred picture and a precise picture of a fog bank.


Saturday, October 18, 2025

Quantum Restart 2026 from Hydrogen Atom 1926

This year has been designated the International Year of Quantum Science and Technology (IYQ2025) by the United Nations, marking the 100th anniversary of the development of Quantum Mechanics.

Quantum Mechanics was kick-started in 1926 with the formulation of Schrödinger's Equation SE for the Hydrogen atom with one electron, followed by a swift generalisation to many electrons by Born-Heisenberg-Dirac to form the textbook Copenhagen Interpretation CI of today's Standard QM.

StdQM is generally viewed as a formidable success underlying all of modern technology of microscopics, but none of the foundational problems behind the CI have been resolved. StdQM is viewed to "always work perfectly well" but "nobody understands why". 

The previous post recalled the critical moment in 1926 when SE was generalised to many electrons by Born-Heisenberg-Dirac into StdQM under heavy protests from Schrödinger, who had taken the first step with an SE in a wave function $\Psi (x)$ depending on a 3d space coordinate $x$, with $\rho (x)=\Psi^2 (x)$ representing charge density in a classical sense.

Recall that RealQM is a generalisation to many electrons different from StdQM by staying within a framework of classical continuum mechanics in the spirit of Schrödinger. The basic assumption is that an atom with $N$ electrons is represented by a nucleus surrounded by a collection of electrons as    

  • non-overlapping unit charge densities $\rho_i(x)$ for $i=1,\ldots,N$, 
  • free of self-interaction,
  • indivisible in space. 
Let us now compare RealQM and StdQM in the case of Hydrogen. For stationary ground and excited states, so-called eigenstates, they formally share the same SE but with different interpretations of the wave function:

  1. $\rho (x)$ is charge density in a classical sense. (RealQM)
  2. $\rho (x)$ is probability density in the StdQM sense. (StdQM) 

Recall that QM was formed from a perceived difficulty of capturing the spectrum of Hydrogen within classical physics, with the spectrum arising from interaction of the atom with an exterior forcing electromagnetic field in so-called stimulated radiation.

Schrödinger resolved this problem by extending SE to a time-dependent form where the frequencies of the spectrum appeared as differences of stationary energy levels, thus with a linear relation between atomic energy levels and resonance frequencies in stimulated radiation. The discrete frequencies appeared as 

  • beat frequencies of wave functions in superposition. 
This became the mantra of StdQM which has ruled for 100 years, with superposition signifying the break with classical physics, where superposition in a spatial sense is impossible.

If we stay within RealQM, then superposition is impossible because charge densities do not overlap. We now ask the key question:
  • Is it possible to capture the spectrum of Hydrogen within RealQM thus without superposition? 
The discrete stationary eigenstates are the same, and so we ask about the time-dependent form of RealQM. Is it the same as that of StdQM? Not in general, because RealQM is non-linear and StdQM linear. For Hydrogen RealQM is linear, so in this case the same time-dependence as in StdQM is possible.

But this may not be the most natural from a classical point of view without superposition in mind. Instead it is natural to think of the radiating electron oscillating back and forth between two energy levels with different charge densities as a classical oscillating dipole. We can thus extend RealQM to a classical dynamical system swinging back and forth between energy levels with different charge distributions. This would describe the radiating Hydrogen atom in terms of classical physics with a continuous transition between different configurations, and would answer Schrödinger's basic question, left without answer in StdQM, about "electron jumps": The electron does not jump but changes charge density continuously in space and time.
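One way to picture this is the following toy sketch (plain numpy, in 1d; the smooth interpolation between two hypothetical densities is an illustrative assumption, not a statement of the RealQM equations): a density swinging between two centers carries an oscillating dipole moment at the transition frequency.

```python
import numpy as np

# 1d toy: two normalised "charge densities" with different centers,
# smoothly interpolated in time -> an oscillating dipole moment
x = np.linspace(-10, 10, 2001); dx = x[1] - x[0]
rho1 = np.exp(-2*np.abs(x));     rho1 /= rho1.sum()*dx  # lower state, centered at 0
rho2 = np.exp(-2*np.abs(x - 1)); rho2 /= rho2.sum()*dx  # upper state, centered at 1

nu = 2.47e15   # transition frequency nu = dE/h (illustrative Hydrogen value)
for t in np.linspace(0, 1/nu, 5):
    s = np.sin(np.pi*nu*t)**2        # smooth weight oscillating in [0, 1]
    rho = (1 - s)*rho1 + s*rho2      # continuously varying density
    dipole = (x*rho).sum()*dx        # dipole moment follows the oscillation
    print(f"t = {t:.2e} s, dipole = {dipole:.3f}")
```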

The only thing to explain in this scenario is the linear relation between (difference of) energy and frequency, not via beat frequency and superposition, but via the basic relation between energy and frequency appearing in Planck's Law, discussed in this post. 

Summary: It seems possible to capture atomic radiation by RealQM within a classical continuum mechanics framework, and so avoid taking the step out of classical physics, in line with the dream of Schrödinger. In particular, superposition is not required and probably not present. Quantum computers built on superposition will not work. Superposition may be superstition rather than reality.  

fredag 17 oktober 2025

The Tragedy of Schrödinger (and QM)

The atomic world was opened to theoretical exploration in 1926 when Erwin Schrödinger formulated a mathematical model of the Hydrogen atom with one electron in terms of a wave function $\Psi (x)$ depending on a 3d spatial coordinate $x$, with $\Psi^2 (x)$ representing electron charge density. Schrödinger could thus represent the ground state of the Hydrogen atom by the real-valued wave function $\Psi (x)$ minimising the total energy 

  • $E(\psi )=E_{kin}(\psi )+E_{pot}(\psi )$
where 
  • $E_{kin}(\psi )=\frac{1}{2}\int\vert\nabla\psi\vert^2dx$ is kinetic (electron "compression") energy 
  • $E_{pot}(\psi )=-\int\frac{\psi^2(x)}{\vert x\vert}dx$ is Coulomb potential energy
over all functions $\psi (x)$ defined in 3d Euclidean space with $\int\psi^2(x)dx=1$. Using his solid knowledge of Calculus, Schrödinger was very pleased to find the solution in analytical form:
  •  $\Psi (x)=\frac{1}{\sqrt{\pi}}\exp(-\vert x\vert )$. 
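A quick numerical check (a Python sketch in atomic units, added for concreteness): by spherical symmetry the 3d integrals reduce to radial ones, and the analytical solution gives $E_{kin}=1/2$, $E_{pot}=-1$, thus $E=-1/2$:

```python
import numpy as np

# Energies of the Hydrogen ground state Psi(x) = exp(-|x|)/sqrt(pi)
# (atomic units). Spherical symmetry: dx -> 4*pi*r^2 dr.
r = np.linspace(1e-6, 30.0, 100000)
dr = r[1] - r[0]
psi = np.exp(-r)/np.sqrt(np.pi)
dpsi = -np.exp(-r)/np.sqrt(np.pi)          # radial derivative of psi
w = 4*np.pi*r**2*dr                        # radial volume element

print(np.sum(psi**2*w))                    # normalisation ~ 1
print(0.5*np.sum(dpsi**2*w))               # E_kin ~ 0.5
print(-np.sum(psi**2/r*w))                 # E_pot ~ -1.0, so E ~ -0.5
```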

In an outburst of creativity in the Alps in the winter of 1925-26, together with one of his girlfriends, Schrödinger generalised SE to a time-dependent form capturing the observed spectrum of Hydrogen. The success was total, as a first glimpse into the atomic world forming the new focus of modern physics.

The next step beyond Hydrogen was Helium with two electrons. What could the wave function look like for two electrons?  How to generalise from one to many? Two options presented themselves:

  1. Real physics: Add a new non-overlapping charge density for each new electron. 
  2. Formal mathematics: Add a new set of 3d spatial coordinates for each new electron. 
Schrödinger contemplated 2. but did not think it was the right way to go, because the wave function for Helium would involve 6 spatial dimensions and so lack physical representation. For some reason Schrödinger did not pursue 1. either. So right after the big success with Hydrogen, Schrödinger hit the wall. 

But other physicists quickly jumped in (1927), following 2. with enthusiasm:
  • Born-Oppenheimer: Formal generalisation without Pauli Exclusion Principle PEP. Probabilistic interpretation of wave function.
  • Heisenberg-Dirac: Addition of PEP, antisymmetry and Slater determinants.

This forms the basis of Standard QM StdQM also today, in its Bohr-Born-Heisenberg textbook Copenhagen Interpretation.  

To Schrödinger the take-over of his baby came as a shock:  

  • I am not happy with the probability interpretation. In my opinion, it is an ephemeral way to avoid the true problem.
  • The whole antisymmetrization seems to me to be a desperate expedient to save the particles’ individuality. Perhaps it is not fundamental but only an approximation.
  • The use of Slater determinants in atomic theory seems to obscure the physical picture even more.
  • I don’t like it, and I’m sorry I ever had anything to do with it.
  • The $\psi$-function as it stands represents not a single system but an ensemble of systems. It does not describe a state of one system in configuration space but rather an ensemble of systems in ordinary space. The $\psi$-function itself, however, does not live in ordinary space, but in the configuration space of the system. And not merely as a mathematical device — it really exists there. That is what is so repugnant about it.

Schrödinger thus quickly became incompatible with StdQM and was accordingly marginalised, along with Einstein, who shared similar criticism.

RealQM represents a new initiative to follow 1. as real physics in the spirit of Schrödinger. What would Schrödinger have said about RealQM, which lay on the table already in 1927? What would have happened if Schrödinger had been allowed to take care of his own baby and not given it away to others? 


torsdag 16 oktober 2025

From Abstract to Concrete by Computation

The computer changes the practice of science and technology. Let us see if the computer also changes the nature and role of theory as expressed in mathematical models. 

A classical Newtonian paradigm is to formulate a model as a set of differential equations describing laws of physics, such as Newton's laws of motion, as a dynamical system with state $U(t)$ depending on time $t$ satisfying (with the dot signifying differentiation with respect to time):

  •  $\dot U(t)=F(U(t))$ for $0<t\le T$ with $U(0)$ given and $T$ a given final time,      (N)

where $F(v)$ is a given function of $v$. We call the function $U(t)$ the trajectory of the system.

Calculus was developed to solve (N) using symbolic mathematics of integrals and derivatives, which worked for a limited set of dynamical systems. When symbolic solution failed or was too complicated, numerical solution could always be used in the form of time-stepping

  •  $U(t+dt) = U(t)+dt\,F(U(t))$ with $dt>0$ a small time step,               (C)
allowing successive computation of $U(t)$ for $0<t\le T$, without the use of symbolic Calculus.
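As a minimal sketch in Python (the model choice $F(U)=-U$ is an assumption for illustration), (C) reads:

```python
import math

# Forward Euler time-stepping (C) for dU/dt = F(U) with F(U) = -U,
# whose exact solution is U(t) = U(0)*exp(-t).
def F(U):
    return -U

T, dt = 1.0, 1e-4
U, t = 1.0, 0.0          # initial value U(0) = 1
while t < T:
    U = U + dt*F(U)      # one step of (C)
    t += dt

print(U, math.exp(-T))   # numerical ~ exact ~ 0.3679
```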

Before the computer, (C) could be too time-consuming, and so symbolic Calculus was developed by reformulating (N) into a new paradigm of Lagrangian mechanics, where a specific dynamical system solution $U(t)$ was not described by (N) but instead by a Variational Principle VP stating that the integral

  • $\int_0^TL(u(t))dt$                               (VP)

with Lagrangian $L(v)$ (determined by $F(v)$) depending on an arbitrary trajectory $u(t)$, has vanishing variation under small variations of $u(t)$ around $U(t)$. 

The differential equation (N) with one specific solution $U(t)$ was thus replaced by a VP involving variation over many trajectories $u(t)$. The generality of the VP formulation turned 18th century mechanics into Lagrangian mechanics, thus from (N) to (VP). This was a step towards abstraction, from concrete Newtonian mechanics to Lagrangian mechanics based on an abstract VP.
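As a standard one-line illustration of how a VP encodes (N) (textbook calculus of variations, added here for concreteness): for the Lagrangian $L=\frac{1}{2}m\dot u^2-V(u)$, stationarity of the action gives the Euler-Lagrange equation

```latex
\frac{d}{dt}\frac{\partial L}{\partial \dot u}-\frac{\partial L}{\partial u}
= m\ddot u + V'(u) = 0,
\quad\text{i.e.}\quad m\ddot u = -V'(u),
```

which is Newton's law in second-order form.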

We have seen that (N) has a natural computational form in (C), while the computational form of a VP is not direct, since comparison over a rich set of variations is not efficient.

With the computer there is thus today a shift from VP back to (N). From abstract to concrete, because computation is concrete like taking yet another time step forward.

This has important implications, because the generalisation of classical mechanics to quantum mechanics has followed the path of Lagrangian abstraction, taken to an extreme in Quantum Field Theory by Feynman as a "sum over all paths".

In particular, the generalisation of Schrödinger's equation from one electron to many electrons took an abstract path into multi-dimensional wave functions, which has haunted Quantum Mechanics from the start. 

RealQM offers a concrete generalisation which directly lends itself to computation.

Summary: Computation takes concrete form and so naturally connects to concrete differential equation formulations rather than to some abstract variational principle. 

PS1 Recall that light can be viewed as propagating according to a principle of least time (without direct computational realisation), as an alternative to wave propagation (with direct computational form).

PS2 Certain configurations can be characterised as minimising energy, which can be resolved computationally by a gradient method as a form of time-stepping.

onsdag 15 oktober 2025

How Computation Changes Theoretical Physics of TD and QM

The physical theories of Thermodynamics TD and Quantum Mechanics QM were both formed before the computer, and so do not include the aspects of computation, computability and computational work. Both theories focus on separate equilibrium states rather than dynamical evolution between different states: statics rather than dynamics, because dynamics is more demanding, including evolution in time.  

The computer changes the game by offering computational power allowing computational simulation of the evolution in time of dynamical systems, and so opens the way to a better understanding of the World as it evolves from one instant of time to the next in a process of time-stepping. 

The change is fundamental and opens entirely new possibilities, including the resolution of fundamental unresolved problems in TD and QM connected to the static nature of these theories in standard form. 

Thus the 2nd Law of TD can be given an explanation based on finite precision computation confronting instability, as explained in Computational Thermodynamics. 

Likewise, the basic unresolved foundational problems that have troubled QM for 100 years can be circumvented by a reformulation into Real Quantum Mechanics RealQM, which can be explored in dynamical form by computation. 

Let us here focus on the dynamical aspects of RealQM, which come in two forms: (i) time-periodic and (ii) dynamical evolution between equilibrium states. 

RealQM takes the following form for an atomic system consisting of $N$ electrons as non-overlapping unit charge densities, together with a set of atomic nuclei, for simplicity treated as particles at fixed positions, all interacting by Coulomb forces. The system can be described by a complex-valued wave function $\psi (x,t)$ depending on a 3d spatial coordinate $x$ and time $t$, satisfying a Schrödinger Equation SE of the form 

  • $i\dot\psi (t)+ H\psi (t) = f(t)$       (SE)
where the dot denotes differentiation with respect to time, $H$ is a Hamiltonian acting on $\psi$ and $f(t)$ is an exterior driving force. Here $\vert\psi (x,t)\vert^2$ has a direct physical meaning in 3d space as charge density. 

We note that (SE) has the form of a classical dynamical system in a function $\psi (x,t)$ with direct physical meaning, depending on a 3d spatial coordinate $x$ and time $t$. Given an initial value $\psi (x,0)$, the value of $\psi (x,t)$ at a later time $t>0$ can be determined by resolving (SE) by time-stepping, with computational work scaling linearly with $N$ in the case of RealQM, but exponentially in StdQM.
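As a minimal sketch of such time-stepping (Python; a 1d free-particle Hamiltonian with $f=0$ is a stand-in assumption, and the norm-preserving Crank-Nicolson scheme is one natural choice, not prescribed by RealQM):

```python
import numpy as np

# Time-stepping i*dpsi/dt + H*psi = 0 (f = 0) in 1d with
# H = -(1/2) d^2/dx^2, by the norm-preserving Crank-Nicolson scheme:
# (i/dt + H/2) psi_new = (i/dt - H/2) psi_old.
n, dt = 400, 0.01
x = np.linspace(-20, 20, n)
h = x[1] - x[0]
D2 = (np.diag(np.ones(n-1), -1) - 2*np.eye(n) + np.diag(np.ones(n-1), 1))/h**2
H = -0.5*D2

A = 1j*np.eye(n)/dt + H/2
B = 1j*np.eye(n)/dt - H/2

psi = np.exp(-x**2)*np.exp(2j*x)               # moving wave packet
psi /= np.sqrt(np.sum(np.abs(psi)**2)*h)       # unit total "charge"

for _ in range(200):
    psi = np.linalg.solve(A, B @ psi)          # one time step

print(np.sum(np.abs(psi)**2)*h)                # norm stays ~ 1
```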

A time-periodic solution (with $f(t)=0$) can take the form 
  •  $\psi (x,t)=\exp(iEt)\Psi (x)$. 
  • $H\Psi =E\Psi$. 
  • $\Psi$ eigenstate and $E$ (real) eigenvalue as energy.
Here the eigenstates appear as static states. The eigenstate with smallest energy is the ground state of the system. It is possible to compute the ground state by parabolic relaxation in the form of time-stepping of 
  • $\dot\Psi + H\Psi =0$ with renormalisation to unit charge, 
which can be seen as a gradient method towards minimum energy, as an actual physical process when an atom or molecule finds its minimum-energy equilibrium state. A minimal sketch follows below.
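As a sketch of such parabolic relaxation (Python; a 1d harmonic oscillator with exact ground state energy $1/2$ serves as a stand-in Hamiltonian, an assumption for illustration):

```python
import numpy as np

# Parabolic relaxation dPsi/dt + H*Psi = 0 with renormalisation,
# for H = -(1/2) d^2/dx^2 + x^2/2 (exact ground state energy 0.5).
n = 400
x = np.linspace(-10, 10, n)
h = x[1] - x[0]
D2 = (np.diag(np.ones(n-1), -1) - 2*np.eye(n) + np.diag(np.ones(n-1), 1))/h**2
H = -0.5*D2 + np.diag(0.5*x**2)

Psi = np.exp(-x**2)                        # rough initial guess
Psi /= np.sqrt(np.sum(Psi**2)*h)

tau = 1e-3                                 # relaxation time step
for _ in range(20000):
    Psi = Psi - tau*(H @ Psi)              # gradient step towards minimum energy
    Psi /= np.sqrt(np.sum(Psi**2)*h)       # renormalise to unit charge

print(np.sum(Psi*(H @ Psi))*h)             # energy ~ 0.5
```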

But (SE) also opens the way to computational simulation of genuine dynamical evolution between physical states described by charge density under exterior forcing. 

RealQM thus offers a new capacity for computational simulation of complex atomic systems in terms of charge density, as real physics with clear meaning.

RealQM should be compared with StdQM, based on a multi-dimensional Schrödinger Equation StdSE with only a probabilistic physical meaning and exponential computational complexity, requiring drastic reduction into physics with unclear meaning.

Notice that RealQM stays within the classical world of continuum physics and so does not meet the unresolvable problems of StdQM: (i) the meaning of the wave function, (ii) the role of measurement and (iii) computational complexity. There is no need for any special Philosophy of RealQM, as there is for StdQM. 

Summary: Computation offers a new tool for the simulation and understanding of atomic systems when applied to RealQM as a computable model with clear physics. Thus, 100 years after its conception, QM may take a leap into a new era of Computational QM, leaving the unresolved foundational problems of StdQM behind as irrelevant. 

PS The typical reaction to RealQM is not to welcome it as something possibly offering new capabilities and relief from old troubles, but rather the opposite: to treat it as an unwanted disturbance to a status quo resting in full agreement that "nobody understands QM", a "soft pillow" in the words of Einstein. 


tisdag 14 oktober 2025

Physics as Becoming as Computational Process

Recent posts have discussed the role of Planck's constant $h$ in Standard Quantum Mechanics StdQM, presented as the smallest quantum of action and as one of Nature's deepest secrets. It can thus be of interest to seek to understand the concept of action, formally energy $\times$ time or momentum $\times$ length, combinations without clear physical meaning.

Let us then ask if concepts in physics which have a more or less direct physical representation have a special role. We thus compare concepts like mass, position, time, length, velocity, momentum and force, which have physical representations, with the concepts of energy and action, which are not carried the same way in physical terms.    

To seek an answer, recall that in the age of the computer it is natural to view the World as evolving from one time instant to the next in processes involving exchange of forces, which can be simulated in computations involving exchange of information: computational dynamical systems where the dynamics of the World is realised/simulated in time-stepping algorithms. Stephen Wolfram has presented such a view. It is a computational form of the general idea of a World evolving in time from one time instant to the next.

This is a World of becoming with focus shifting from what the World is to what the World does, from state to process. 

The time-stepping process for the evolution of the state of a system described by $\Psi (t)$ from time $t$ to time $t+dt$ with $dt$ a small time step, takes the form 

  • $\Psi (t+dt) =\Psi (t)+dt\times F(t)$   (or $\frac{d\Psi}{dt} = F$)   (P)

where $F(t)$ represents the force acting on the system at time $t$, which may also depend on the present state $\Psi (t)$. We speak here about 

  • state $\Psi (t)$
  • force $F(t)$ 
  • process (P).
We see that the concept of state (what is) is still present, but we can bring the process (becoming) forward as the main concern, including the force $F(t)$. 

We can describe such a world as a Dynamical Newtonian World based on Newton's Law
  • $\frac{dv}{dt}=\frac{f}{m}$ or $v(t+dt)=v(t)+dt\times F(t)$,
with $v(t)$ velocity, $m$ mass, $F(t)=\frac{f(t)}{m}$ and $f(t)$ force, as in the sketch below. 
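A minimal sketch of such Newtonian time-stepping (Python; a unit mass on a spring with $f=-x$ is an assumed model for illustration):

```python
import math

# Newtonian time-stepping v(t+dt) = v(t) + dt*F(t) for a unit mass
# on a spring, f = -x, with the velocity updated first (symplectic Euler).
dt, T = 1e-3, 10.0
x, v, t = 1.0, 0.0, 0.0          # initial position and velocity
while t < T:
    v = v + dt*(-x)              # dv/dt = f/m with m = 1
    x = x + dt*v                 # dx/dt = v
    t += dt

print(x, v)                      # close to exact cos(10) ~ -0.84, -sin(10) ~ 0.54
```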

This is the ever-changing world of Heraclitus, based on state, force and process with physical representations. 

But there is also the world of Parmenides, a static world like Einstein's space-time block Universe. 

The idea of a space-time block Universe is present in the minds of theoretical physicists speaking about physics governed by a Principle of Stationary Action as 
  • stationarity of $A(\Psi )\equiv\int_0^T L(\Psi (t))dt$      (PSA)
where $A(\Psi )$ is the action, $L$ is a Lagrangian depending on $\Psi (t)$, and $t=0$ is an initial time and $T$ a final time for the dynamical system $\Psi (t)$. PSA means that the actual evolution $\bar\Psi$ is characterised by vanishing change of the total action $A(\Psi )$ under small variations of $\Psi$ around $\bar\Psi$. We note that the action $A(\Psi )$ does not have a direct physical representation but requires a counting clerk to take a specific value.

PSA is not realised by computing $A(\Psi )$ for all $\Psi$ and then choosing the true $\bar\Psi$ from stationarity, since the amount of computational work is overwhelming. Instead PSA is realised by time-stepping, as a form of (P) with suitable $F$.

Before the computer, a problem formulation in terms of PSA was often preferred, because the Lagrangian had a given analytical form allowing (P) to be formulated and $\Psi (t)$ to be determined analytically. But the range of applications was very limited.

With the computer, the focus shifts to (P) allowing unlimited generality. The shift is from PSA where action does not have a physical representation to (P) with physical representation.

Let us now return to $h$ as the smallest quantum of action, with the insight that action is a concept in the head of a counting clerk without direct physical representation, and so, then, is a smallest quantum of action. 

This adds to the discussion in recent posts questioning the role of Planck's constant as a fundamental constant of Nature. It does not seem to be so fundamental after all. There is no smallest quantum of action in Nature.

We can compare (P) with a computational gradient method to find an equilibrium state characterised by minimal energy, again with a physical representation of (P) but not of energy.