Thursday 6 November 2025

The Black Hole of Quantum Computing

Standard Quantum Mechanics StdQM, based on a multi-dimensional Schrödinger Equation SE, is viewed to have exponential digital computational complexity, effectively making SE uncomputable even on supercomputers and thus useless for digital modeling of atoms/molecules in practice.

Quantum computing is an attempt to model atoms and molecules instead by analog computation performed on quantum computers capable of meeting exponential complexity. A quantum computer models a physical quantum system of atoms/molecules not by SE, but by another (simpler) physical quantum system which is controllable and which is itself described by SE.

A quantum computer thus offers a model/map of a real system which is itself a real system of the same form and type. The quantum computer model can be compared to a 1:10 exact physical model of an airplane or ship with some details removed. The digital computational complexity of the original physical system is then irrelevant, since no digital computation is performed, only a form of analog computation performed in a physical model of essentially the same computational complexity.

Quantum computing thus represents a step back from the scientific revolution based on Calculus, which offers mathematical models of reality in abstract symbolic form based on numbers, which can be made alive by digital computing.

Quantum computing throws away digital Calculus because it is deemed useless for this task, and seeks a replacement which works in terms of analog computation.

But is it possible that real quantum systems really perform analog computations of exponential complexity? If a digital computer does not have the capacity to meet exponential complexity, what says that a real system, as it evolves, can somehow express exponential complexity?

We know that the exponential complexity of SE comes from the multi-dimensionality of the wave function, representing possibilities rather than actualities. But real systems do not realise all possibilities, only a few actualities, and so the computational complexity of SE as model is much bigger than the complexity of what is modeled. The model/picture totally swamps the origin, and so asks for infinite digital computational power to no avail as concerns the origin.

RealQM is a version of the quantum mechanics of atoms and molecules, which is digitally computable and so does not need quantum computing.


Wednesday 5 November 2025

Quantum Computing as Theoretical Fantasy

Here is a summary by chatGPT of Dyakonov's view on the realism of quantum computing:

Physicist Mikhail Dyakonov has been one of the most prominent skeptics of large-scale quantum computation. His main critique, as of 2019, can be summarized as follows:

  1. Extreme Fragility of Quantum States – Quantum bits (qubits) are extraordinarily sensitive to any external disturbance. Maintaining coherent quantum states long enough for complex computations is, in his view, practically impossible.

  2. Error Correction Limits – While quantum error correction is theoretically possible, Dyakonov argues that the real-world requirements (millions of physical qubits per logical qubit, perfectly calibrated control systems, etc.) make it infeasible.

  3. Unrealistic Scaling – He believes scaling from a few dozen qubits to millions (as required for useful computation) involves an exponential explosion of complexity in control, synchronization, and noise management.

  4. Overly Optimistic Theory vs. Practice – Dyakonov claims that most of quantum computing theory assumes idealized conditions that cannot be realized physically. The engineering challenges are, in his words, “beyond human capabilities.”

  5. Skepticism Toward “Quantum Supremacy” Claims – He views demonstrations like Google’s 2019 experiment as contrived and not indicative of practical progress toward general-purpose quantum computing.

In essence, Dyakonov argues that large-scale quantum computers are a theoretical fantasy, comparable to perpetual motion machines — possible on paper, but not in reality.

end chatGPT

Let us compare with the latest hype:

  • Google’s Willow Chip Achieves Historic Quantum Computing Breakthrough
  • Willow Chip Powers Verifiable Quantum Advantage, Running Algorithm 13,000x Faster Than Supercomputers.
This is what chatGPT says about the Willow chip:
  • Dyakonov’s scepticism remains valuable and largely justified — many of his fundamental concerns (noise, control, scale, assumption-validity) are not yet fully overcome.
  • The Willow chip is important: it represents a tangible, hardware-driven step toward the goals of large-scale quantum computing. It shows we are making progress on the error-correction and scaling front.
  • However, Willow does not yet fully invalidate Dyakonov’s concerns. It addresses some, but many of the major leaps (millions of qubits, full fault-tolerance, broad practical algorithmic supremacy) are still ahead.
  • In other words: Willow moves the bar, but the “gap” that Dyakonov insisted on remains significant. The achievement doesn’t guarantee the remaining engineering challenges will vanish.
  • If I were to pick one key takeaway: Willow demonstrates that at least some of the previously theoretical obstacles (error-correction scaling) can be approached in hardware — which means the sceptical “maybe impossible” thesis is being challenged. 
  • But “practically useful, general-purpose quantum computers in the near-term” still remain uncertain.
It seems that quantum computing is still a theoretical fantasy, more than 40 years after the idea was launched. 

Recall that quantum computing is based on unitary evolution of quantum systems of thousands of qubits in superposition of possibilities, which is fantasy physics. Compare with the first proposition of Wittgenstein's Tractatus:
  • The World is all that is the case.
It is clear that "to be the case" requires more than what is present in a quantum system of possibilities, which means that according to Wittgenstein a quantum computer does not belong to the World. But a quantum computer is an analog computer and as such must belong to the World. Wittgenstein would thus view the Willow chip with utter skepticism. And you?

Recall that the idea of a quantum computer is to model an uncontrollable real/analog quantum system, as part of the World, by a controllable real/analog quantum system, also part of the World, with the caveat that the model is not "the case" because it plays with possibilities and not with realities.

Notice that this contradiction does not appear with a digital computer, because the computing is abstract and mathematical and so does not need real analog computing.


Monday 3 November 2025

Hydrogen Spectrum as Classical Physics

This is a continuation of the previous post on the necessity to give up classical physics for quantum physics in its textbook form of Standard Quantum Mechanics StdQM. We ask to what extent RealQM, as a form of classical physics, can explain the observed spectrum of the Hydrogen atom as expressed in stimulated radiation. We will thus compare

  • StdQM: Spectral line appears from superposition of wave functions of eigenstates. 
  • RealQM: Spectral line appears from oscillation between charge distributions of eigenstates.
In both cases the frequency of the spectral line scales with the difference of energy between eigenstates, but with different explanations: 
  • StdQM: Spectral frequency appears as beat frequency of wave functions with eigenfrequency variation in time according to Schrödinger's equation. The connection between frequency and energy is secondary. Radiation can appear to be spontaneous.
  • RealQM: Spectral frequency is not a beat frequency, but simply the frequency $\nu=\frac{\Delta E}{h}$ which matches the energy difference $\Delta E$ between eigenstates, with $h$ Planck's constant, in an assumed coupling between frequency and energy. There is an active exchange of energy between atom and radiation field with frequency matching the jump in energy. The atom is forced to respond to radiation of a certain frequency as a dipole. The radiation is not spontaneous. 
We see that StdQM offers an explanation in terms of time-dependent quantum mechanics without realism, while RealQM relies on the formal coupling between matter and radiation expressed by $E=h\nu$ appearing in blackbody radiation. Compare with this post on the physical meaning of $E=h\nu$.
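
As a concrete illustration of $\nu=\frac{\Delta E}{h}$, here is a minimal Python sketch for the Hydrogen $2\to 1$ transition, using the textbook level formula $E_n=-13.6/n^2$ eV (the constants are standard; the code is an illustration only):

    # Frequency of the Hydrogen 2 -> 1 spectral line from nu = dE/h
    h = 6.62607015e-34        # Planck's constant [J s]
    eV = 1.602176634e-19      # one electron volt [J]

    def E(n):
        # textbook Hydrogen energy levels [J]
        return -13.6 * eV / n ** 2

    dE = E(2) - E(1)          # energy difference, about 10.2 eV
    nu = dE / h               # matching spectral frequency
    print(nu)                 # about 2.47e15 Hz (the Lyman-alpha line)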

We see that, at least in the case of stimulated radiation, the spectrum of an atom can be given a RealQM semi-classical explanation. It is not clear that StdQM offers something more enlightening. Or?

This discussion connects to quantum computing discussed in recent posts, with StdQM supposed to support delicate superpositions of wave functions, free of forcing, performing complex analog computations, while RealQM brings forward the aspect of forcing in terms of classical physics. 

PS Here is a chatGPT comment.

The Double-Slit Experiment as Classical Physics

The double-slit experiment with one single photon at a time is the sledgehammer designed to kill classical physics and so open the way to a new form of physics named quantum mechanics as the essence of modern physics.

The experiment consists of sending a sequence of photons one at a time through a double slit in a thin wall and recording the emergence of a macroscopic interference (fringe) pattern on a screen behind, with a new dot assumed for each new photon. 

The pattern is the same as the interference pattern generated by a classical macroscopic wave hitting the wall and generating two macroscopic waves from the slits, which interfere constructively and destructively and so form a macroscopic fringe pattern on the screen.

We thus have a quantum version of the experiment with one photon at a time and a classical version, both building the same macroscopic fringe pattern on the screen. 

The catch of the quantum explanation of the quantum version is that the single photon, after somehow passing through the double slit with a left and a right slit, somehow appears after the slit in a superposition of "left pass" and "right pass" as two mutually exclusive possibilities carrying an interference pattern of possibilities, which is made real dot by dot on the screen (by "collapse of the wave function").

The sledgehammer argument is now that the quantum version cannot be given a classical explanation, because a classical particle/photon has to make the choice of "left pass" or "right pass" which will remain after passing, and so does not form any interference pattern, just two one-slit patterns. 

The strategy is thus to promote quantum physics by an experiment arranged to show deficiency of classical physics. 

Let us take a closer look at the quantum version asking what the physical meaning of "one photon at a time" may be. What is the evidence that only one photon at a time reaches the double slit? How to produce exactly one photon? 

It seems that the evidence of "one photon at a time" is that there is "one dot at a time appearing on the screen". The argument is thus that one dot is generated by exactly one photon. Is this really so sure?

What if many photons are in fact involved, with the capacity to pass both slits and form a real but very weak classical interference pattern before hitting the screen with a random excitation of one dot? The calibration of the intensity of the source would then be adjusted to produce one dot at a time, which de facto involves many photons capable of generating an interference pattern with random realisation.

It thus may be possible to explain a "one dot at a time" double slit experiment in terms of classical wave physics. The sledge hammer argument would then vaporise.  
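
The point can be illustrated with a minimal Python sketch (an illustration only, not a physical model): dots are drawn one at a time from a classical two-slit intensity $I(\theta)\sim\cos^2(\pi d\sin\theta /\lambda)$, and the fringe pattern emerges dot by dot regardless of how many photons are behind each dot. The slit separation and wavelength are arbitrary assumed values.

    import numpy as np

    lam, d = 500e-9, 2e-6                     # assumed wavelength and slit separation [m]
    theta = np.linspace(-0.2, 0.2, 1000)      # screen angles [rad]
    I = np.cos(np.pi * d * np.sin(theta) / lam) ** 2   # classical two-slit intensity

    rng = np.random.default_rng(0)
    dots = rng.choice(theta, size=5000, p=I / I.sum()) # one random dot at a time

    counts, _ = np.histogram(dots, bins=50)   # accumulated dots reproduce the fringes
    print(counts)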

PS After a discussion with chatGPT we agree that the one-photon experiment is extreme, but we seem to disagree about the value of extreme experiments to guide the development of non-extreme physics. Why not stick to non-extreme experiments for non-extreme physics? What can be learned about normality by pushing into non-normality? What is to be learned about normal life from extreme violence? chatGPT seems to believe in the value of extremism in physics more than I. And you?

Sunday 2 November 2025

Why Is Analog Quantum Computing Needed?

Quantum computing is motivated by a perception that simulating atomic physics, described mathematically by the Schrödinger Equation SE of Quantum Mechanics QM, is exponentially hard and so impossible in practice. This is because SE for a system with $N$ electrons involves $3N$ spatial dimensions, with computational work increasing exponentially with $N$.
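
To see the scaling concretely, here is a minimal Python sketch counting the grid points needed for a naive discretisation of the $3N$-dimensional wave function, assuming (for illustration only) a modest 10 points per dimension:

    # Grid points for a 3N-dimensional wave function with g points
    # per dimension: g**(3N), growing exponentially with N
    g = 10                            # assumed resolution per dimension
    for N in (1, 2, 5, 10, 100):
        print(N, float(g) ** (3 * N))
    # N=1: 1e3, N=2: 1e6, N=10: 1e30, N=100: 1e300 -- far beyond any computer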

In other words, digital simulation of QM is viewed to be so computationally demanding that the alternative of analog simulation must be explored. This is the idea of analog quantum computing launched by Richard Feynman more than 40 years ago:

  • Simulate a real quantum system by a controllable laboratory quantum system. 
This is the same idea as testing a physical model of a real airplane in a wind tunnel under controllable conditions, or building a toy model of a bridge and testing its bearing capacity. No mathematics is needed, just craftsmanship.

The basic idea is thus to give up building mathematical models of reality in terms of Cartesian geometry based on numbers with digital representation, which is the scientific method behind the evolution of the modern industrial/digital society. 

Such a step can be seen as a step back to a more primitive science based on analog modeling without mathematics. 

In any case, massive investment is now going into creating quantum computers as controllable analog quantum systems. The design work has to cope with the perceived impossibility of testing different designs using mathematical digital modeling, and so has to rely on tricky experimental testing. The time frame for a useful analog quantum computer appears to be decades rather than years.

With this perspective it is natural to ask if the exponential computational complexity of the microscopics of quantum mechanics is written in stone. The macroscopics of continuum mechanics rarely comes with exponential complexity, because evolving a macroscopic system, like the weather, one time step forward involves only local connections in 3 space dimensions, which gives polynomial complexity. 
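
As an illustration of the polynomial claim, here is a minimal Python sketch of one explicit time step of a 3d diffusion equation on an $n^3$ grid: each point is updated from its six neighbours, so the work per step is proportional to $n^3$, not exponential (grid size, time step and the periodic boundary treatment are arbitrary assumptions):

    import numpy as np

    n = 50                            # assumed grid size: n**3 points
    u = np.random.rand(n, n, n)       # current state
    dt = 0.1                          # assumed time step

    # discrete Laplacian from the six nearest neighbours (periodic boundaries)
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) +
           np.roll(u, 1, 2) + np.roll(u, -1, 2) - 6 * u)

    u = u + dt * lap                  # one time step: O(n**3) operations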

If macroscopics has polynomial complexity, then microscopics on smaller scales should have it as well. RealQM offers a version of quantum mechanics of polynomial complexity. If nothing else, it can be used to test different designs of an analog quantum computer. Want to try RealQM?

Another mission of quantum computing, put forward to motivate investors, is improved potential for factorisation of large natural numbers, with the promise to break cryptographic codes. But analog computation about properties of numbers, instead of digital computation, appears far-fetched.

PS Recall that at each clock cycle
  • a digital computer operates on $n$ factual states
  • a quantum computer operates on $2^n$ possible states  
with the simplistic promise of an enormous increase of capacity, from linear to exponential. Is it too good to be true?
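
For scale, a minimal arithmetic sketch in Python of what the exponential promise amounts to:

    # Number of possible states (amplitudes) for n qubits: 2**n
    for n in (10, 50, 300):
        print(n, 2 ** n)
    # n=10: 1024; n=50: about 1.1e15; n=300: about 2e90, far more than
    # the estimated number of atoms in the observable universe (~1e80)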

Saturday 1 November 2025

Quantum Computing Without Mathematics

Schrödinger's Equation SE for the Hydrogen atom with one electron, formulated in 1926 by the Austrian physicist Erwin Schrödinger as a model of a negative electron charge density subject to Coulomb attraction from a positive nucleus, was generalised to atoms with many electrons by a formal mathematical procedure: adding a new independent 3d Euclidean space for each electron to form a linear multi-dimensional SE with $3N$ spatial dimensions for an atom with $N$ electrons. This multi-d SE came to form the foundation of the modern physics of Quantum Mechanics. 

The mathematics of the multi-d SE was quickly formalised by the mathematician von Neumann into highly abstract functional analysis in Hilbert spaces, as a triumph of symbolic abstract mathematics. The physics of real atoms was thus hijacked by mathematicians, while the task of making physical sense of the abstraction was left to physicists without the mathematical training required to make real sense of von Neumann's functional analysis. The problem of physical interpretation remains unresolved today, which is behind the present manifest crisis of a modern quantum physics hijacked by mathematics.

The multi-d SE turned out to harbour a serious problem when confronted with mathematical computation. Because of the many dimensions the computational complexity turned out to be exponential, which made SE uncomputable on digital computers, and so in effect useless. 

Abstract mathematics had created a model of real physics which turned out to be an uncomputable monster, not useful except as an exercise in functional analysis.

Quantum computing is a new form of computing, fundamentally different from digital computing as mathematical computing with numbers. The idea was launched in the early 1980s by the physicist Richard Feynman as a new approach to tackle the uncomputability of QM. The radical idea was to replace uncomputable functional analysis by a form of analog quantum computation, where a real atomic quantum system is modeled in a laboratory by another real quantum system acting as an analog quantum computer.

Recall that the option of replacing a mathematical model by an analog model is also used classically, when a model of an airplane is studied in a wind tunnel instead of solving the Navier-Stokes equations, deemed impossible to solve.

The success of the mathematisation of quantum physics into functional analysis in Hilbert spaces a hundred years ago carried its own destruction by coming with exponential complexity, which could not be met within mathematical computation. 

Heavy investment is now being directed into building a quantum computer supposed to function according to a mathematical formalism, which is at the same time being replaced by analog quantum computing.

Does this appear strange? To build on a mathematical quantum mechanics which is replaced by analog quantum computing based on what was replaced? What would Wittgenstein say about something like that? In any case, the investment in quantum computing is high risk. 

RealQM offers a way out of this mess, in the form of a different Schrödinger equation which is computable as digital mathematics and thus does not have to be replaced by analog quantum computing. Why not give it a try?

 

Monday 27 October 2025

Is Quantum Computing Possible?

Quantum superposition is the crucial component of Quantum Mechanics believed to open a new world of quantum computing on quantum computers.    

An $n$-qubit quantum computer with $n=2,3,\ldots$ is supposed to operate in parallel on $2^n$ states in superposition, to be compared with a classical digital computer operating on $n$ bits at a time, where a bit can take the value 0 or 1. Quantum computing thus promises exponential speed-up vs digital computing.   

The idea of quantum computing was first presented by Richard Feynman in an attempt to get around the apparent exponential complexity of Quantum Mechanics QM based on Schrödinger's multi-dimensional wave equation, which makes digital simulation impossible by demanding computational work scaling with $2^N$ for a system with $N$ electrons. The idea was to replace digital simulation with some form of analog quantum computation, where the simulation of a quantum system would be performed on a quantum system. An intriguing idea, but could it work? The apparent exponential complexity would then be met with exponential computational power, making simulation possible: meeting a perceived difficulty by ramping up capacity, or simply "fighting fire with fire". 

Let us then consider the basic idea of quantum computing, which is 

  • Simultaneous operation on states in superposition.       

What then is "superposition"? The mathematical answer is the following: Consider a linear algebraic or differential equation with solutions $\psi_1$ and $\psi_2$. Then the algebraic sum $\psi =\psi_1+\psi_2$ is also a solution, and $\psi$ is viewed to be the superposition of $\psi_1$ and $\psi_2$ with the + sign signifying algebraic sum. 

As concerns classical physical wave mechanics, the algebraic superposition can take two forms, with physicality of (i) the algebraic sum $\psi$ or (ii) the individual terms $\psi_1$ and $\psi_2$. Case (i) represents the interference pattern seen on the surface of one pond, while (ii) represents the beat interference generated by two vibrating strings with nearby frequencies. 
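
Case (ii) can be made concrete with a minimal Python sketch: two unit waves with nearby frequencies sum, by the identity $\sin A+\sin B=2\sin\frac{A+B}{2}\cos\frac{A-B}{2}$, to a fast oscillation modulated at the beat frequency $\vert f_1-f_2\vert$ (the chosen frequencies are arbitrary assumptions):

    import numpy as np

    f1, f2 = 440.0, 444.0             # assumed nearby frequencies [Hz]
    t = np.linspace(0.0, 1.0, 44100)  # one second of samples
    s = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

    # slow envelope 2*cos(pi*(f1 - f2)*t) pulsating at the beat frequency
    env = 2 * np.abs(np.cos(np.pi * (f1 - f2) * t))
    print("beat frequency:", abs(f1 - f2), "Hz")   # 4 beats per second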

As concerns QM, the physicality is supposed to be displayed in the double-slit experiment: Let a single photon/electron pass a double slit and then be detected on a screen behind as a spot. Repeat the experiment many times and notice a fringe pattern appearing on the screen, which is the same interference pattern developed by a macroscopic wave passing through both slits. The photon/electron after passing the double slit represents the quantum superposition supposed to carry quantum computing. This is not a superposition of realities as in (ii) above, but a superposition of possibilities made real by repetition of the one-photon/electron experiment, which represents the physics of a 1-qubit: the single photon/electron is, by passing through the double slit, put into a superposition of passing the left slit and passing the right slit, not as realities but as possibilities. It is essential here that the superposition concerns only one photon/electron in a superposition of $2^1=2$ states. This is supposed to generalise to $n$ entangled photons/electrons in a superposition of $2^n$ possible states.

The evidence that quantum computing is possible thus boils down to the double-slit experiment. If one photon/electron can be put in a superposition of two states which appears to support interference with a fringe pattern, then construction of an $n$-qubit computer may be possible. If two photons/electrons are needed for two states, then we are back to classical computing with bits.

The crucial question is now: Is it sure that because there is one click on the screen at a time, the input is one photon/electron at a time?

Think of this, and return to see a more precise analysis.

QM in its standard multi-dimensional form has exponential complexity, which requires exponential computing power. RealQM is an alternative with polynomial complexity which can be met by classical computing.  

Maybe quantum computing is neither possible nor needed? 


Friday 24 October 2025

Quantum Physics as HyperReality = Crisis

Modern physics as Quantum Mechanics QM is based on a (complex-valued) wave function $\Psi (X)$ depending on a $3N$-dimensional configuration space coordinate $X$ for an atomic system with $N$ electrons, giving each electron a separate 3d Euclidean space. 

Configuration space is a mathematical construct which does not have any physical representation for $N>1$. The wave function as a function defined on configuration space shares the same lack of physical representation. The wave function evolves in time as a mathematical construct satisfying a Schrödinger equation. 

To give QM a meaning beyond a mathematical game it is necessary to give the wave function a physical meaning, which has drawn much attention without any common view ever being formed. The credibility of modern physics has suffered from this failure, which can be seen as the main reason for its present, commonly witnessed crisis, summarised as follows: 

  • classical physics = mathematical model with physical origin.   
  • quantum physics = mathematical model without physical origin.     

To handle the lack of physical origin in quantum physics, the logic has been twisted by turning the mathematical model into a new form of reality, which then perfectly fits the model. This connects to the French philosopher Baudrillard's concept of hyperreality as imagined reality formed from a model without origin, when the model is taken to be real. 

That this is what de facto happened in quantum physics is evidenced by the fact that quantum model predictions always perfectly match (imagined) reality.  

We can summarise (see also this post and this post):

  • classical physics: reality is turned into mathematical model
  • quantum physics: mathematical model is turned into hyperreality.
There is a further complication coming from the lack of physical origin/representation of the wave function, namely the conceptual understanding of a mathematical model without origin in any physics connecting to our experience as human beings. In short, how can we capture in our minds a wave function defined over a $3N$-dimensional configuration space, when our experience is 3d?

The modern quantum physicist thus has to struggle with an imagined reality represented by a mathematical model without real origin as support for imagination. How to imagine a reality when the imagination has nothing relevant to feed on?

There is a way out of this dilemma: Start with RealQM as a quantum model with real origin. 

Recall that we can understand forms of virtual reality without origin, but only insofar as the virtual reality is based on concepts of reality. If the virtual reality displays 6 space dimensions, then we will not be able to properly understand it (only pretend to).

Baudrillard uses Disney World as a form of hyperreality: a model of an American society which has never existed, thus without true origin, yet formed in familiar terms we can understand.   


Wednesday 22 October 2025

Exploring Microscopics by Computation

Modern physics appeared after a long success story, initiated by the scientific revolution and culminating at the end of the 19th century in a combination of Newton's mechanics and Maxwell's electromagnetics, in full harmony and appearing to capture macroscopic physics. 

Of course there were open problems asking for resolution, including the ultra-violet catastrophe of blackbody radiation, viewed to be of particular importance. The starting point was the Rayleigh-Jeans radiation law, stating that radiation intensity scales quadratically with frequency $\nu$, which asks for an upper bound $\nu_{max}$ on frequency to avoid infinite intensity by summation over frequencies. Such a bound was available as Wien's displacement law, stating that $\nu_{max}$ scales linearly with temperature $T$. 
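
In formulas: Rayleigh-Jeans gives spectral intensity $I(\nu ,T)\sim\nu^2T$ (in normalised units), so the total intensity $\int_0^{\nu_{max}}\nu^2T\,d\nu=\nu_{max}^3T/3$ stays finite only with a cutoff, and with Wien's $\nu_{max}\sim T$ it scales like $T^4$ as in the Stefan-Boltzmann law. A minimal Python sketch, with all constants normalised to 1 for illustration:

    import numpy as np

    def total_intensity(T, n=100000):
        # Rayleigh-Jeans intensity ~ nu^2 * T, cut off at nu_max = T (Wien)
        nu = np.linspace(0.0, T, n)
        dnu = nu[1] - nu[0]
        return np.sum(nu ** 2 * T) * dnu      # ~ T**4 / 3

    for T in (1.0, 2.0, 3.0):
        print(T, total_intensity(T))          # scales like T**4 (Stefan-Boltzmann)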

The theoretical challenge was to explain the bound $\nu <\nu_{max}$ with $\nu_{max}\sim T$. Planck, as the leading physicist of the German Empire, took on the challenge but was unable to find an answer within classical physics, and so resorted to a form of statistical physics inspired by Boltzmann's statistical thermodynamics. Under much agony Planck thereby took a step out of classical physics into a new form of statistical physics, which then evolved into the quantum mechanics that is the essence of modern physics.

The fundamental step away from classical physics as deterministic physics about existing realities, was the introduction of statistical physics about probabilities of possibilities. From specific existing realities to infinite possibilities.  

In the new digital world the distinction between existing unique reality and virtual realities is blurred, which means that the difference between classical deterministic reality and modern probabilistic possibility is also blurred. 

It is thus of interest to seek to pin down the difference between (i) classical physics as existing realities and (ii) modern physics as probabilities of possibilities. A striking aspect is that (i) does not require any human minds/observers (the Moon is there even when nobody looks at it), while (ii) requires some form of mind to carry/represent thinkable possibilities.  

Quantum Mechanics QM emerged before the computer, and so computational aspects were not in the minds of the pioneers Bohr-Born-Heisenberg, who came to develop the Copenhagen Interpretation CI in the 1920s. CI is based on a multi-dimensional wave function $\Psi (x,t)$ satisfying a linear Schrödinger Equation SE, depending on a spatial coordinate $x$ with $3N$ dimensions for an atomic system with $N$ electrons and a time coordinate $t$, with $\vert\Psi (x,t)\vert^2$ interpreted as a probability density over configuration space with coordinate $x$. This is still the textbook version as Standard QM StdQM.

The many dimensions make the wave function $\Psi (x,t)$ uncomputable, so it has existence only in the mind of a CI physicist with unlimited fantasy. The grand project of StdQM can thus be put in question from a computational point of view, and also from a realistic point of view if we think that the evolution of the World from one time instant to the next is the result of some form of analog computational process performed by real atoms.

The World is thus equipped with (analog) computational power allowing evolution in time of the existing reality, but it is hard to believe that it has capacity for exploration of all possibilities to form probabilities of possibilities, unless you are a believer in the Many-Worlds Interpretation as an (unthinkable) alternative to CI.

From a computational point of view StdQM as all possibilities is thus hopeless. The evolution of the multi-dimensional wave function $\Psi (x,t)$ in time is an impossible project. What is possible today is exploration of thinkable realities as long as they are computable.

The exploration can be done starting from RealQM as a computable alternative to StdQM. To see that it is not necessary to take the full step into the impossibility of StdQM, we need explanations of, in particular, (1) Wien's Displacement Law and (2) the photoelectric effect in terms of classical deterministic physics. This is offered on Computational Blackbody Radiation in the form of classical threshold effects.

It thus appears possible to stay within a framework of deterministic, classical, computable physics, open to exploration of thinkable worlds of microscopics by computation, which is not possible starting from StdQM.  

Summary: There is a distinction between (a) specific computable (thinkable) realities, and (b) probabilities of uncomputable possibilities. Your choice! As Schrödinger put it: there is a difference between a blurred picture of something specific and a precise picture of a fog bank.

 

Saturday 18 October 2025

Quantum Restart 2026 from Hydrogen Atom 1926

This year has been designated the International Year of Quantum Science and Technology (IYQ2025) by the United Nations, marking the 100th anniversary of the development of Quantum Mechanics. 

Quantum Mechanics was kick-started in 1926 with the formulation of Schrödinger's Equation SE for the Hydrogen atom with one electron, followed by a swift generalisation to many electrons by Born-Heisenberg-Dirac to form the textbook Copenhagen Interpretation CI of the Standard QM of today.

StdQM is generally viewed as a formidable success underlying all of modern technology of microscopics, but none of the foundational problems behind the CI have been resolved. StdQM is viewed to "always work perfectly well" but "nobody understands why". 

The previous post recalled the critical moment in 1926 when SE was generalised to many electrons by Born-Heisenberg-Dirac into StdQM, under heavy protests from Schrödinger, who had taken the first step with an SE in a wave function $\Psi (x)$ depending on a 3d space coordinate $x$, with $\rho (x)=\Psi^2 (x)$ representing charge density in a classical sense. 

Recall that RealQM is a generalisation to many electrons, different from StdQM in staying within a framework of classical continuum mechanics in the spirit of Schrödinger. The basic assumption is that an atom with $N$ electrons is represented by a nucleus surrounded by a collection of electrons as    

  • non-overlapping unit charge densities $\rho_i(x)$ for $i=1,\ldots,N$, 
  • free of self-interaction,
  • indivisible in space (see the sketch in formulas below). 
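
In formulas, a minimal sketch of what these assumptions amount to (the precise formulation is given in the RealQM references; the form below is an assumption for illustration): with one-electron wave functions $\psi_i$ supported on non-overlapping domains $\Omega_i$, charge densities $\rho_i=\psi_i^2$ with $\int_{\Omega_i}\rho_i\,dx=1$, and nuclear charge $Z=N$, the ground state minimises a total energy of the form

$$E=\sum_{i=1}^N\int_{\Omega_i}\Big(\frac{1}{2}\vert\nabla\psi_i\vert^2-\frac{Z}{\vert x\vert}\psi_i^2\Big)dx+\sum_{i<j}\int\int\frac{\rho_i(x)\rho_j(y)}{\vert x-y\vert}dxdy,$$

with no self-interaction term and no configuration space: everything is defined over a shared 3d space.
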
Let us now compare RealQM and StdQM in the case of Hydrogen. For stationary ground and excited states, so-called eigenstates, they formally share the same SE, but with different interpretations of the wave function:

  1. $\rho (x)$ is charge density in classical sense. (RealQM)
  2. $\rho (x)$ is probability density in StdQM sense. (StdQM) 

Recall that QM was formed from a perceived difficulty of capturing the spectrum of Hydrogen within classical physics, with the spectrum arising from interaction of the atom with an exterior forcing electromagnetic field in so-called stimulated radiation. 

Schrödinger resolved this problem by extending SE to a time-dependent form where the frequencies of the spectrum appeared as differences of stationary energy levels, thus with a linear relation between atomic energy levels and resonance frequencies in stimulated radiation. The discrete frequencies appeared as 

  • beat frequencies of wave functions in superposition. 
This became the mantra of StdQM which has ruled for 100 years, with superposition signifying the break with classical physics, where superposition in a spatial sense is impossible.
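
For reference, the beat-frequency mechanism in formulas: for a superposition $\Psi=\psi_1e^{-iE_1t/\hbar}+\psi_2e^{-iE_2t/\hbar}$ of two (real-valued) eigenstates, the density $\vert\Psi\vert^2=\psi_1^2+\psi_2^2+2\psi_1\psi_2\cos\big((E_2-E_1)t/\hbar\big)$ oscillates with frequency $\nu=\Delta E/h$, which is how the spectral lines come out of StdQM.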

If we stay within RealQM, then superposition is impossible because charge densities do not overlap. We now ask the key question:
  • Is it possible to capture the spectrum of Hydrogen within RealQM, thus without superposition? 
The discrete stationary eigenstates are the same, and so we ask about the time-dependent form of RealQM: Is it the same as that of StdQM? Not in general, because RealQM is non-linear and StdQM linear. But for Hydrogen RealQM is linear, so in this case the same time-dependence as in StdQM is possible.

But this may not be the most natural view from a classical point of view without superposition in mind. Instead it is natural to think of the radiating electron oscillating back and forth between two energy levels with different charge densities as a classical oscillating dipole. We can thus extend RealQM to a classical dynamical system swinging back and forth between energy levels with different charge distributions. This would describe the radiating Hydrogen atom in terms of classical physics with a continuous transition between different configurations. This would answer Schrödinger's basic question, without answer in StdQM, about "electron jumps": the electron does not jump but changes charge density continuously in space and time. 

The only thing to explain in this scenario is the linear relation between (difference of) energy and frequency, not from beat frequency and superposition, but from the basic relation between energy and frequency appearing in Planck's Law discussed in this post. 

Summary: It seems possible to capture atomic radiation by RealQM within a classical continuum mechanics framework, and so to avoid taking the step out of classical physics, in line with the dream of Schrödinger. In particular, superposition is not required and probably not present. Quantum computers built on superposition will not work. Superposition may be superstition rather than reality.