Friday 7 November 2025

Sweden Declares War on Russia

Prime Minister Kristersson of Sweden and President Zelensky of Ukraine have agreed on the delivery of 150 Swedish JAS Gripen fighter jets to Ukraine, with possible financing from frozen Russian assets:

  • Zelensky: These are very cool planes, powerful aviation platforms that allow for performing a wide range of tasks.
  • Kristersson: This will strengthen both Ukraine, Sweden and Europe.
The deal is bilateral between Sweden and Ukraine and means that Sweden effectively declares war on Russia, even if not yet officially. Russia has not yet reacted.

Gustaf IV Adolf, King of Sweden, lost the Finnish War of 1808-1809 against the Russian Empire, and with it his crown; he was deposed and sent into exile in Switzerland, where he lived in a small hotel in great loneliness and indigence under the name Colonel Gustafsson.

This meant the end of Sweden as a European superpower, and Swedish politics shifted from war to peace for 200 years, but the dream of revenge against Russia has been kept alive. Sweden is today the NATO country with the most aggressive standpoint towards Russia, backed by 150 JAS Gripen paid for with Russian money.

In this situation it is necessary to answer some questions:
  • Are the prospects of defeating Russia better today than in 1808? 
  • Will Russia invade Sweden if Sweden does not send 150 JAS to a NATO proxy war in Ukraine against Russia?  
  • How will Russia respond to 150 Swedish JAS flying toward the Russian border? 
  • What is the military power of Russia?
  • Is Ukraine a member of the defence organisation NATO?
The Swedish Parliament has now delegated to Prime Minister Kristersson the decision to send Swedish soldiers into any action outside Sweden deemed necessary to protect Sweden from invasion by Russia. 

There is one member of the Swedish Parliament who supports peaceful co-existence with Russia instead of a war which cannot be won: Elsa Widding, independent of any party. 

All political parties speak about war, while the moral elite of Sweden, once a self-proclaimed humanitarian superpower, is silent, including peace and church organisations. Possible to understand?

Compare with earlier analysis:



Church-Turing vs Quantum Computing Illusion

A natural system like the weather can be viewed as performing a form of analog computation as it evolves from one time instant to the next, with molecules in the air interacting with their neighbors. The computational complexity can be viewed as polynomial in the size of the physical system. This is expressed in the Physical Church-Turing Thesis PCTT:

  • Any physical process can be simulated by a Turing machine.  
Here a Turing machine is a model of a digital computer with polynomial computational capacity capable of simulating a physical process of polynomial computational complexity.

According to PCTT there is thus no physical process expressing exponential complexity, which would be beyond the capacity of digital computing. 

Quantum computing is a form of analog computation with promise of exponential capacity capable of meeting the needs of systems of exponential complexity. It is motivated by a view that quantum mechanics carries exponential complexity in the form of a multi-dimensional wave function and so cannot be simulated on a digital computer. 

We meet here a contradiction:  
  • An analog quantum computer is realised in a physical process which according to PCTT is limited to polynomial complexity and so does not have exponential capacity.
We see that PCTT says that a quantum computer with exponential capacity cannot be constructed. No wonder that no such quantum computer has been constructed.

If PCTT is correct, it means that the evolution of a quantum system of atoms and molecules as a physical process does not express more than polynomial complexity, and so in principle can be simulated by digital computation with polynomial capacity. The multi-dimensionality of the wave function, appearing to demand exponential capacity, is thus an illusion. 

RealQM deconstructs the illusion by offering simulation of systems of atoms and molecules by digital computation (of polynomial complexity).

PS1 Recall that an $N$-body simulation has a computational complexity between $N$ and $N^2$ depending on the range of interaction between bodies, as the sketch below illustrates. 
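As a minimal sketch (the function names are my own, purely illustrative), the two extremes can be counted directly: all-pairs interaction gives work growing like $N^2$, while short-range interaction with a bounded number of neighbors gives work growing like $N$.

```python
def evaluations_all_pairs(N):
    return N * (N - 1) // 2      # every pair of bodies interacts: ~N^2 work

def evaluations_short_range(N, neighbors=26):
    return N * neighbors         # each body sees a bounded neighborhood: ~N work

for N in (10**2, 10**4, 10**6):
    print(N, evaluations_all_pairs(N), evaluations_short_range(N))
```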

PS2 If macroscopics has polynomial complexity, then so has microscopics as the basis of macroscopics. If microscopics has exponential complexity, then so has macroscopics based on microscopics. 


Thursday 6 November 2025

The Black Hole of Quantum Computing

Standard Quantum Mechanics StdQM, based on a multi-dimensional Schrödinger Equation SE, is viewed to have exponential digital computational complexity, effectively making SE uncomputable even on supercomputers and thus useless for digital modeling of atoms/molecules in practice.

Quantum computing is an attempt to model atoms and molecules instead by analog computation performed on quantum computers capable of meeting exponential complexity. A quantum computer models a physical quantum system of atoms/molecules not by SE, but by another (simpler) physical quantum system which is controllable and can be described by SE. 

A quantum computer thus offers a model/map of a real system which is itself a real system of the same form and type. The quantum computer model can be compared to a 1:10 exact physical model of an airplane or ship with some details removed. The digital computational complexity of the original physical system is then irrelevant, since no digital computation is performed, only a form of analog computation performed in a physical model of essentially the same computational complexity. 

Quantum computing thus represents a step back from the scientific revolution based on Calculus, which offers mathematical models of reality in abstract symbolic form based on numbers, models which can be brought to life by digital computing. 

Quantum computing throws away digital Calculus because it is found to be useless, and seeks a replacement which works in terms of analog computation. 

But is it possible that real quantum systems really perform analog computations of exponential complexity? If a digital computer does not have the capacity to meet exponential complexity, what guarantees that a real system, as it evolves, can somehow express exponential complexity? 

We know that the exponential complexity of SE comes from the multi-dimensionality of the wave function representing possibilities rather than actualities. But real systems do not realise all possibilities, only a few actualities, and so the computational complexity of SE as a model is much bigger than the complexity of what is modeled. The model/picture totally swamps the origin and so asks for infinite digital computational power, to no avail as concerns the origin.
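A toy illustration of this point (my own sketch, not from any QM formalism): tabulating all possibilities of $n$ two-valued subsystems takes $2^n$ numbers, while producing one actuality takes only $n$ random choices.

```python
import random

n = 50  # number of two-valued subsystems

# Describing all possibilities: a table over 2^n outcomes.
print(f"possibilities to tabulate: 2^{n} = {2**n:.1e}")

# Producing one actuality: n random choices, linear work.
actuality = [random.choice((0, 1)) for _ in range(n)]
print("one actuality:", actuality)
```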

RealQM is a version of the quantum mechanics of atoms and molecules, which is digitally computable and so does not need quantum computing.


Wednesday 5 November 2025

Quantum Computing as Theoretical Fantasy

Here is a summary by chatGPT of Dyakonov's view on the realism of quantum computing:

Physicist Mikhail Dyakonov has been one of the most prominent skeptics of large-scale quantum computation. His main critique can be summarized as follows as of 2019:

  1. Extreme Fragility of Quantum States – Quantum bits (qubits) are extraordinarily sensitive to any external disturbance. Maintaining coherent quantum states long enough for complex computations is, in his view, practically impossible.

  2. Error Correction Limits – While quantum error correction is theoretically possible, Dyakonov argues that the real-world requirements (millions of physical qubits per logical qubit, perfectly calibrated control systems, etc.) make it infeasible.

  3. Unrealistic Scaling – He believes scaling from a few dozen qubits to millions (as required for useful computation) involves an exponential explosion of complexity in control, synchronization, and noise management.

  4. Overly Optimistic Theory vs. Practice – Dyakonov claims that most of quantum computing theory assumes idealized conditions that cannot be realized physically. The engineering challenges are, in his words, “beyond human capabilities.”

  5. Skepticism Toward “Quantum Supremacy” Claims – He views demonstrations like Google’s 2019 experiment as contrived and not indicative of practical progress toward general-purpose quantum computing.

In essence, Dyakonov argues that large-scale quantum computers are a theoretical fantasy, comparable to perpetual motion machines — possible on paper, but not in reality.

end chatGPT

Let us compare with the latest hype:

  • Google’s Willow Chip Achieves Historic Quantum Computing Breakthrough
  • Willow Chip Powers Verifiable Quantum Advantage, Running Algorithm 13,000x Faster Than Supercomputers.
This is what chatGPT says about the Willow chip:
  • Dyakonov’s scepticism remains valuable and largely justified — many of his fundamental concerns (noise, control, scale, assumption-validity) are not yet fully overcome.
  • The Willow chip is important: it represents a tangible, hardware-driven step toward the goals of large-scale quantum computing. It shows we are making progress on the error-correction and scaling front.
  • However, Willow does not yet fully invalidate Dyakonov’s concerns. It addresses some, but many of the major leaps (millions of qubits, full fault-tolerance, broad practical algorithmic supremacy) are still ahead.
  • In other words: Willow moves the bar, but the “gap” that Dyakonov insisted on remains significant. The achievement doesn’t guarantee the remaining engineering challenges will vanish.
  • If I were to pick one key takeaway: Willow demonstrates that at least some of the previously theoretical obstacles (error-correction scaling) can be approached in hardware — which means the sceptical “maybe impossible” thesis is being challenged. 
  • But “practically useful, general-purpose quantum computers in the near-term” still remain uncertain.
It seems that quantum computing is still a theoretical fantasy, more than 40 years after the idea was launched. 

Recall that quantum computing is based on unitary evolution of quantum systems of thousands of qubits in superposition of possibilities, which is fantasy physics. Compare with the first proposition of Wittgenstein's Tractatus:
  • The World is all that is the case.
It is clear that "to be the case" requires more than what is present in a quantum system of possibilities, which means that according to Wittgenstein a quantum computer does not belong to the World. But a quantum computer is an analog computer and as such must belong to the World. Wittgenstein would thus view the Willow chip with utter skepticism. And you?

Recall that the idea of a quantum computer is to model an uncontrollable real/analog quantum system, as part of the World, by a controllable real/analog quantum system, also part of the World, with the caveat that the model is not "the case" because it plays with possibilities and not with realities.   

Notice that this contradiction does not appear with a digital computer, because the computing is abstract and mathematical and so does not need real analog computing.  


Monday 3 November 2025

Hydrogen Spectrum as Classical Physics

This is a continuation of the previous post on the necessity to give up classical physics for quantum physics in its textbook form of Standard Quantum Mechanics StdQM. We ask to what extent RealQM, as a form of classical physics, can explain the observed spectrum of the Hydrogen atom as expressed in stimulated radiation. We will thus compare:

  • StdQM: Spectral line appears from superposition of wave functions of eigenstates. 
  • RealQM: Spectral line appears from oscillation between charge distributions of eigenstates.
In both cases the frequency of the spectral line scales with the difference of energy between eigenstates, but with different explanations: 
  • StdQM: Spectral frequency appears as the beat frequency of wave functions with eigenfrequency variation in time according to Schrödinger's equation. The connection between frequency and energy is secondary. Radiation can appear to be spontaneous.
  • RealQM: Spectral frequency is not a beat frequency, but simply the frequency $\nu=\frac{\Delta E}{h}$ which matches the energy difference $\Delta E$ between eigenstates, with $h$ Planck's constant, in an assumed coupling between frequency and energy. There is an active exchange of energy between atom and radiation field with frequency matching the jump in energy. The atom is forced to respond as a dipole to radiation of a certain frequency. The radiation is not spontaneous. 
We see that StdQM offers an explanation in terms of time-dependent quantum mechanics without realism, while RealQM relies on the formal coupling between matter and radiation expressed by $E=h\nu$ appearing in blackbody radiation. Compare with this post on the physical meaning of $E=h\nu$.
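A minimal numerical check of the relation $\nu=\frac{\Delta E}{h}$ (my sketch, using the standard hydrogen eigenvalues $E_n=-13.6\;\mathrm{eV}/n^2$): the $2\to 1$ jump should reproduce the Lyman-alpha line near 121.6 nm.

```python
h = 4.135667696e-15    # Planck's constant [eV s]
c = 2.99792458e8       # speed of light [m/s]

def E(n):
    return -13.605693 / n**2    # hydrogen eigenvalue [eV]

dE = E(2) - E(1)    # energy jump Delta E [eV]
nu = dE / h         # spectral frequency nu = Delta E / h [Hz]
print(f"Delta E = {dE:.3f} eV, nu = {nu:.3e} Hz, lambda = {c / nu * 1e9:.1f} nm")
# Gives Delta E = 10.204 eV, nu = 2.467e+15 Hz, lambda = 121.5 nm (Lyman-alpha).
```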

We see that, at least in the case of stimulated radiation, the spectrum of an atom can be given a RealQM semi-classical explanation. It is not clear that StdQM offers something more enlightening. Or?

This discussion connects to quantum computing as discussed in recent posts, with StdQM supposed to support delicate superpositions of wave functions, free of forcing, performing complex analog computations, while RealQM brings forward the aspect of forcing in terms of classical physics. 

PS Here is a chatGPT comment.

The Double-Slit Experiment as Classical Physics

The double-slit experiment with one single photon at a time is the sledge hammer designed to kill classical physics and so open the door to a new form of physics named quantum mechanics as the essence of modern physics.

The experiment consists of sending a sequence of photons one at a time through a double slit in a thin wall and recording the emergence of a macroscopic interference (fringe) pattern on a screen behind, with a new dot assumed for each new photon. 

The pattern is the same as the interference pattern generated by a classical macroscopic wave hitting the wall and generating two macroscopic waves from the slits which interfere constructively and destructively and so form a macroscopic fringe pattern on the screen.

We thus have a quantum version of the experiment with one photon at a time and a classical version, both building the same macroscopic fringe pattern on the screen. 

The catch of the quantum explanation of the quantum version is that the single photon, after somehow passing through the double slit with its left and right slits, appears behind the slits in a superposition of "left pass" and "right pass" as two mutually exclusive possibilities carrying an interference pattern of possibilities, which is made real dot by dot on the screen (by "collapse of the wave function").

The sledge hammer argument is now that the quantum version cannot be given a classical explanation, because a classical particle/photon has to make the choice of "left pass" or "right pass", which will remain after passing and so does not form any interference pattern, just two one-slit patterns. 

The strategy is thus to promote quantum physics by an experiment arranged to show deficiency of classical physics. 

Let us take a closer look at the quantum version asking what the physical meaning of "one photon at a time" may be. What is the evidence that only one photon at a time reaches the double slit? How to produce exactly one photon? 

It seems that the evidence of "one photon at a time" is that there is "one dot at a time appearing on the screen". The argument is thus that one dot is generated by exactly one photon. Is this really certain?

What if many photons are in fact involved, with the capacity to pass both slits and form a classical, real but very weak interference pattern before hitting the screen with a random excitation of one dot? The intensity of the source would then be calibrated to produce one dot at a time, while de facto involving many photons capable of generating an interference pattern with random realisation.

It thus may be possible to explain a "one dot at a time" double slit experiment in terms of classical wave physics. The sledge hammer argument would then vaporise.  
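A toy realisation of this classical reading (my own sketch; the slit geometry and dot count are invented for illustration): treat a weak classical two-slit intensity as a probability density and draw one dot at a time from it; the dots rebuild the fringe pattern.

```python
import numpy as np

lam, d, L = 500e-9, 50e-6, 1.0        # wavelength, slit separation, screen distance [m]
x = np.linspace(-0.05, 0.05, 2001)    # positions on the screen [m]
I = np.cos(np.pi * d * x / (lam * L))**2    # classical two-slit fringes (envelope ignored)
p = I / I.sum()                       # weak intensity read as probability density

rng = np.random.default_rng(0)
dots = rng.choice(x, size=5000, p=p)  # one random dot excitation at a time

counts, _ = np.histogram(dots, bins=50)
print(counts)    # dot counts accumulate into the fringe pattern
```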

PS After a discussion with chatGPT we agree that the one-photon experiment is extreme, but we seem to disagree about the value of extreme experiments to guide the development of non-extreme physics. Why not stick to non-extreme experiments for non-extreme physics? What can be learned about normality by pushing into non-normality? What is to be learned about normal life from extreme violence? chatGPT seems to believe in the value of extremism in physics more than I do. And you?      

Sunday 2 November 2025

Why Is Analog Quantum Computing Needed?

Quantum computing is motivated by a perception that simulating atomic physics, described mathematically by the Schrödinger Equation SE of Quantum Mechanics QM, is exponentially hard and so impossible. This is because SE for a system with $N$ electrons involves $3N$ spatial dimensions, with computational work increasing exponentially with $N$.
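To make the scaling concrete (a back-of-envelope sketch with an invented resolution of $M=100$ grid points per dimension): tabulating the wave function on a grid costs $M^{3N}$ values.

```python
M = 100    # grid points per spatial dimension (invented, modest resolution)

for N in (1, 2, 3, 10):
    values = M ** (3 * N)    # grid values needed for a 3N-dimensional wave function
    print(f"N = {N:2d} electrons: {values:.1e} values")
# N = 2 already needs 1.0e+12 values; N = 10 needs 1.0e+60, beyond any computer.
```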

In other words, digital simulation of QM is viewed to be so computationally demanding that the alternative of analog simulation must be explored. This is the idea of analog quantum computing launched by Richard Feynman in the early 1980s:

  • Simulate a real quantum system by a controllable laboratory quantum system. 
This is the same idea as testing a physical model of a real airplane in a wind tunnel under controllable conditions, or building a toy model of a bridge and testing its bearing capacity. No mathematics is needed, just craftsman's skill.

The basic idea is thus to give up building mathematical models of realities in terms of Cartesian geometry based on numbers with digital representation, as the scientific method behind the evolution of the modern industrial/digital society. 

Such a step can be seen as a step back to a more primitive science based on analog modeling without mathematics. 

In any case, massive investment is now going into creating quantum computers as controllable analog quantum systems. The design work has to cope with the perceived impossibility of testing different designs using mathematical digital modeling, and so has to rely on tricky experimental testing. The time frame for a useful analog quantum computer appears to be decades rather than years.

With this perspective it is natural to ask if the exponential computational complexity of the microscopics of quantum mechanics is written in stone. The macroscopics of continuum mechanics rarely comes with exponential complexity, because evolving a macroscopic system like the weather by one time step involves only local connections in 3 space dimensions, which has polynomial complexity, as the sketch below illustrates. 
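A sketch of this locality (a toy diffusion step with invented coefficients, not a weather model): each cell of a 3d grid is updated from its six nearest neighbors, so one time step costs work proportional to the number of cells $M^3$, polynomial rather than exponential.

```python
import numpy as np

M = 50
u = np.random.rand(M, M, M)    # a macroscopic field, e.g. temperature
dt, k = 0.1, 0.05              # toy time step and diffusion coefficient

# Discrete Laplacian from the six nearest neighbors (periodic boundaries).
lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
       np.roll(u, 1, 1) + np.roll(u, -1, 1) +
       np.roll(u, 1, 2) + np.roll(u, -1, 2) - 6 * u)

u = u + dt * k * lap           # one local-in-space time step: O(M^3) work
print(u.shape, float(u.mean()))
```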

If macroscopics has polynomial complexity, then microscopics on smaller scales should have it as well. RealQM offers a version of quantum mechanics of polynomial complexity. If nothing else, it can be used to test different designs of an analog quantum computer. Want to try RealQM?

Another mission of analog quantum computing, put forward to motivate investors, is the factorisation of large natural numbers (Shor's algorithm), with the promise of breaking cryptographic codes. But analog computation about properties of numbers, instead of digital computation, appears far-fetched.

PS Recall that at each clock cycle
  • a digital computer operates on $n$ factual states
  • a quantum computer operates on $2^n$ possible states  
with the simplistic promise of an enormous increase of capacity from linear to exponential, as the sketch below quantifies. Is it too good to be true?
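A back-of-envelope comparison (my sketch, assuming 16 bytes per complex amplitude): the register itself stays at $n$ bits, but tracking its $2^n$ possible states digitally blows up fast.

```python
for n in (10, 30, 50, 300):
    states = 2**n              # possible states in superposition
    sim_bytes = 16 * states    # bytes for one complex amplitude per state
    print(f"n = {n:3d} qubits: {states:.1e} states, ~{sim_bytes:.1e} bytes to simulate")
```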

Saturday 1 November 2025

Quantum Computing Without Mathematics

Schrödinger's Equation SE for the Hydrogen atom with one electron, formulated in 1926 by the Austrian physicist Erwin Schrödinger as a model of a negative electron charge density subject to Coulomb attraction from a positive nucleus, was generalised to atoms with many electrons by a formal mathematical procedure: adding a new independent 3d spatial Euclidean space for each electron, giving a linear multi-dimensional SE with $3N$ spatial dimensions for an atom with $N$ electrons. This multi-dimensional SE forms the foundation of the modern physics of Quantum Mechanics. 

The mathematics of the multi-d SE was quickly formalised by the mathematician von Neumann into highly abstract functional analysis in Hilbert spaces as a triumph of symbolic abstract mathematics. Physics of real atoms was thus hijacked by mathematicians, but the task of making physical sense of the abstraction was left to physicists without the mathematical training required to make real sense of von Neumann's functional analysis. The problem of physical interpretation remains unresolved today, which is behind the present manifest crisis of a modern quantum physics hijacked by mathematics.

The multi-d SE turned out to harbour a serious problem when confronted with mathematical computation. Because of the many dimensions the computational complexity turned out to be exponential, which made SE uncomputable on digital computers, and so in effect useless. 

Abstract mathematics had created a model of real physics which turned out to be an uncomputable monster, useful only as an exercise in functional analysis.

Quantum computing is a new form of computing fundamentally different from digital computing as mathematical computing with numbers. The idea was launched in the early 1980s by the physicist Richard Feynman as a new approach to tackle the uncomputability of QM. The radical idea was to replace uncomputable functional analysis by a form of analog quantum computation, where a real atomic quantum system is modeled in a laboratory by another real quantum system acting as an analog quantum computer.

Recall that the option of replacing a mathematical model by an analog model is also used classically, when a model of an airplane is studied in a wind tunnel instead of solving the Navier-Stokes equations, which is deemed impossible.

The success of the mathematisation of quantum physics into functional analysis in Hilbert spaces a hundred years ago carried the seed of its own destruction by coming with exponential complexity, which could not be met by mathematical computation. 

Heavy investment is now being directed into building a quantum computer supposed to function according to a mathematical formalism which is at the same time being replaced by analog quantum computing.

Does this appear to be strange? To build on mathematical quantum mechanics which is replaced by analog quantum computing based on what was replaced? What would Wittgenstein say about something like that? In any case the investment in quantum computing is high risk. 

RealQM offers a way out of this mess, in the form of a different Schrödinger equation which is computable as digital mathematics and thus does not have to be replaced by analog quantum computing.  Why not give it a try?

 

Monday 27 October 2025

Is Quantum Computing Possible?

Quantum superposition is the crucial component of Quantum Mechanics believed to open a new world of quantum computing on quantum computers.    

An $n$-qubit quantum computer with $n=2,3,\ldots$ is supposed to operate in parallel on $2^n$ states in superposition, to be compared with a classical digital computer operating on $n$ bits at a time, where a bit can take the value 0 or 1. Quantum computing thus gives promise of exponential speed-up vs digital computing.   

The idea of quantum computing was first presented by Richard Feynman in an attempt to get around the apparent exponential complexity of Quantum Mechanics QM based on Schrödinger's multi-dimensional wave equation, which makes digital simulation impossible by demanding computational work scaling with $2^N$ for a system with $N$ electrons. The idea was to replace digital simulation with some form of analog quantum computation, where the simulation of a quantum system would be performed on a quantum system. An intriguing idea, but could it work? The apparent exponential complexity would then be met with exponential computational power, making simulation possible: meeting a perceived difficulty by ramping up capacity, or simply "fighting fire with fire". 

Let us then consider the basic idea of quantum computing, which is 

  • Simultaneous operation on states in superposition.       

What then is "superposition"? The mathematical answer is the following: Consider a linear algebraic or differential equation with solutions $\psi_1$ and $\psi_2$. Then the algebraic sum $\psi =\psi_1+\psi_2$ is also a solution, and $\psi$ is viewed as the superposition of $\psi_1$ and $\psi_2$, with the + sign signifying algebraic sum. 

As concerns classical physical wave mechanics, the algebraic superposition can take two forms, with physicality of (i) the algebraic sum $\psi$ or (ii) the individual terms $\psi_1$ and $\psi_2$. Case (i) represents the interference pattern seen on the surface of one pond, while (ii) represents the beat interference generated by two vibrating strings with nearby frequencies, as in the sketch below. 
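A sketch of case (ii) (my own example with two arbitrary nearby frequencies, 440 and 444 Hz): the algebraic sum of two solutions of a linear wave equation is again a solution, and the sum beats at the difference frequency.

```python
import numpy as np

f1, f2 = 440.0, 444.0              # two nearby frequencies [Hz]
t = np.linspace(0.0, 1.0, 44100)   # one second of time samples

psi = np.sin(2*np.pi*f1*t) + np.sin(2*np.pi*f2*t)   # superposition psi = psi_1 + psi_2

# Since sin(a) + sin(b) = 2 sin((a+b)/2) cos((a-b)/2), the sum oscillates at the
# mean frequency under an envelope beating at |f1 - f2| = 4 Hz.
envelope = 2*np.abs(np.cos(np.pi*(f1 - f2)*t))
print(float(np.max(np.abs(psi))), float(np.max(envelope)))   # both close to 2
```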

As concerns QM, the physicality is supposed to be displayed in the double-slit experiment: Let a single photon/electron pass a double slit and then be detected on a screen behind as a spot. Repeat the experiment many times and notice a fringe pattern appearing on the screen, which is the same interference pattern developed by a macroscopic wave passing through both slits. The photon/electron after passing the double slit represents the quantum superposition supposed to carry quantum computing. This is not a superposition of realities as in (ii) above, but a superposition of possibilities made real by repetition of the one photon/electron experiment, and it represents the physics of a 1-qubit: The single photon/electron is, by passing through the double slit, put into a superposition of passing the left slit and passing the right slit, not as realities but as possibilities. It is essential here that the superposition concerns only one photon/electron in superposition of $2^1=2$ states. This is supposed to generalise to $n$ entangled photons/electrons in superposition of $2^n$ possible states.

The evidence that quantum computing is possible thus boils down to the double-slit experiment. If one photon/electron can be put in a superposition of two states which appears to support interference with a fringe pattern, then constructing an $n$-qubit computer may be possible. If two photons/electrons are needed for two states, then we are back to classical computing with bits.

The crucial question is now: Is it sure that because there is one click on the screen at a time, the input is one photon/electron at a time?

Think of this, and return to see a more precise analysis.

QM in its standard multi-dimensional form has exponential complexity, which requires exponential computing power. RealQM is an alternative with polynomial complexity which can be met by classical computing.  

Maybe quantum computing is neither possible nor needed? 


Friday 24 October 2025

Quantum Physics as HyperReality = Crisis

Modern physics as Quantum Mechanics QM is based on a (complex-valued) wave function $\Psi (X)$ depending on a $3N$-dimensional configuration space coordinate $X$ for an atomic system with $N$ electrons giving each electron a separate 3d Euclidean space. 

Configuration space is a mathematical construct which does not have any physical representation for $N>1$. The wave function as a function defined on configuration space shares the same lack of physical representation. The wave function evolves in time as a mathematical construct satisfying a Schrödinger equation. 

To give QM a meaning beyond a mathematical game it is necessary to give the wave function a physical meaning, which has drawn much attention without any common view ever being formed. The credibility of modern physics has suffered from this failure, which can be seen as the main reason for its present, commonly witnessed crisis, summarised as follows: 

  • classical physics = mathematical model with physical origin.   
  • quantum physics = mathematical model without physical origin.     

To handle the lack of physical origin in quantum physics, the logic has been twisted by turning the mathematical model into a new form of reality, which then perfectly fits the model. This connects to the French philosopher Baudrillard's concept of hyperreality as imagined reality formed from a model without origin, when the model is taken to be real. 

That this is de facto what happened in quantum physics is evidenced by the fact that quantum model predictions always perfectly match (imagined) reality.  

We can summarise (see also this post and this post):

  • classical physics: reality is turned into mathematical model
  • quantum physics: mathematical model is turned into hyperreality.
There is a further complication coming from the lack of physical origin/representation of the wave function, namely the difficulty of conceptually understanding a mathematical model without origin in any physics connecting to our experience as human beings. In short, how can we capture in our minds a wave function defined over a $3N$-dimensional configuration space, when our experience is 3d?

The modern quantum physicist thus has to struggle with an imagined reality represented by a mathematical model without real origin as support for imagination. How to imagine a reality when the imagination has nothing relevant to feed from?

There is a way out of this dilemma: Start with RealQM as a quantum model with real origin. 

Recall that we can understand forms of virtual reality without origin, but only insofar as the virtual reality is based on concepts of reality. If the virtual reality displays 6 space dimensions, then we will not be able to properly understand it (only pretend to).

Baudrillard uses Disney World as a form of hyperreality: a model of an American society which has never existed, thus without true origin, yet formed in familiar terms we can understand.