Wednesday, November 5, 2025

Quantum Computing as Theoretical Fantasy

Here is a summary by ChatGPT of Dyakonov's view on the realism of quantum computing:

Physicist Mikhail Dyakonov has been one of the most prominent skeptics of large-scale quantum computation. As of 2019, his main critique can be summarized as follows:

  1. Extreme Fragility of Quantum States – Quantum bits (qubits) are extraordinarily sensitive to any external disturbance. Maintaining coherent quantum states long enough for complex computations is, in his view, practically impossible.

  2. Error Correction Limits – While quantum error correction is theoretically possible, Dyakonov argues that the real-world requirements (millions of physical qubits per logical qubit, perfectly calibrated control systems, etc.) make it infeasible.

  3. Unrealistic Scaling – He believes scaling from a few dozen qubits to millions (as required for useful computation) involves an exponential explosion of complexity in control, synchronization, and noise management.

  4. Overly Optimistic Theory vs. Practice – Dyakonov claims that most of quantum computing theory assumes idealized conditions that cannot be realized physically. The engineering challenges are, in his words, “beyond human capabilities.”

  5. Skepticism Toward “Quantum Supremacy” Claims – He views demonstrations like Google’s 2019 experiment as contrived and not indicative of practical progress toward general-purpose quantum computing.

In essence, Dyakonov argues that large-scale quantum computers are a theoretical fantasy, comparable to perpetual motion machines — possible on paper, but not in reality.

End of ChatGPT summary.
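The "millions of physical qubits per logical qubit" claim in point 2 can be illustrated with a back-of-the-envelope surface-code estimate. The figures below (a ~1% threshold error rate, roughly 2d² physical qubits per logical qubit at code distance d, and the standard logical-error suppression formula) are common textbook approximations, chosen here for illustration; they are not Dyakonov's own numbers:

```python
# Rough surface-code overhead sketch (illustrative textbook approximations):
# - physical qubits per logical qubit ~ 2 * d^2 for code distance d
# - logical error rate ~ 0.1 * (p / p_th) ** ((d + 1) / 2)

def physical_qubits_per_logical(d):
    """Approximate physical-qubit count for one surface-code logical qubit."""
    return 2 * d * d

def logical_error_rate(p, d, p_th=1e-2):
    """Crude logical-error estimate for physical error rate p at distance d."""
    return 0.1 * (p / p_th) ** ((d + 1) / 2)

# Find the smallest odd code distance meeting an assumed error budget,
# then scale up to a hypothetical 1,000-logical-qubit machine.
p = 1e-3          # assumed physical gate error rate
target = 5e-13    # assumed per-logical-qubit error budget
d = 3
while logical_error_rate(p, d) > target:
    d += 2

logical_qubits = 1000
total = logical_qubits * physical_qubits_per_logical(d)
print(f"distance d = {d}, physical qubits per logical qubit = {physical_qubits_per_logical(d)}")
print(f"total physical qubits for {logical_qubits} logical qubits = {total:,}")
```

Under these assumptions the total lands around a million physical qubits, which is the order of magnitude behind Dyakonov's skepticism about feasibility.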

Let us compare with the latest hype:

  • Google’s Willow Chip Achieves Historic Quantum Computing Breakthrough
  • Willow Chip Powers Verifiable Quantum Advantage, Running Algorithm 13,000x Faster Than Supercomputers

This is what ChatGPT says about the Willow chip:
  • Dyakonov’s scepticism remains valuable and largely justified — many of his fundamental concerns (noise, control, scale, assumption-validity) are not yet fully overcome.
  • The Willow chip is important: it represents a tangible, hardware-driven step toward the goals of large-scale quantum computing. It shows we are making progress on the error-correction and scaling front.
  • However, Willow does not yet fully invalidate Dyakonov’s concerns. It addresses some, but many of the major leaps (millions of qubits, full fault-tolerance, broad practical algorithmic supremacy) are still ahead.
  • In other words: Willow moves the bar, but the “gap” that Dyakonov insisted on remains significant. The achievement doesn’t guarantee the remaining engineering challenges will vanish.
  • If I were to pick one key takeaway: Willow demonstrates that at least some of the previously theoretical obstacles (error-correction scaling) can be approached in hardware — which means the sceptical “maybe impossible” thesis is being challenged. 
  • But “practically useful, general-purpose quantum computers in the near-term” still remain uncertain.

It seems that quantum computing is still a theoretical fantasy, 50 years after the idea was launched.

Recall that quantum computing is based on the unitary evolution of quantum systems of thousands of qubits in superposition of possibilities, which I regard as fantasy physics. Compare with the first proposition of Wittgenstein's Tractatus:
  • The World is all that is the case.
Evidently, "to be the case" requires more than what is present in a quantum system of mere possibilities, which means that, according to Wittgenstein, a quantum computer does not belong to the World. Yet a quantum computer is an analog computer, and as such it must belong to the World. Wittgenstein would thus view the Willow chip with utter skepticism. And you?
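In the standard formalism, the "superposition of possibilities" questioned above is a state vector of 2^n complex amplitudes evolving unitarily. A minimal classical-simulation sketch (assuming NumPy; the function name and register ordering are my own) makes the exponential cost concrete, showing why a system of thousands of qubits cannot even be represented classically:

```python
# Classical state-vector simulation of n qubits: memory grows as 2**n.
# Minimal sketch: apply a Hadamard gate to qubit 0 of an n-qubit register.
import numpy as np

def apply_hadamard(state, n, target):
    """Apply H to the `target` qubit of an n-qubit state vector (length 2**n)."""
    h = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    # Reshape so the target qubit is its own axis, then apply the 2x2 unitary.
    state = state.reshape([2] * n)
    state = np.tensordot(h, state, axes=([1], [target]))
    state = np.moveaxis(state, 0, target)
    return state.reshape(-1)

n = 3
state = np.zeros(2 ** n)
state[0] = 1.0                       # |000>
state = apply_hadamard(state, n, 0)  # (|000> + |100>) / sqrt(2)

# Memory for the full state vector at 16 bytes per complex amplitude:
for qubits in (30, 50, 1000):
    print(f"{qubits} qubits -> {2.0 ** qubits * 16:.3e} bytes")
```

At 30 qubits the vector already needs about 17 GB; at 1000 qubits the byte count exceeds the number of atoms in the observable universe, which is why the possibilities can only live in the analog quantum hardware itself, never in a classical description.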

Recall that the idea of a quantum computer is to model one real/analog quantum system (uncontrollable, part of the World) by another real/analog quantum system (controllable, also part of the World), with the caveat that the model is not "the case", because it plays with possibilities and not with realities.

Notice that this contradiction does not appear with a digital computer, because its computing is abstract and mathematical and so does not depend on real analog computing.

