Here is a new article to the series about RealQM as an alternative to StdQM:
Me: Tell me about the work on stability of matter by Dyson and Lenard.
towards understanding by critical constructive inquiry
Here is an interesting video about a possible origin of the Cosmic Microwave Background Radiation CMBR other than the "afterglow" of a Big Bang, taken as proof that the Big Bang hypothesis is correct as the currently most popular cosmological theory:
The video recalls early theories of a static Universe filled with some form of interstellar dust radiating a Planck spectrum at around 2-5 Kelvin. Such radiation was detected in 1967 at 2.7 K and was then attributed to the Big Bang, wiping the old theories off the map. In particular, the Big Bang theory claimed to resolve the mystery of the observed red-shift of galaxies increasing with distance, suggesting an expansion of the Universe.
One of the old theories was the "tired light" hypothesis presented by Zwicky in 1929, suggesting that the red-shift could result from light losing energy, in proportion to its frequency, while passing through interstellar dust over long distances.
The idea of interstellar dust actively radiating at 2.7 K seems more natural than that of a Big Bang "afterglow" still lingering after 13.8 billion years.
Zwicki's "tired light" hypothesis can be connected to some form of dark matter as source of gravitation which shows up as CMBR resulting from sucking up energy from passing light. I have discussed dark matter in posts on Neo-Newtonian Gravitation.
The mathematical models of physics take the form of partial differential equations like Euler's Equations for incompressible inviscid fluid flow EE, corresponding Navier-Stokes equation for viscous flow NSE and Schrödinger's equations for atoms and molecules SE.
The task of a theoretical mathematician has been to prove by symbolic analytical techniques (i) existence, (ii) uniqueness and (iii) regularity of solutions to a given equation, with data given in some large class of possible data including initial data, forcing and parameters like the viscosity in NSE.
The task of a computational mathematician has been to compute solutions for specific choices of data which in each specific case can answer (i)-(iii) by inspection of the computed solution.
It has been argued that computation is not enough, even if for each specific choice of data (i)-(iii) can be answered, because only a limited number of specific choices can be inspected. The possibly very large class of data can thus never be exhausted by computation, which gives analytical symbolic mathematics a role to play by covering a large class of data.
It is natural to ask if there are examples of equations for which the class of relevant data is so small that it can be exhausted by computation. This means first that the equation cannot contain any parameter like viscosity. Are there any models of interest which are parameter-free? Inspection of EE and SE shows that they are both parameter-free, and so meet Einstein's requirement of an ideal mathematical model, opening the possibility of saying something about the world without measuring anything. This is like learning the area of a circular disc by computation with unit radius as only input.
Solving EE computationally thus delivers the drag of a body moving through a slightly viscous fluid such as air or water at subsonic speed, with the only data being the shape of the body and no viscosity parameter. This limits the class of data to shapes of bodies, with a limited range of shapes of interest to be covered by computation. This is all described here.
The case of SE, in its traditional form of Standard Quantum Mechanics StdQM, is troubled by the fact that SE, by its multi-dimensional nature, is uncomputable and so needs dimensional compression, which introduces parameters.
RealQM is a different realisation of the same parameter-free Hamiltonian as StdQM, in computable form without introduction of any parameter. RealQM thus expresses SE in parameter-free computable form and so opens the possibility of saying something about the atomic world without experimental input. RealQM thus computes the ground state of an atom with the only input being the number of electrons, and so can exhaust the Periodic Table.
An analytical estimate of ground state energy, obtained as the result of a longer or shorter sequence of successive bounds, can be seen as a form of symbolic computation, while a numerical computation can be seen as a very long arithmetic proof.
Computation with a parameter-free mathematical model can produce a rich set of outputs from very limited structural input, which can serve as data for AI in need of rich data. Computation is then used both to produce data and to learn from data. Symbolic mathematics has an important role in setting up the computation.
The Clay Institute Millennium Problem on (i)-(iii) for NSE is still open in the form of symbolic mathematics with no progress reported over 25 years. Can computation get the million dollar Prize?
Here is the next article in a sequence of articles exhibiting the capabilities of RealQM as an alternative to StdQM as the canon of modern physics:
Here is another article in a series of articles about RealQM to be submitted:
I am now preparing to submit a sequence of articles about Real Quantum Mechanics to relevant journals, and here is a first test to check out reactions:
The shift from the old quantum mechanics of Niels Bohr happened with Werner Heisenberg's July 1925 "Reinterpretation" article introducing a new form of matrix mechanics without Bohr's electron trajectories to form the new quantum mechanics.
Heisenberg was a student of Max Born at the University of Göttingen, with the mathematician Hilbert, world authority on mathematical physics, acting as Born's mentor. Göttingen mathematical physics was an important part of the "Weimar Renaissance" during Germany's recovery after the defeat in WWI, gaining momentum in 1925.
It is thus possible to give Heisenberg's new quantum mechanics political dimensions stretching into WWII with Heisenberg as leading scientist in Germany's quest for an atomic bomb in the "Uranium Club".
Planck had a similar role at the height of the German Empire when he in 1900 took on the responsibility of resolving the outstanding open physics problem of black-body radiation; to avoid failing he resorted to statistics of quanta, which resurfaced in Born's interpretation of Heisenberg's new quantum mechanics.
Heisenberg worked as Bohr's assistant in 1925 and Bohr invited Born to Copenhagen in 1926, and so they came to form the Bohr-Born-Heisenberg Copenhagen school setting the agenda for quantum mechanics from its beginning into our time.
Schrödinger entered in 1926 with his equation, shown to be equivalent to matrix mechanics, but Schrödinger did not accept the Copenhagen Interpretation and so left the field, to come back only in 1954 to discover that BBH still controlled the scene.
After WWII the US took over quantum mechanics still in the spirit of BBH.
All physicists of today say, following the idea of the mathematician von Neumann, that the wave function of quantum mechanics lives in a Hilbert space of infinite dimension, and most physicists will confess to the Copenhagen Interpretation even if its meaning is unclear. Schrödinger's request for physicality is met with "shut up and calculate".
What do the Chinese say? Is it time for Schrödinger to come back in the form of RealQM in the "China Renaissance" that is now reshaping the world? It is not impossible since the Chinese are very clever, very organised and result oriented towards a clear plan.
ChatGPT: After completing his doctoral work in Budapest and Zurich, von Neumann spent the academic year 1926–27 at Göttingen. There he
Studied under David Hilbert, attending Hilbert’s lectures on the mathematical foundations of quantum mechanics,
Sat in on Max Born’s seminar on the new quantum theory, and
Published his first quantum‐mechanical notes out of Göttingen later that year.
There are basic unresolved foundational "issues" with Standard Quantum Mechanics StdQM based on Schrödinger's Equation SE, basically issues with SE:
The Schrödinger equation describes the ground state of the Hydrogen atom by a wave function $\Psi (x)$, with $x$ a 3d spatial variable, which minimises the total energy (in Hartree units)
$$E(\Psi )=\frac{1}{2}\int\vert\nabla\Psi (x)\vert^2\,dx-\int\frac{\Psi^2(x)}{\vert x\vert}\,dx.$$
Schrödinger's equation SE in its linear multi-dimensional form with wave function solution $\Psi (x)$ depending on a $3N$-dimensional spatial variable $x$ for a system with $N$ electrons is viewed to be the foundation of quantum mechanics of atoms and molecules as Standard Quantum Mechanics StdQM.
Without SE, physics would be thrown back to 1924 with only Bohr's model of the atom in a form of classical physics. No physicist of today advocates that this would make any sense.
But SE has been subject to deep dispute since its formulation by Schrödinger in 1926 followed by Born's suggestion of giving the wave function $\Psi (x)$ a meaning by saying that $\vert\Psi (x)\vert^2$ represents the probability of an electron configuration described by $x$. This means that $\Psi (x)$ is given an epistemic meaning as what "a physicist can say", and not an ontic meaning as physics independent of what a physicist may have to say, as expressed by Bohr.
But Bohr's view was not satisfactory to many physicists who wanted to find an ontic meaning of $\Psi$ independent of what people may have to say, but nothing really convincing was ever found. The $3N$-dimensionality of $\Psi (x)$ defied real physical meaning and then the only option was an epistemic statistical meaning.
This means that the basic foundational problem of SE has never been resolved, despite intense debate over 100 years with no consensus in reach, except an agreement that after all the physical meaning of $\Psi (x)$ does not matter much, since it has been shown to always deliver predictions in full agreement with observations, but then predictions with unclear physical meaning, of course.
But there is an alternative to StdQM which could have been formulated in 1926, but for some reason was missed, see this post. This is RealQM, a different form of Schrödinger's equation: a non-linear system of one-electron wave functions $\psi_i(x)$, $i=1,\ldots,N$, with non-overlapping supports, depending on a common 3d space variable $x$, with $\vert\psi_i(x)\vert^2$ as physical charge density with direct ontic meaning.
Connecting to the previous post, RealQM can be seen as a Platonic/mathematical generalisation from $N=1$ to $N>1$ where the physicality for the Hydrogen atom with $N=1$ is kept. On the other hand, StdQM can be seen as an Aristotelian/pseudo-mathematical generalisation where physicality is lost.
When I present RealQM as an alternative to StdQM to quantum physicists and chemists I meet little understanding, indicating that the pseudo-mathematisation of StdQM has a very strong grip on minds. Nor do mathematicians show interest, because StdQM was already in 1933 translated by the mathematician von Neumann into an impressive abstract world of Hilbert spaces occupied by multi-dimensional wave functions subject to Hermitian operators satisfying axioms without physics.
But there is a good chance the situation can change in 2026 since RealQM is continuing to deliver new results in accordance with observations.
PS Bohr claimed that the purpose of quantum mechanics is to make predictions of experiments as testable events, not to compute e.g. the energies of the ground state or excited states of an atom regardless of whether any actual experiment is made. This opened to a confusion between deterministic computation of ground/excited states and the probabilistic outcome of an experiment, suggesting that the model behind the computation is itself of probabilistic nature, although it is in fact fully deterministic. There is nothing probabilistic about the spectrum of an atom as differences between energies of ground/excited states, neither in computation nor in experiment, except from external inputs.
This is inspired by a very informative article by Matthew Ehret: Unravelling The Jesuit Enigma.
The Scientific Revolution is viewed as a triumph of Platonic mathematization over Aristotelian natural philosophy expressed in classical Newtonian mechanics based on the Calculus of Leibniz and Newton.
But in modern physics based on Quantum Mechanics QM, the roles appear to have shifted.
Classical mechanics as macroscopic physics with causality is based on clear physical principles and logical argumentation, and is understandable in theory and very useful in practice. Together with the computer, classical mechanics is a formidable tool and machine.
QM as microscopic physics without causality is based on evasive principles, lacks logic and is understood only by its high priests of Nobel Laureates in Physics, to be used by all others under the command "Shut up and calculate" as the foundation of modern information society for all to accept. In this respect it connects to the Jesuit Priest Ignatius Loyola's view on education as expressed by Bertrand Russell:
But there cannot be a dichotomy between macro and micro, it must all be Platonic. That is the idea of RealQM.
An important part of 20th century mathematics has been devoted to the analysis of partial differential equations PDEs as concerns (i) existence and (ii) regularity of solutions. A PDE is a continuum model with infinitely many degrees of freedom.
Proofs of existence typically start from a priori bounds on solutions to regularised equations, for which existence of solutions is settled, and then obtain solutions of the original equation through a limit process.
The main components of an existence proof are the a priori bounds, which can require complicated and lengthy mathematical analysis.
Once existence of solutions is proved, further mathematical analysis can prove properties of solutions, typically as bounds on derivatives showing regularity. Again the analysis can be complicated and lengthy.
A famous challenge in the form of a Clay Millennium Prize Problem is to give an analytical proof of existence and regularity of solutions to the Navier-Stokes equations for incompressible fluid flow. No progress on this open problem has been reported since 2000.
But there is a different approach to (i) and (ii) in terms of computation, where in each given case an approximate solution to the equations is computed in a step-by-step manner after discretisation of the PDE into a finite number of degrees of freedom, which can be processed by numerical linear algebra. The computational process either halts or delivers after a finite number of steps an approximate solution, which can be inspected a posteriori as to its qualities. It is thus possible to evaluate in what sense the approximate solution satisfies the PDE, and to accept or recompute with a better discretisation.
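The compute-inspect-recompute loop can be sketched on a model problem (my own toy example: the 1d Poisson equation with a known exact solution playing the role of the a posteriori inspection):

```python
import numpy as np

# Solve -u'' = f on (0,1) with u(0) = u(1) = 0 by finite differences,
# for f(x) = pi^2*sin(pi*x) with exact solution u(x) = sin(pi*x).
# Inspect the computed solution a posteriori and, if not good enough,
# recompute with a better discretisation.

def solve_poisson(n):
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    f = np.pi**2 * np.sin(np.pi * x)
    # tridiagonal matrix for -u'' (dense here for brevity)
    A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
    return x, np.linalg.solve(A, f)

n = 8
while True:
    x, u = solve_poisson(n)
    err = np.max(np.abs(u - np.sin(np.pi * x)))  # inspection step
    if err < 1e-4:
        break
    n *= 2  # recompute with a finer mesh
```

The process delivers, after a finite number of refinement steps, an approximate solution whose quality has been inspected, not assumed.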
We can thus meet a fundamental difference:
With the help of chatGPT I have learned about the heroic work by Dyson-Lenard and Lieb-Thirring to mathematically prove Stability of Matter SM within Standard Quantum Mechanics StdQM and Density Functional Theory DFT, which boils down to very intricate book-keeping to prevent collapse of potential energy to minus infinity by local accumulation of electron charge densities. The main difficulty is the overlap in StdQM/DFT of electron wave functions with global support. The proof is lengthy and complicated and not easy to follow. It is not part of text books/courses in QM, even if completely fundamental.
It is natural to ask how it can be so difficult to prove SM within StdQM/DFT, when SM is such a basic property of the physics modelled by StdQM. Does real physics also have to handle intricate bookkeeping to avoid collapse?
Or is the proof difficulty of SM within StdQM/DFT yet another indication that there is something seriously unphysical with StdQM connecting to the difficulty of giving StdQM a physical meaning? Seems so.
On the other hand, SM within RealQM directly follows from the stability of the Hydrogen atom, with potential energy dominated by kinetic energy, using the additive form of RealQM with a global wave function as a sum of one-electron wave functions with local non-overlapping supports.
RealQM is a physical model with SM safely built in mathematically. StdQM is an unphysical model with SM basically a mathematical miracle. SM with RealQM could be an essential part of even introductory texts/courses in QM.
According to chatGPT, SM is by physicists viewed as "settled" once and for all by Dyson et al, and it is not considered meaningful to teach the proof since it is so difficult and non-illuminating. The advice to students appears to be to just accept SM and not ask for any justification. Seems a bit strange...
Summary:
This is a typical progression as concerns fundamental problems in StdQM: 1. State the problem as fundamental (interpretation, measurement, complementarity...). 2. Realise that the problem cannot be resolved. 3. Claim that there are solutions, but very difficult to understand. 4. Decide that the fundamental problem has been "settled" and that there is nothing more to say. 5. Declare that it is sufficient to know that the problem has been solved, and that asking why is not part of physics education.
A fundamental conception of atom physics is that the electrons surrounding an atomic kernel are arranged in a sequence of shells $S_n$ for $n=1,2,3,...$ with $S_n$ containing $2n^2$ electrons when filled, which gives the Periodic Table with periods 2, 8, 8, 18, 18, 32, ... including repetitions.
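The shell counting can be written out (a small sketch of mine; the period-length formula $2\lceil (k+1)/2\rceil^2$ is the standard closed form for the observed pattern of repetitions):

```python
# Shell S_n holds 2*n^2 electrons when filled; the lengths of the
# periods of the Periodic Table repeat each capacity (except the first)
# according to 2*ceil((k+1)/2)^2 for period k, giving 2, 8, 8, 18, ...
shell_capacity = [2 * n**2 for n in range(1, 5)]
# shell_capacity == [2, 8, 18, 32]

period_length = [2 * (-(-(k + 1) // 2)) ** 2 for k in range(1, 8)]
# period_length == [2, 8, 8, 18, 18, 32, 32]
```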
A fundamental question in Standard Quantum Mechanics StdQM is whether the shell structure of the Periodic Table is carried by solutions of the Schrödinger equation for the atom. Can an answer be given when such solutions are uncomputable, because they involve $3N$ spatial dimensions for an atom with $N$ electrons?
Standard Quantum Mechanics StdQM based on Schrödinger's equation SE with standard interpretation of a Hamiltonian acting on wave functions with $3N$ spatial dimensions for a system with $N$ electrons, has only statistical meaning and is computable only for very small $N$, thus can be said to be non-physical and uncomputable.
Density Functional Theory DFT seeks to reduce StdQM by averaging $3N$-dimensional wave functions into a single electron charge density $\rho (x)$ depending on a 3-dimensional coordinate $x$, identifying ground states of StdQM with DFT densities satisfying a reduced SE with a Hamiltonian only implicitly determined, which therefore has to be approximated. DFT is the main computational method currently available for $N>100$.
RealQM is based on a different interpretation of the Hamiltonian of SE acting on a wave function $\Psi (x)$ as a sum
$$\Psi (x)=\sum_{n=1}^N\psi_n(x)$$
of one-electron wave functions $\psi_n(x)$ with non-overlapping supports depending on a common 3d spatial variable, which meet at a Bernoulli free boundary with continuity and zero normal derivative. The corresponding electron charge density $\rho (x)$ is the sum
$$\rho (x)=\sum_{n=1}^N\psi_n(x)^2.$$
This is a condensation of the preceding sequence of posts with a clear and simple message.
Stability of bulk matter as a collection of a large number $N$ of atoms, is on a basis of Standard Quantum Mechanics StdQM considered to be very difficult to prove mathematically, as evidenced in the work by Dyson-Lenard and Lieb-Thirring. Stability is expressed by a lower bound on total energy scaling with $N$, making total energy an extensive quantity.
The difficulty is that electrons in StdQM have global support and so in principle can interact with many kernels, giving a lower bound scaling with a power of $N$ possibly bigger than 1, with total energy then tending to minus infinity.
On the other hand, with RealQM as an alternative to StdQM, stability of bulk matter directly follows from atomic stability, which is a simple consequence of the Hardy inequality (see this post):
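A sketch of such a Hardy-based argument (my own reconstruction, in Hartree units for a normalised wave function $\psi$ and kernel charge $Z$; the constants are not optimal): with $T=\int\vert\nabla\psi\vert^2dx$, Hardy's inequality in $R^3$ states
$$T\ge \frac14\int\frac{\psi^2}{\vert x\vert^2}\,dx,$$
so that by Cauchy-Schwarz
$$Z\int\frac{\psi^2}{\vert x\vert}\,dx\le Z\Big(\int\frac{\psi^2}{\vert x\vert^2}\,dx\Big)^{1/2}\Big(\int\psi^2\,dx\Big)^{1/2}\le 2Z\,T^{1/2},$$
and hence
$$E(\psi )=\frac12 T-Z\int\frac{\psi^2}{\vert x\vert}\,dx\ge \frac12 T-2Z\,T^{1/2}\ge -2Z^2,$$
since $t\mapsto \frac12 t-2Z\sqrt{t}$ is minimised at $\sqrt{t}=2Z$. Total energy is thus bounded below, uniformly over admissible $\psi$.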
This is a clarification of previous posts on Stability of Matter connecting to the complex mathematical analysis of Lieb-Thirring-Dyson-Lenard.
We noticed that the total energy $E(\Psi )$ of any admissible wave function $\Psi (x)$ for the Hydrogen atom with $Z=N=1$ satisfies
This is a follow up of previous posts on the basic problem of Stability of Matter.
The Lieb-Thirring/Dyson-Lenard proof of the scaling $E(Z) \sim -Z^{\frac{7}{3}}$ of the ground state energy $E(Z)$ of an atom with kernel charge $Z$, is viewed to be a master-piece of rigorous mathematics revealing a deep truth of quantum mechanics.
But empirical observations of total energies of atoms show a slower scaling of $-Z^\alpha$ with $\alpha \approx 1.81$ according to the following compilation by chatGPT:
The Lieb-Thirring proof is based on the Thomas-Fermi model and not on the Schrödinger equation of Standard Quantum Mechanics StdQM, suggesting that the Thomas-Fermi model does not capture the physics. But there is no theoretical proof based on StdQM as the canonical model. This is a complement to two previous posts.
According to the Bohr model, a Hydrogen atom consisting of an electron orbiting a proton kernel does not have stable existence over time, because a moving electron radiates energy and so loses kinetic energy, falls into the Coulomb potential well of the proton and ceases to exist.
But the orbit of a planet around a Sun can have stable existence, with negative gravitational potential energy balanced by positive kinetic energy, because the planet is not radiating.
In 1926 Schrödinger invented a new model in terms of a wave function $\Psi (x)$ depending on a 3d space coordinate $x$, with a new form of energy measured by $\vert\nabla\Psi (x)\vert^2$ referred to as "kinetic energy", although only spatial gradients without reference to motion were involved. The Hydrogen atom was then represented by a wave function $\Psi (x)$ minimising total energy as Coulomb potential energy + "kinetic energy", with $\Psi^2(x)$ representing electron charge density. Basic Calculus showed the existence of a minimiser as a stable ground state.
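That Calculus argument can be sketched as follows (my reconstruction in Hartree units, restricting to the normalised hydrogenic trial family $\Psi_a(x)=(\pi a^3)^{-1/2}e^{-\vert x\vert /a}$, $a>0$; the restriction loses nothing here since the exact ground state belongs to this family):
$$E(a)=\frac{1}{2a^2}-\frac{1}{a},\qquad \frac{dE}{da}=-\frac{1}{a^3}+\frac{1}{a^2}=0\;\Rightarrow\; a=1,$$
with minimal total energy
$$E(1)=-\frac12$$
in Hartree units, the ground state energy of the Hydrogen atom ($-13.6$ eV).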
Schrödinger's model can alternatively be interpreted in terms of classical continuum mechanics as the equilibrium configuration of a (spherically symmetric) elastic substance in a central potential minimising total energy as potential energy + elastic energy.
Schrödinger's model thus showed existence of a stable ground state of the Hydrogen atom and was greeted as the start of modern physics in the form of quantum mechanics with a first challenge to extend Schrödinger's model to atoms with $N>1$ electrons.
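The stable ground state can also be exhibited numerically (my own finite-difference sketch in Hartree units; the exact value is $E=-1/2$):

```python
import numpy as np

# Radial Schrodinger equation for Hydrogen (Hartree units): with
# u(r) = r*Psi(r), the spherically symmetric state satisfies
# -u''/2 - u/r = E*u with u(0) = 0. Finite differences give a symmetric
# eigenvalue problem whose smallest eigenvalue approximates E = -0.5.

n, rmax = 1500, 30.0
h = rmax / (n + 1)
r = np.linspace(h, rmax - h, n)
diag = 1.0 / h**2 - 1.0 / r            # -u''/2 diagonal + Coulomb term
off = -0.5 / h**2 * np.ones(n - 1)     # -u''/2 off-diagonal
H = np.diag(diag) + np.diag(off, 1) + np.diag(off, -1)
E0 = np.linalg.eigvalsh(H)[0]          # approx -0.5 (Hartree)
```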
But here history took a strange turn and settled for a wave function depending on $N$ 3d variables, in total $3N$ spatial variables, forming Standard Quantum Mechanics StdQM as the foundation of modern physics. The more natural extension into $N$ wave functions in a common 3d variable was missed, and has been explored only recently in the form of RealQM. See the comment by Lieb and Seiringer in this post.
In RealQM stability of atoms with $N>1$ is guaranteed in the same way as in the case $N=1$ by the presence of "kinetic/elastic energy" measured by $\vert\nabla\Psi (x)\vert^2$.
Naming the spatial gradient measure $\vert\nabla\Psi (x)\vert^2$ "kinetic energy" has caused a lot of confusion, by suggesting that electrons somehow move around the kernel as if Bohr's model was in fact functional, although they cannot do that without radiating into collapse. The electron charge density of the ground state of an atom is stationary in space.
Note that the "kinetic energy" with density $\vert\nabla\Phi (x)\vert^2$ originating from the presence of the Laplacian differential operator in Schrödinger's equation representing a regularisation scaling with Planck's constant, coming with added stability as a familiar tool in the mathematical theory of differential equations, where the specifics of the regularisation does not have larger scales effects. The Schrödinger equation thus expresses a regularised form of Coulomb interaction with independence of the absolute scale of Planck's constant: The World would look the same with a different Planck constant.
A basic task of theoretical physics is to prove stability of matter in the sense that the total energy $E$ of the ground state of an atom is bounded below, which signifies that the electrons surrounding the atomic kernel do not collapse into the kernel with energy going to minus infinity.
In Standard Quantum Mechanics StdQM this is viewed to be difficult to prove with a first 100-page proof by Dyson and Lenard in 1967 followed by a somewhat shorter proof by Lieb and Thirring in 1975 showing a lower bound scaling with $-Z^{\frac{7}{3}}$ for an atom with kernel charge $Z>1$.
For $Z=1$ the proof is straight-forward as recalled in the previous post.
In RealQM the proof for $Z>1$ is essentially the same as for $Z=1$ and so gives a lower bound scaling with $-Z^2$, which is to be expected from a simple dimensional analysis, and is also what is observed.
In RealQM an atom has a shell structure with stability established successively from innermost shell with the kernel to the next shell with reduced kernel charge and so on. Further, the stability of molecules in the sense of non-collapse naturally follows from stability of atoms.
The fact that stability of matter is so far-fetched in StdQM ($-Z^{\frac{7}{3}}$) and so direct in RealQM ($-Z^2$) gives evidence that RealQM may be closer to real physics than StdQM.
The Lieb-Thirring proof extends to many kernels with total charge Z, while we here think of just one atom or molecule.
Stability of large collections of atoms or molecules ($Z=10^{23}$) is studied in thermodynamics, then without quantum mechanics because it does not make sense.
The famous mathematician Elliott Lieb devotes the book The Stability of Matter in Quantum Mechanics (311 pages, together with the physicist Robert Seiringer) to proving that the total energy $E(\Psi )$ of Standard Quantum Mechanics StdQM in its $3N$-dimensional form for a neutral atom with $N$ electrons is bounded below for all admissible wave functions $\Psi$. This means that an atom has a ground state as the solution to an eigenvalue problem, with the smallest eigenvalue as minimal total energy including electronic kinetic energy (positive) and potential energy (negative), with electron-electron repulsion energy adding to stability.
The case $N=1$ has a direct proof based on a well-known Poincaré inequality showing that positive kinetic energy dominates negative potential energy, giving a lower bound of total energy (see below). The rest of the book is devoted to the extension to $N>1$ using a Lieb-Thirring inequality and a Baxter-Onsager inequality involving a common electron density $\rho (x)$ depending on $x\in R^3$, with $R^3$ 3d Euclidean space.
Let us check out what RealQM, as an alternative to StdQM, has to say about stability of matter. We recall that the wave function $\Psi (x)$ of RealQM depends on $x\in R^3$ as above and has the form of a sum
$$\Psi (x)=\sum_{n=1}^N\psi_n(x),$$
where the $\psi_n(x)$ are real-valued one-electron wave functions with non-overlapping supports $\Omega_n$ in a decomposition of 3d space, which meet at a Bernoulli free boundary with continuity and zero normal derivative, and with corresponding charge densities $\rho_n(x)=\psi_n(x)^2$ with unit charge:
$$\int_{\Omega_n}\rho_n(x)\,dx=1\quad\text{for }n=1,\ldots,N.$$
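The free-boundary conditions can be illustrated in a 1d toy model (entirely my own construction, not a RealQM computation): two one-electron wave functions with non-overlapping supports meeting with continuity and zero normal derivative, each carrying unit charge.

```python
import numpy as np

# Two wave functions on the supports [0, 1/2] and [1/2, 1], meeting at
# the free boundary x = 1/2 with equal values and zero derivative, each
# normalised so that its charge density integrates to 1 on its support.

def trapezoid(y, h):
    return h * (np.sum(y) - 0.5 * (y[0] + y[-1]))

h = 0.5 / 2000
x1 = np.linspace(0.0, 0.5, 2001)
x2 = np.linspace(0.5, 1.0, 2001)
psi1 = 2.0 * np.sin(np.pi * x1)          # psi1(1/2) = 2, psi1'(1/2) = 0
psi2 = 2.0 * np.sin(np.pi * (1.0 - x2))  # psi2(1/2) = 2, psi2'(1/2) = 0

q1 = trapezoid(psi1**2, h)  # unit charge on [0, 1/2]
q2 = trapezoid(psi2**2, h)  # unit charge on [1/2, 1]
```

The sum psi1 + psi2 is then a continuous global wave function built from locally supported pieces, in the spirit of the decomposition above.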
chatGPT said:
In quantum many-body physics you want the ground-state energy $E_{N,K}$ of a system with
$N$ electrons (charge $-e$) and
$K$ nuclei with total charge $+Ze$
to obey a linear lower bound
$$E_{N,K}\ge -C(Z)(N+K)\qquad\text{("stability of the second kind")},$$
i.e. the energy per particle must stay bounded when you add more particles.
Classically this is not obvious – Coulomb attraction grows like $N^{5/3}$ for a compact cloud – so nothing prevents matter from collapsing. Quantum mechanics gives two tools to fight the collapse:
Heisenberg uncertainty principle – localising electrons costs kinetic energy $\sim\ell^{-2}$.
Pauli exclusion principle – antisymmetry forces electrons to occupy ever higher momentum states, adding another $N^{5/3}$-type repulsion.
The challenge in the 1960s was to turn these ideas into a rigorous inequality for all $N,K$.
| Paper | Main statement | Key ingredients |
|---|---|---|
| F.J. Dyson & A. Lenard, Stability of Matter I, J. Math. Phys. 8 (1967) 423–434 | First proof that a non-relativistic Coulomb system is stable provided the negative charges are fermions. | Cell decomposition of space, local uncertainty inequalities, electrostatic screening estimate. |
| A. Lenard & F.J. Dyson, Stability of Matter II, J. Math. Phys. 9 (1968) 698–711 | Extends the bound to the full neutral system (moving nuclei) and trims technical gaps. | Refined combinatorial counting of cells and careful handling of nucleus–electron correlations. |
Linear bound
$$E_{N,K}\ge -C_{DL}(N+K),\qquad C_{DL}\approx 10^{15}$$
– enormous, but finite.
Necessity of Fermi statistics
Dyson also proved that bosonic charges are unstable:
$$E_N\sim -\,\mathrm{const}\cdot N^{7/5}$$
– energy becomes super-extensive.
Introduced the idea that the Pauli principle provides just enough extra kinetic energy ($\int\rho^{5/3}$) to counteract the $\int\rho^{4/3}$ Coulomb attraction.
Partition space into cubes (“cells”) whose size adapts to the local electron density.
Localise each electron in its cube. Heisenberg then gives a kinetic-energy cost
$$T_{cell}\gtrsim \frac{\hbar^2}{2m}\,n_{cell}^{5/3}\,\ell^{-2}.$$
Estimate Coulomb energy inside and between cells by an electrostatic inequality that replaces the full interaction with a one-body Hartree term plus controllable error.
Sum over cells and optimise $\ell$. The fermionic kinetic term scales like $N^{5/3}\ell^{-2}$ while the worst negative Coulomb piece scales like $N^{5/3}\ell^{-1}$. Choosing $\ell$ proportional to $N^{-1/3}$ leaves a net contribution that is linear in $N$.
Handle nuclei by showing that electrons screen nuclear charge beyond an O(1) distance, preventing additional collapse.
The proof is long (about 100 dense pages over the two papers) because every step has to be made uniform in $N,K$.
First complete answer to a question posed by Fisher & Ruelle in 1966 (they famously promised a bottle of champagne for a proof).
Showed that statistics, not just uncertainty, is essential for the existence of ordinary matter.
Provided the technical backbone for later, cleaner approaches – most notably Lieb & Thirring's 1975 proof, which replaced the cell decomposition by the now-famous Lieb–Thirring kinetic-energy inequality and reduced the constant $C$ to something physically reasonable.
1975 – Lieb–Thirring: conceptual two-line proof using their spectral inequality; constant $C\sim Z^{7/3}$.
1972-2020: Extensions to relativistic electrons, magnetic fields, QED vacuum polarisation, anyons, etc. (many references in the review literature).
Comprehensive modern exposition: Lieb & Seiringer, The Stability of Matter in Quantum Mechanics, CUP 2010.
Dyson and Lenard proved that the simple Hamiltonian
$$H=\sum_{i=1}^N\frac{p_i^2}{2m}-\sum_{i,I}\frac{Ze^2}{\vert x_i-R_I\vert}+\sum_{i<j}\frac{e^2}{\vert x_i-x_j\vert}+\sum_{I<J}\frac{Z^2e^2}{\vert R_I-R_J\vert}$$
admits a ground-state energy bounded below by $-C(N+K)$ because electrons are fermions. Without that antisymmetry, matter would collapse. Their papers thus supplied the first mathematically rigorous explanation of why tables, rocks and people do not implode under their own electromagnetic forces.
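The potential-energy terms of this Hamiltonian can be evaluated directly for a given classical configuration (an illustrative helper of mine, in units $e=1$; the quantum kinetic term $\sum p_i^2/2m$ is of course not included):

```python
import itertools, math

def dist(a, b):
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

def coulomb_energy(electrons, nuclei, Z):
    """Sum of the three Coulomb terms of the Dyson-Lenard Hamiltonian:
    nucleus-electron attraction, electron-electron repulsion and
    nucleus-nucleus repulsion, in units e = 1."""
    attraction = -sum(Z / dist(x, R) for x in electrons for R in nuclei)
    ee = sum(1.0 / dist(x, y)
             for x, y in itertools.combinations(electrons, 2))
    nn = sum(Z * Z / dist(R, S)
             for R, S in itertools.combinations(nuclei, 2))
    return attraction + ee + nn

# one electron at unit distance from one Z = 1 nucleus: V = -1
V = coulomb_energy([(1.0, 0.0, 0.0)], [(0.0, 0.0, 0.0)], 1)
```

The stability question is whether, once the quantum kinetic energy is added, the total stays bounded below by $-C(N+K)$ as particles are added.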