A century of physics for objects that don't behave like objects. From the 1900 Planck quantum to the 2022 Bell Nobel — what the theory says, what it means, and what is still genuinely open.
The mathematical theory that correctly predicts the behaviour of atoms, molecules, photons, and everything built from them. Also: a philosophical mess.
Quantum mechanics is the most empirically successful theory in the history of physics. Predictions confirmed to twelve decimal places. The transistor, the laser, the LED, the MRI machine, the GPS satellite — all depend on quantum mechanics behaving as predicted.
It is also the theory whose interpretation has been argued over for a hundred years and is no closer to settlement. What the wave function "is", what happens during measurement, whether reality is local — these remain open. As Sean Carroll puts it: "We don't understand quantum mechanics."
This deck covers the empirical content (what the theory predicts), the formal content (what the equations say), the historical development, the experiments, and the live interpretive debates.
December 14, 1900. Max Planck presents to the German Physical Society a derivation of the blackbody radiation spectrum. To make the math work, he assumes energy is exchanged in discrete units — quanta — of size E = hν, where h is a new constant.
E = hν, where h = 6.626 × 10⁻³⁴ J·s
Planck himself thought the quantisation was a mathematical trick, not physical reality. Einstein's 1905 paper on the photoelectric effect treated it as physical: light is genuinely particulate, in packets of energy hν. Einstein's Nobel was for this work, not relativity.
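A back-of-envelope version of the photoelectric argument, using sodium's ~2.28 eV work function as an illustrative value: photons below the threshold frequency eject nothing, however intense the beam.

```python
# Photon energy E = h*nu and the photoelectric condition E_kin = h*nu - W.
h = 6.626e-34        # Planck constant, J*s
c = 2.998e8          # speed of light, m/s
eV = 1.602e-19       # joules per electronvolt
W_sodium = 2.28 * eV # illustrative work function for sodium, ~2.28 eV

for wavelength_nm in (700, 550, 400, 250):
    nu = c / (wavelength_nm * 1e-9)   # photon frequency, Hz
    E = h * nu                        # photon energy, J
    E_kin = E - W_sodium              # max kinetic energy of ejected electron
    status = f"{E_kin / eV:.2f} eV electron" if E_kin > 0 else "no emission"
    print(f"{wavelength_nm} nm: photon {E / eV:.2f} eV -> {status}")
```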
The years 1900–1925 were the era of the "old quantum theory" — Planck, Einstein, and Bohr made progress with what amounted to ad hoc additions to classical physics. The theoretical foundations came later.
Niels Bohr's 1913 model of the hydrogen atom proposed that electrons orbit only at specific radii where their angular momentum is an integer multiple of ℏ = h/2π. Transitions between orbits emit or absorb photons of definite energy — explaining the Balmer spectrum of hydrogen with stunning precision.
L = nℏ, n = 1, 2, 3, ...
The Bohr model was wrong about almost everything (electrons don't have classical orbits) but right about the discrete energy levels. The 1922 Nobel followed.
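A minimal numerical check, assuming only the Bohr level formula Eₙ = −13.6 eV / n²: the n → 2 transitions reproduce the visible Balmer wavelengths.

```python
# Bohr energy levels E_n = -13.6 eV / n^2 and the Balmer series (n -> 2),
# which lands in the visible spectrum.
h = 6.626e-34       # Planck constant, J*s
c = 2.998e8         # speed of light, m/s
eV = 1.602e-19      # joules per electronvolt
E_rydberg = 13.606  # hydrogen ground-state binding energy, eV

def energy(n):
    return -E_rydberg / n**2   # Bohr level, eV

for n in range(3, 7):
    delta_E = (energy(n) - energy(2)) * eV       # photon energy, J
    wavelength_nm = h * c / delta_E * 1e9
    print(f"n={n} -> 2 : {wavelength_nm:6.1f} nm")
# Prints ~656, 486, 434, 410 nm: the visible Balmer lines.
```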
The Copenhagen Institute Bohr founded became the centre of theoretical physics for a generation. Heisenberg, Pauli, Dirac, Schrödinger all visited or worked there. The "Copenhagen interpretation" is named for it.
The mature theory came in two years. Werner Heisenberg's 1925 matrix mechanics — observables become non-commuting matrices. Erwin Schrödinger's 1926 wave equation — particles described by a wave function ψ evolving according to a deterministic equation:
iℏ ∂ψ/∂t = Ĥψ
The two formulations were initially thought to be rivals. Schrödinger himself proved (1926) that they are mathematically equivalent. Paul Dirac's 1930 textbook unified them; his 1928 relativistic equation (the Dirac equation) predicted antimatter four years before the positron was detected.
Max Born's probability interpretation (1926): |ψ|² is the probability density of finding the particle. Born's Nobel didn't come until 1954.
Heisenberg's uncertainty principle (1927): for non-commuting observables (position and momentum, energy and time), there is a fundamental limit to how precisely both can be known.
ΔxΔp ≥ ℏ/2
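A numerical sketch for the state that saturates the bound, a Gaussian wave packet. The grid size and packet width are arbitrary; the momentum spread comes from the packet's Fourier transform.

```python
import numpy as np

hbar = 1.0                        # natural units
x = np.linspace(-20, 20, 4096)
dx = x[1] - x[0]
sigma = 1.5                       # arbitrary packet width

# Gaussian wave packet, normalised on the grid
psi = np.exp(-x**2 / (4 * sigma**2))
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

# Position spread from |psi(x)|^2
prob_x = np.abs(psi)**2 * dx
delta_x = np.sqrt(np.sum(x**2 * prob_x) - np.sum(x * prob_x)**2)

# Momentum spread from the Fourier transform of psi
p = 2 * np.pi * hbar * np.fft.fftfreq(x.size, d=dx)
prob_p = np.abs(np.fft.fft(psi))**2
prob_p /= prob_p.sum()            # normalise as a discrete distribution
delta_p = np.sqrt(np.sum(p**2 * prob_p) - np.sum(p * prob_p)**2)

print(delta_x * delta_p, ">=", hbar / 2)   # ~0.5000: a Gaussian saturates the bound
```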
The textbook demonstration. Fire electrons (or photons, neutrons, large molecules) one at a time at a barrier with two slits. Each one lands as a single point on the detector — particle behaviour. After many particles, an interference pattern emerges — wave behaviour.
Richard Feynman called it "a phenomenon which is impossible, absolutely impossible, to explain in any classical way, and which has in it the heart of quantum mechanics."
The interference disappears if you measure which slit each particle passes through — the measurement destroys the superposition. Quantum behaviour requires preserving the superposition until the final detection.
Buckminsterfullerene (C₆₀, 60-atom molecules) shows interference (Markus Arndt's group, 1999). The current record is molecules of ~25,000 atomic mass units (Fein et al. 2019). Whether macroscopic objects can interfere is limited in practice by decoherence, not in principle.
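A minimal sketch of the buildup, assuming an idealised far-field two-slit pattern (intensity ∝ cos² of π·d·x/λL, single-slit envelope ignored; slit spacing, wavelength, and screen geometry are arbitrary). Each simulated particle lands at one point sampled from |ψ|²; the fringes appear only in the accumulated histogram.

```python
import numpy as np

rng = np.random.default_rng(0)
wavelength, slit_sep, screen_dist = 1.0, 10.0, 1000.0   # arbitrary units
x = np.linspace(-300, 300, 1201)                         # screen positions

# Far-field two-slit pattern: intensity ~ cos^2(pi * d * x / (lambda * L))
intensity = np.cos(np.pi * slit_sep * x / (wavelength * screen_dist)) ** 2
prob = intensity / intensity.sum()        # Born rule: hits sampled from |psi|^2

# Each particle lands at ONE point; wave behaviour shows up only in the totals.
hits = rng.choice(x, size=50_000, p=prob)
counts, _ = np.histogram(hits, bins=60)
print(counts)    # alternating high/low bins = interference fringes
```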
Modern quantum mechanics rests on five postulates:
1. State vector. A physical system's state is fully described by a vector ψ in a complex Hilbert space.
2. Observables. Each measurable quantity corresponds to a Hermitian operator. The eigenvalues of the operator are the possible measurement outcomes.
3. Born rule. The probability of measuring a given eigenvalue is the squared amplitude of ψ projected onto the corresponding eigenstate.
4. Schrödinger evolution. Between measurements, the state evolves unitarily: iℏ ∂ψ/∂t = Ĥψ.
5. Measurement collapse. Upon measurement, the state collapses to the eigenstate corresponding to the observed eigenvalue.
Postulates 4 and 5 contradict each other. Unitary evolution is deterministic and reversible; collapse is stochastic and irreversible. The "measurement problem" asks: when does Schrödinger evolution stop and Born-rule collapse begin?
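A minimal two-level sketch of the five postulates and of the tension between 4 and 5, using an arbitrary illustrative state and Hamiltonian (pure NumPy):

```python
import numpy as np

rng = np.random.default_rng(1)

# Postulate 1: a state vector in a (here 2-dimensional) Hilbert space, normalised.
psi = np.array([3, 4j], dtype=complex) / 5.0

# Postulate 2: an observable is a Hermitian operator; here spin-z (sigma_z).
sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)
eigvals, eigvecs = np.linalg.eigh(sigma_z)        # possible outcomes, eigenstates

# Postulate 3 (Born rule): P(outcome) = |<eigenstate|psi>|^2.
probs = np.abs(eigvecs.conj().T @ psi) ** 2
print("outcome probabilities:", probs, "sum:", probs.sum())

# Postulate 4: between measurements, deterministic unitary evolution
# psi(t) = exp(-iHt/hbar) psi(0), with an arbitrary Hermitian H.
H = np.array([[1.0, 0.5], [0.5, -1.0]], dtype=complex)
hbar, t = 1.0, 0.7
evals, evecs = np.linalg.eigh(H)
U = evecs @ np.diag(np.exp(-1j * evals * t / hbar)) @ evecs.conj().T
psi_t = U @ psi                                   # norm preserved, fully reversible

# Postulate 5: a measurement picks ONE eigenvalue at random (Born weights)
# and collapses the state onto that eigenstate. Stochastic, irreversible.
probs_t = np.abs(eigvecs.conj().T @ psi_t) ** 2
outcome = rng.choice(len(eigvals), p=probs_t / probs_t.sum())
psi_collapsed = eigvecs[:, outcome]
print("measured:", eigvals[outcome], "| post-measurement state:", psi_collapsed)
```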
Quantum systems can exist in superpositions of distinct states. An electron can be "spin up" and "spin down" simultaneously, in some weighted combination. Until measured, it has no definite spin.
This is not "we don't know which" classical ignorance. It is a real physical state in which both spin values are simultaneously potential. The interference pattern in the double-slit experiment requires this — the electron must take both paths at once.
The most famous illustration: Schrödinger's cat (1935). A radioactive atom in a superposition of decayed/not-decayed is coupled to a vial of poison via a Geiger counter. The cat ends up in a superposition of alive/dead. Schrödinger meant this as a reductio ad absurdum — surely cats aren't superposed. But the math says they are, until measured.
If two particles interact, their states can become entangled — they no longer have separate descriptions. Measuring one immediately determines the other, no matter how far apart.
Einstein, Podolsky, and Rosen (1935) argued this was a problem: either there is "spooky action at a distance", or the particles carry hidden variables that quantum mechanics fails to describe. They concluded quantum mechanics must be incomplete.
For thirty years no one knew how to test this. Then John Bell.
|ψ⟩ = (1/√2)(|↑↓⟩ - |↓↑⟩)
The singlet state — two particles whose total spin is zero. Measure one as up, the other is down. Always. The correlations exceed what any local hidden-variable theory can produce. Bell's theorem (next page) proves this rigorously.
John Bell, 1964 paper "On the Einstein-Podolsky-Rosen paradox." Bell proved that any local realistic theory makes predictions that contradict quantum mechanics — and that the contradiction is testable.
The core inequality: a particular sum of correlations between measurement outcomes at distant locations is bounded. In the CHSH form, any local hidden-variable theory gives |S| ≤ 2; quantum mechanics predicts values up to 2√2 ≈ 2.83, violating the bound.
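Concretely, for the singlet state the quantum correlation between spin measurements along angles a and b is E(a, b) = −cos(a − b); at the standard CHSH angle choices the combination S reaches 2√2. A sketch of that arithmetic (not a simulation of any experiment):

```python
import numpy as np

def E(a, b):
    # Quantum correlation for spin measurements along angles a, b
    # on the singlet state: E(a, b) = -cos(a - b).
    return -np.cos(a - b)

# Standard CHSH angle choices (radians)
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))              # 2*sqrt(2) ~= 2.828
print("local bound:", 2)   # any local hidden-variable model keeps |S| <= 2
```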
Alain Aspect's 1982 experiments at Orsay were the first decisive empirical test — and quantum mechanics won. Subsequent loophole-free tests (Hensen et al. 2015, Giustina et al. 2015) closed the remaining experimental escape routes.
2022 Nobel in Physics: Aspect, John Clauser, Anton Zeilinger, "for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science."
The world is non-local. Or non-realist. Or both. Local realism is empirically dead.
When does the wave function collapse? The Schrödinger equation describes deterministic, unitary, reversible evolution. The Born rule describes stochastic, irreversible collapse. They cannot both apply at the same time. So when does one stop and the other begin?
Possible answers, none fully satisfying:
The Copenhagen view (Bohr, Heisenberg): collapse happens at the boundary between quantum system and "classical" measurement apparatus. But where is this boundary? Bohr was deliberately vague.
Von Neumann's chain (1932): the apparatus is itself quantum and gets entangled with the system. So is the human observer. Eventually consciousness collapses it. (Wigner endorsed this in 1961, then walked it back.)
Decoherence (Zurek, 1980s onward): interaction with the environment rapidly suppresses superposition into apparent classical behaviour. This explains why we don't see macroscopic superpositions. It does not explain why a definite outcome occurs.
The measurement problem remains open. Most working physicists ignore it ("shut up and calculate," coined by Mermin and often misattributed to Feynman); a substantial minority think it points at something deep.
The dominant interpretation through most of the 20th century. Associated with Bohr and Heisenberg though never crisply written down by either.
Key claims: (1) The wave function describes our knowledge or our ability to predict, not reality. (2) Quantum systems don't have definite values for unmeasured observables. (3) Classical apparatus is required to define measurement outcomes. (4) Complementarity — wave and particle pictures are complementary aspects, not simultaneously applicable.
The interpretation has been criticised as anti-realist, vague, and paradoxical (the "shifty split" between quantum system and classical apparatus). It remains the implicit default in most working physics. It has lost ground in foundations-of-physics circles to many-worlds and to QBism.
Hugh Everett III's 1957 PhD thesis proposed that the wave function never collapses. Instead, the apparatus and observer become entangled with the system; the universe branches into all possible outcomes, each branch occupied by a copy of the observer who saw that outcome.
The advantage: no measurement problem, no need for a classical/quantum split. Only Schrödinger evolution, applied to everything.
The cost: a vast multiplicity of "worlds" we never see. And the Born rule — why we observe outcomes with probabilities matching |ψ|² — has to be derived from somewhere. Several attempts (Deutsch, Wallace) derive it from decision theory; the case is contested.
Many worlds was marginal until the 1990s when Deutsch, Wallace, and others reformulated it. It now has substantial support among physicists and cosmologists. Sean Carroll's Something Deeply Hidden (2019) is the readable popularisation.
The deterministic alternative. Louis de Broglie proposed it in 1927; David Bohm revived and developed it in 1952. Particles have definite positions at all times, guided by the wave function (the "pilot wave").
Predictions match standard quantum mechanics exactly. The double-slit interference happens because the pilot wave goes through both slits, guiding the particle through one.
The cost: explicit non-locality (the wave function is defined on configuration space, not 3D space; the guidance equation gives instantaneous action across distances). Special relativity is hard to integrate.
Bohmian mechanics is a minority view among physicists but has serious advocates (Sheldon Goldstein, Tim Maudlin, the Bohmian school). It remains a live alternative.
QBism (Quantum Bayesianism) treats the wave function as a tool an agent uses to update their beliefs about measurement outcomes. The wave function is not "out there" in nature; it is in the agent's head.
Developed by Carlton Caves, Christopher Fuchs, and Rüdiger Schack from the late 1990s. The view is radically anti-realist about the wave function but realist about measurement outcomes, which are personal, agent-relative facts.
The advantages: no measurement problem (collapse is just Bayesian update); no many-worlds proliferation; clean handling of probability. The disadvantages: difficulty saying what reality is when no one is observing it.
QBism has growing influence in foundations-of-physics circles though remains a minority view among working physicists.
Why do macroscopic objects behave classically? Decoherence theory (Zurek, Zeh, Joos, from the 1970s onward) provides a partial answer.
A quantum system in superposition rapidly becomes entangled with its environment (air molecules, photons, vibrations). The system + environment together remain in a coherent superposition, but if you only have access to the system, you can no longer see interference effects — they are diffused into the environment.
Decoherence times for macroscopic objects are extraordinarily short — for a dust grain in air, ~10⁻³¹ seconds. For a well-isolated electron in vacuum they can be far longer. This explains why classicality emerges at the macroscopic scale even though the underlying physics is quantum.
What decoherence does NOT do: solve the measurement problem. It explains how superposition becomes practically invisible, not how a definite outcome is selected.
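A density-matrix sketch of that limitation. Environmental coupling is modelled here as bare exponential damping of the off-diagonal (coherence) terms, with an arbitrary rate; the diagonal probabilities never change, so no definite outcome is ever selected.

```python
import numpy as np

# Equal superposition (|0> + |1>)/sqrt(2) written as a density matrix
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

gamma = 1.0e12                        # arbitrary decoherence rate, 1/s
for t in (0.0, 1e-13, 1e-12, 1e-11):
    off = np.exp(-gamma * t)          # suppression of off-diagonal coherences
    rho_t = rho * np.array([[1, off], [off, 1]])
    print(f"t={t:.0e}s  coherence={rho_t[0, 1].real:.3f}  "
          f"P(0)={rho_t[0, 0].real:.2f}  P(1)={rho_t[1, 1].real:.2f}")
# Coherence -> 0, but P(0) = P(1) = 0.5 throughout: interference is gone,
# yet nothing has picked a definite outcome.
```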
Non-relativistic quantum mechanics describes a fixed number of particles. Quantum field theory generalises to a variable number, treating particles as excitations of underlying fields.
The major successes:
QED (quantum electrodynamics, Feynman, Schwinger, Tomonaga, 1940s). The electron's anomalous magnetic moment predicted to twelve decimal places — the most accurately tested prediction in physics.
QCD (quantum chromodynamics, the theory of the strong nuclear force). Gross, Wilczek, Politzer's asymptotic freedom (1973, Nobel 2004).
The Standard Model (1970s). The unification of QED, the weak force, and QCD. Higgs mechanism predicted 1964, Higgs boson detected at the LHC 2012, Englert-Higgs Nobel 2013.
QFT is the deepest empirically successful framework physics has. It is also the framework with the worst infinities, requiring renormalisation procedures whose interpretive status is contested.
Richard Feynman's 1982 observation: simulating quantum systems on classical computers is exponentially expensive. Quantum systems should be simulated on quantum computers.
The qubit — the quantum analogue of the bit — can be in superposition of |0⟩ and |1⟩. n qubits can encode 2ⁿ amplitudes. Some algorithms (Shor's factoring, Grover's search) achieve exponential or quadratic speedups over classical.
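A state-vector sketch of why n qubits need 2ⁿ amplitudes, assuming nothing beyond NumPy: build a 3-qubit register, apply a Hadamard to each qubit via Kronecker products, and sample outcomes with the Born rule.

```python
import numpy as np

n = 3                                    # qubits; state vector has 2**n amplitudes
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                           # |000>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

# Apply H to every qubit: the full operator is the Kronecker product H (x) H (x) H
U = np.array([[1.0]])
for _ in range(n):
    U = np.kron(U, H)
state = U @ state                        # uniform superposition over 8 basis states

probs = np.abs(state) ** 2               # Born rule
samples = np.random.default_rng(2).choice(2**n, size=10, p=probs)
print([format(int(s), f"0{n}b") for s in samples])   # random 3-bit strings
```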
The contemporary state (2026): quantum computers exist with ~1,000 physical qubits but only ~50 logical qubits after error correction. Google's Willow chip (2024) was the first to demonstrate below-threshold error correction, with logical error rates that fall as the code grows. IBM, IonQ, Quantinuum, Atom Computing, and QuEra all have working machines.
The expected timeline to cryptographically-relevant quantum computing (breaking RSA-2048) is uncertain but estimates have shortened. Most cryptographers expect post-quantum migration to be necessary in the 2030s.
Quantum sensing. Atomic clocks (the SI second is defined by Cs-133 transitions). Optical lattice clocks at 10⁻¹⁹ precision. SQUID magnetometers. NV-centre diamond sensors. Atom interferometers for inertial sensing.
Quantum communication. Quantum key distribution (BB84 protocol, 1984). Commercial QKD networks (China's 4,600 km Beijing-Shanghai trunk; the Micius satellite). Toward a quantum internet. (A toy sketch of BB84 key sifting appears below, after these applications.)
Quantum simulation. Cold-atom systems used to simulate condensed-matter Hamiltonians too hard for classical computers. The Bose-Hubbard model, the Hubbard model. Important for materials discovery.
Quantum computing. The most-discussed but currently the most-distant practical application.
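The toy BB84 sifting sketch mentioned under quantum communication: Alice sends random bits in random bases, Bob measures in random bases, and the two keep only the positions where their bases agree. No eavesdropper or channel noise is modelled; this illustrates the sifting step of the protocol, not any deployed system.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 32

alice_bits  = rng.integers(0, 2, n)      # raw key material
alice_bases = rng.integers(0, 2, n)      # 0 = rectilinear, 1 = diagonal
bob_bases   = rng.integers(0, 2, n)

# Bob's measurement: matching basis recovers Alice's bit exactly;
# a mismatched basis gives a 50/50 random result.
same_basis = alice_bases == bob_bases
bob_bits = np.where(same_basis, alice_bits, rng.integers(0, 2, n))

# Sifting: publicly compare bases (never bits), keep the matching positions.
key_alice = alice_bits[same_basis]
key_bob = bob_bits[same_basis]
print("sifted key length:", key_alice.size,
      "| keys agree:", np.array_equal(key_alice, key_bob))
```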
Electrons (and other fermions) have an intrinsic angular momentum called spin, with magnitude ℏ/2. There is no classical analogue — it isn't actual rotation.
Spin is the source of:
The Pauli exclusion principle (no two fermions in identical quantum states) — explains atomic shell structure, the periodic table, why matter doesn't collapse.
Magnetism — ferromagnetism arises from aligned electron spins.
NMR/MRI — manipulating proton spins with magnetic fields. The Bloch equations, MRI imaging, NMR spectroscopy in chemistry.
Spintronics — using spin rather than charge to carry information. Disk drives use giant magnetoresistance (Fert and Grünberg, Nobel 2007).
Stern-Gerlach (1922) was the first experimental demonstration that spin is quantised. Pauli proposed the formal description in 1925. Dirac's 1928 equation derived spin from relativistic invariance.
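A quick check of the spin-1/2 algebra in natural units (ℏ = 1): the operators are S = (ℏ/2)σ with eigenvalues ±ℏ/2, and they fail to commute, [Sx, Sy] = iℏSz.

```python
import numpy as np

hbar = 1.0
sx = hbar / 2 * np.array([[0, 1], [1, 0]], dtype=complex)
sy = hbar / 2 * np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = hbar / 2 * np.array([[1, 0], [0, -1]], dtype=complex)

print(np.linalg.eigvalsh(sz))                    # [-0.5, +0.5] = +/- hbar/2
commutator = sx @ sy - sy @ sx
print(np.allclose(commutator, 1j * hbar * sz))   # True: [Sx, Sy] = i*hbar*Sz
```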
Classically, a particle without enough energy to overcome a barrier cannot pass it. Quantum mechanically, the wave function extends past the barrier with exponentially decreasing amplitude — there is a finite probability of finding the particle on the other side.
Tunneling matters:
Alpha decay (Gamow, 1928). Alpha particles tunnel out of nuclei.
Stellar fusion. Protons in the Sun's core tunnel through their mutual Coulomb repulsion to fuse. Without tunneling, the Sun wouldn't shine — its core temperature is too low for classical fusion.
Scanning tunneling microscope (Binnig and Rohrer, 1981, Nobel 1986). Imaging at atomic resolution by tunneling current between tip and sample.
Flash memory, tunnel diodes, Josephson junctions (the basis of superconducting qubits).
Tunneling rates are exponentially sensitive to barrier height and width — small changes in geometry produce massive changes in current. This makes tunnel-based devices both useful and finicky.
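A sketch of that sensitivity using the textbook transmission formula for a rectangular barrier with E < V₀. The 0.5 eV electron and 1 eV barrier are illustrative numbers, not a model of any particular device.

```python
import numpy as np

hbar = 1.055e-34    # J*s
m_e = 9.109e-31     # electron mass, kg
eV = 1.602e-19      # J per eV

def transmission(E_eV, V0_eV, width_nm):
    """Exact transmission through a rectangular barrier, for E < V0."""
    E, V0, L = E_eV * eV, V0_eV * eV, width_nm * 1e-9
    kappa = np.sqrt(2 * m_e * (V0 - E)) / hbar
    return 1.0 / (1.0 + (V0**2 * np.sinh(kappa * L)**2) / (4 * E * (V0 - E)))

for width in (0.5, 1.0, 2.0):   # barrier widths in nm
    print(f"{width} nm barrier: T = {transmission(0.5, 1.0, width):.2e}")
# Doubling the width drops the transmission by roughly three orders of
# magnitude: the sensitivity that makes the scanning tunneling microscope work.
```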
Quantum mechanics is the foundation of chemistry. Atomic orbitals (s, p, d, f) are solutions of the Schrödinger equation for the hydrogen atom. The periodic table's structure follows from Pauli exclusion and orbital filling.
Chemical bonding emerges from electron sharing between atoms. The simplest case (H₂⁺) is exactly solvable; everything else requires approximation. The Hartree-Fock method, density functional theory (DFT, the basis of most contemporary computational chemistry).
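In miniature, this is what those methods do: discretise a Hamiltonian and diagonalise it. A toy 1D example, a harmonic oscillator on a grid rather than a real molecule, shows quantised levels falling out of the eigenvalue problem (units with ℏ = m = ω = 1, so the exact levels are Eₙ = n + 1/2).

```python
import numpy as np

N = 1000
x = np.linspace(-8, 8, N)
dx = x[1] - x[0]

# Kinetic term -(1/2) d^2/dx^2 by central finite differences
main = np.full(N, 1.0 / dx**2)
off = np.full(N - 1, -0.5 / dx**2)
T = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

V = np.diag(0.5 * x**2)            # harmonic potential
H = T + V

energies = np.linalg.eigvalsh(H)[:4]
print(energies)                    # ~ [0.5, 1.5, 2.5, 3.5]: quantised levels
```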
Walter Kohn's 1964 paper on DFT (Nobel 1998) is among the most-cited papers in physics history. It's the reason computational chemistry can model molecules at all.
Quantum chemistry is now a major branch of computational science. The Materials Project (Berkeley, since 2011) catalogues DFT-computed properties of ~150,000 inorganic compounds. AI-assisted DFT (Google DeepMind's GNoME, 2023) proposed 2.2 million new candidate crystals, of which roughly 380,000 are predicted to be stable.
Quantum mechanics meets cosmology in several places:
Inflation. The cosmic microwave background's ~10⁻⁵ density perturbations are interpreted as quantum fluctuations during inflation, stretched to cosmic scales. The 2014 BICEP2 claim of primordial gravitational waves was withdrawn after the signal was traced to galactic dust, but the broader inflation framework holds.
Hawking radiation. Stephen Hawking's 1974 calculation: black holes radiate due to quantum field effects near the horizon. Black holes have temperature, entropy, evaporate over time. Hawking didn't get the Nobel because the radiation hasn't been directly observed; the result is theoretically nearly certain.
The black-hole information paradox. Information falling into a black hole appears lost when the hole evaporates. This contradicts unitary quantum evolution. The 2019-2020 "island formula" results (Penington, Almheiri et al.) appear to resolve it within string theory.
Quantum gravity. The unsolved problem. String theory, loop quantum gravity, asymptotic safety, causal-set theory, holographic approaches. None confirmed empirically.
1. The measurement problem. What collapses the wave function — if anything? Decoherence, many worlds, GRW, or something else?
2. Born rule derivation. Why probabilities = |ψ|²? Is this a postulate or derivable from deeper principles?
3. Quantum gravity. How does QM combine with general relativity? Two of the most successful theories ever, and they don't talk.
4. The vacuum-energy problem. QFT predicts a vacuum energy 120 orders of magnitude larger than the observed cosmological constant. The "worst prediction in physics."
5. Why these laws? The fundamental constants (fine structure, Higgs mass, etc.) take values that allow chemistry, stars, life. Anthropic principle? Multiverse selection? Something deeper?
The 21st-century physicist's job is to make progress on at least some of these. Most progress so far has been incremental.
↑ Jim Al-Khalili explains the double-slit experiment
Watch · Quantum mechanics in simple words
Watch · Bell's theorem · the quantum Venn diagram paradox
Three paths depending on background:
For non-physicists. Sean Carroll's Something Deeply Hidden (2019) and the Mindscape podcast. David Albert's Quantum Mechanics and Experience (1992) is the most-careful philosophy-friendly introduction. Carlo Rovelli's Helgoland (2020) is short and lyrical.
For physics undergraduates. The Feynman Lectures Vol. III. Griffiths's Introduction to Quantum Mechanics (the standard course text). Then Sakurai's Modern Quantum Mechanics for the more abstract second course.
For graduates and rigorous learners. Cohen-Tannoudji's two-volume Quantum Mechanics. Weinberg's Lectures on Quantum Mechanics. For QFT: Peskin & Schroeder, then Weinberg's three-volume Quantum Theory of Fields.
For interpretation. Bell's Speakable and Unspeakable in Quantum Mechanics. Maudlin's Quantum Non-Locality and Relativity. Wallace's The Emergent Multiverse.
Three reasons.
It works. The most-tested theory in physics, by orders of magnitude. Thousands of experiments at very different scales, all consistent. Whatever the foundations look like, the predictive structure is right.
It runs your world. Transistors, lasers, LEDs, MRI, GPS, NMR spectroscopy, photovoltaics, atomic clocks. The 20th-century material economy is largely a quantum economy. The 21st-century quantum technology stack (sensing, simulation, computing, communication) is a continuation.
It is genuinely strange. A century in, working physicists still disagree on what it means. Most disciplines reach interpretive consensus eventually; quantum mechanics has not. That is itself instructive — it tells us that the gap between empirical predictive structure and underlying ontology can persist indefinitely.
The interpretive question matters because it bears on what we think reality is. The empirical question matters because reality, whatever it is, is built from quantum mechanics.
Four directions worth watching.
Quantum-computing scaling. The current challenge is not "build a single qubit" but "scale to thousands of logical qubits with consistent error rates." Google, IBM, IonQ, and the Chinese consortia all have credible roadmaps. The first cryptographically-relevant quantum computer is plausibly 2030–2035; everything depends on error-correction overhead.
Quantum simulation. Probably the most likely near-term economic application. Drug discovery, materials, catalysis. The cold-atom platforms (QuEra, Pasqal, Atom Computing) are leading.
Quantum gravity progress. The 2019-2020 "island formula" suggests genuine progress on the black-hole information paradox. AdS/CFT remains the only fully consistent theory of quantum gravity, but it doesn't describe our universe.
Foundations. The interpretive debate continues. The 2022 Bell Nobel and Aspect's recent work suggest that loophole-free experimental tests will continue to constrain the foundational space. Many-worlds and QBism are gaining; pilot wave is holding.
Quantum Mechanics — Volume III, Deck 12 of The Deck Catalog. Set in Tiempos Text with monospace metadata. Dark mode #0e0e1a; cyan and violet accents.
Twenty-eight leaves on the most empirically successful and most philosophically contested theory in physics. The math is clear. The ontology is not.