For a long time quantum mechanics has carried a famous tension at its core. The equations say superpositions should persist—particles can exist in multiple alternatives at once—yet every real measurement gives a single definite outcome. Textbooks typically handle this by adding a projection rule (collapse) and the Born rule as axioms: useful, correct, but ultimately not explained. Why should nature ever stop evolving smoothly and reversibly and instead produce a permanent classical fact?
This paper proposes a different starting point: collapse is not a special mystery-rule—it’s what happens when a system runs out of representational capacity. The key idea is operational and simple. Physical predictions can’t be infinitely sensitive, infinitely complex, or require infinite resources to compute. In the manuscript this is formalized as the Taylor Limit, which constrains physically admissible prediction functionals to be (i) analytic (smooth), (ii) Lipschitz (stable under small changes), and (iii) effectively finite (dependent on only finitely many degrees of freedom at a given resolution). Those conditions are not exotic—they’re a compact way of saying “real physics has finite precision and finite resources.”
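One rough way to state the three conditions (the symbols F, L, and N(ε0) below are illustrative notation, not taken verbatim from the manuscript): an admissible prediction functional F on states must satisfy

\[
\begin{aligned}
&\text{(i)}\quad F \text{ is analytic on the state space,}\\
&\text{(ii)}\quad \lvert F[\psi] - F[\psi'] \rvert \;\le\; L\,\lVert \psi - \psi' \rVert \quad \text{for some finite } L,\\
&\text{(iii)}\quad \text{there is a finite } N(\varepsilon_0) \text{ such that, up to error } \varepsilon_0,\; F \text{ depends on only } N(\varepsilon_0) \text{ degrees of freedom.}
\end{aligned}
\]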
From there the argument becomes sharp. Define a state’s distinguishability load D(ψ): how many Born-resolvable alternatives the system must coherently track at resolution ε0. Define a local reversible capacity C: the maximum load the system + immediate environment can carry while keeping all predictions well-behaved under reversible evolution. The central theorem is then: when D(ψ) exceeds C, coherent phase-tracking becomes undefinable within the admissible observable class, forcing irreversible coarse-graining—i.e., fact creation. In plain language: below capacity, quantum interference remains meaningful; above capacity, phase information cannot be stably represented, so the theory is compelled to compress into robust records.
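Written schematically (the counting in D(ψ) follows the definition just given; the exact measure is the manuscript's), the theorem is a dichotomy at the capacity threshold:

\[
\begin{cases}
D(\psi) \le C: & \text{reversible, phase-coherent evolution remains admissible,}\\
D(\psi) > C: & \text{phase information is not representable in the admissible class; an irreversible, record-creating coarse-graining is forced.}
\end{cases}
\]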
The result is a clean “criticality” picture of reality. If the universe were subcritical, it would never stabilize records—no persistent facts. If it were supercritical, coherence would be crushed immediately—no interference, no entanglement. Only in a critical window can both coexist, which is exactly what we observe. And importantly, this isn’t just philosophical: it makes a concrete experimental prediction. Standard decoherence says interference visibility should degrade smoothly as environmental coupling increases. A capacity-driven model predicts something stronger: across different molecular families, data should collapse onto a universal curve when plotted against the normalized stress parameter Φτ/C (distinguishability influx × interaction time, divided by capacity). If that curve-collapse shows up in large-molecule interferometry, it would be direct empirical support for a capacity-based collapse threshold.
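As a sketch of how a curve-collapse test could be run on such data (the field names, the binning scheme, and the residual comparison below are assumptions for illustration, not the paper's analysis pipeline):

```python
import numpy as np

def collapse_test(datasets, n_bins=20):
    """Check whether visibility data from different molecular families
    overlap when plotted against the normalized stress x = Phi * tau / C.

    `datasets` maps a family name to a dict of 1-D arrays:
    'Phi' (distinguishability influx), 'tau' (interaction time),
    'C' (capacity estimate), 'V' (measured interference visibility).
    These keys are placeholders for whatever the real data provides.
    """
    # Pool every point onto the common axis x = Phi * tau / C.
    xs, vs, labels = [], [], []
    for name, d in datasets.items():
        x = d['Phi'] * d['tau'] / d['C']
        xs.append(x)
        vs.append(d['V'])
        labels.append(np.full(x.shape, name))
    x_all, v_all = np.concatenate(xs), np.concatenate(vs)
    lab_all = np.concatenate(labels)

    # Shared reference curve: binned mean visibility over all families,
    # with quantile bin edges so each bin holds comparable statistics.
    edges = np.quantile(x_all, np.linspace(0.0, 1.0, n_bins + 1))
    idx = np.clip(np.digitize(x_all, edges) - 1, 0, n_bins - 1)
    ref = np.array([v_all[idx == b].mean() for b in range(n_bins)])

    # Curve collapse predicts small, family-independent residuals around
    # the shared curve; systematic per-family offsets argue against it.
    report = {}
    for name in datasets:
        m = lab_all == name
        resid = v_all[m] - ref[idx[m]]
        report[name] = (float(resid.mean()), float(resid.std()))
    return report
```

On real interferometry data, a successful collapse would show per-family residual means consistent with zero and comparable spreads, whereas family-specific decoherence couplings would show up as systematic offsets from the shared curve.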
The big takeaway is not that we need to rewrite quantum mechanics. The unitary dynamics can remain intact. The claim is subtler: collapse is the unavoidable consequence of finite representability once complexity exceeds local capacity. The “measurement problem” becomes a question of where the reversible/irreversible boundary sits—and the paper argues that boundary is a real, physical critical surface, not an observer-dependent mystery.
Relation to Prior BCB Work.
The present criticality theorem builds directly on, but is conceptually distinct from, earlier work within the Bit Conservation and Balance (BCB) framework, where quantum measurement was modeled as a finite-time thermodynamic information-transport process. In that setting, collapse was identified with the first-passage time at which one full bit of macroscopic distinguishability irreversibly entered the environment, yielding a quantitative collapse time for thermal environments. The role of the present work is not to re-derive that mechanism, but to explain why any such mechanism must exist in a universe capable of producing stable classical facts. The BCB model provides a concrete dynamical realization—via information currents, pointer-basis selection, and temperature-dependent flux—of the general capacity principle proven here. In this sense, BCB should be understood as a constructive instantiation of the broader criticality theorem, applicable when environmental coupling is thermal and pointer-basis information export is near-optimal (C≈1). The logical direction is thus from structure to mechanism: the criticality theorem explains why collapse thresholds are unavoidable, while BCB explains how they are realized in physically relevant systems.
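Schematically, and only as a gloss on the description above (Φ(t) denotes the instantaneous distinguishability current into the environment, in the sense already introduced), the BCB collapse time t_c is a first-passage condition,

\[
\int_0^{t_c} \Phi(t)\, \mathrm{d}t \;=\; 1\ \text{bit},
\]

i.e. the earliest time by which one full bit of macroscopic distinguishability has been irreversibly exported, which is what makes the collapse time quantitative for thermal environments.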