Quantum computing has a reputation for being strange not just because of what it can do, but because of how many special rules it seems to require. We’re told that quantum systems evolve smoothly according to one kind of mathematics — linear algebra, matrices, rotations — and then, suddenly, when we “measure” the system, a completely different rule takes over. Probabilities are computed using a special formula. The state “collapses.” The mathematics changes character.
This split has always been treated as unavoidable: unitary evolution on one side, measurement on the other. But what if that division isn’t fundamental at all?
The key result of this work is that quantum computing uses more rules than it actually needs — and that several of those rules exist only because measurement has been treated as a mathematical axiom instead of a physical process.
One kind of math, pretending to be two
In standard quantum computing, we are taught to think in two modes:
- Before measurement: quantum states evolve via unitary operations (gates), interference, and amplitudes.
- At measurement: a special probability rule is applied, the state “collapses,” and a different update rule takes over.
These are presented as fundamentally different kinds of mathematics. But if you step back and look at what the unitary part is really doing, a simpler picture emerges.
Before measurement, all quantum operations do the same thing:
they reshape how likely each possible outcome will be if a measurement eventually occurs. That’s it. Gates don’t create results. They don’t create facts. They only redistribute weight among possible futures.
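To make that concrete, here is a minimal numpy sketch (framework-agnostic, just linear algebra): applying a Hadamard gate to a qubit produces no outcome at all; it only moves probability weight between the two possible outcomes, and the total weight stays fixed.

```python
import numpy as np

# A gate is a unitary matrix: it rotates the state vector, redistributing
# amplitude (and hence probability weight) among the possible outcomes.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)  # Hadamard gate

state = np.array([1.0, 0.0])          # qubit prepared in |0>
print(np.abs(state) ** 2)             # [1. 0.]   -> all weight on outcome 0

state = H @ state
print(np.abs(state) ** 2)             # [0.5 0.5] -> weight redistributed
print((np.abs(state) ** 2).sum())     # 1.0       -> total weight conserved
```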
Once you see that, the need for a separate “measurement mathematics” starts to look suspicious.
Measurement isn’t special math — it’s a physical event
The central shift in perspective is to treat measurement not as a new rule, but as a physical process.
A real detector isn’t an abstract projection operator. It’s a device with a small number of possible outcomes — clicks, flashes, bits — each corresponding to a physical state the detector can irreversibly fall into. Those outcomes compete to happen. Whichever one happens first becomes the recorded fact.
When measurement is understood this way, something striking happens:
the probabilities of outcomes no longer need to be postulated. They are simply the result of competition between physical processes.
And when competing processes race to occur, the probability that one wins is just its rate divided by the total rate. No extra rule required.
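This is the textbook behavior of racing Poisson processes: each outcome fires after an exponentially distributed waiting time, and the earliest firing wins. A short simulation confirms it (the rates below are arbitrary, purely illustrative numbers):

```python
import numpy as np

rng = np.random.default_rng(0)

# Each possible detector outcome fires as a Poisson process with its own
# rate; the waiting time of a rate-r process is exponential with mean 1/r.
rates = np.array([1.0, 2.0, 5.0])  # arbitrary illustrative rates
trials = 100_000

# Per trial, draw one waiting time per outcome; the earliest one "happens".
waits = rng.exponential(1.0 / rates, size=(trials, len(rates)))
winners = waits.argmin(axis=1)

print(np.bincount(winners) / trials)  # empirical win frequencies
print(rates / rates.sum())            # predicted: [0.125 0.25 0.625]
```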
This single physical mechanism replaces:
- the Born probability rule,
- the collapse postulate,
- and much of the special bookkeeping used for mid-circuit measurements.
Several rules collapse into one.
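Connecting the race to the Born rule then takes a single assumption, the one this picture supplies: each outcome's firing rate scales with the squared amplitude the state assigns to it. Because amplitudes are normalized, the race probability r_i / (r_1 + r_2 + ...) is then exactly |c_i|^2. A sketch with an arbitrary illustrative state:

```python
import numpy as np

rng = np.random.default_rng(1)

# An arbitrary normalized three-level state (illustrative amplitudes).
psi = np.array([0.6, 0.64j, 0.48])

# Assumption of this picture: firing rates scale with |c_i|^2.
rates = np.abs(psi) ** 2

trials = 100_000
waits = rng.exponential(1.0 / rates, size=(trials, len(rates)))
winners = waits.argmin(axis=1)

print(np.bincount(winners) / trials)  # race statistics
print(rates)                          # Born probabilities |c_i|^2
```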
Why this simplifies quantum computing
This doesn’t make the linear algebra go away — quantum computing still uses vectors and matrices. But it simplifies the structure of the theory.
- There is no longer a need to switch mathematical frameworks when measurement happens.
- Mid-circuit measurement becomes straightforward: a fact is created, and future operations are conditioned on it (see the sketch after this list).
- Generalized measurements (POVMs) stop being abstract constructions and instead reflect how detectors are actually built.
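Here is the mid-circuit sketch promised above, assuming only that the winning outcome is sampled with the race probabilities (the usual |c_i|^2 weights): the measurement produces a classical fact, the state is updated to match that fact, and a later gate is conditioned on it.

```python
import numpy as np

rng = np.random.default_rng(2)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])

state = H @ np.array([1.0, 0.0])  # superposition before the mid-circuit measurement

# The measurement is a physical event: one outcome wins the race and
# becomes a recorded fact (sampled here with the race probabilities).
probs = np.abs(state) ** 2
outcome = rng.choice(2, p=probs)

# The detector has irreversibly settled, so the state is updated to match
# the recorded fact...
state = np.zeros(2)
state[outcome] = 1.0

# ...and later operations are conditioned on that classical bit
# (feed-forward: apply X only if the detector reported 1, resetting to |0>).
if outcome == 1:
    state = X @ state

print(outcome, np.abs(state) ** 2)
```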
In other words, quantum computing doesn’t require two kinds of mathematics — it requires one kind of mathematics plus one physical event.
A simpler story, not a weaker one
Importantly, this simplification doesn’t weaken quantum computing or change its predictions. All standard results are recovered. Entanglement works the same way. Quantum algorithms behave the same way. Bell inequalities are still violated.
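As one concrete check that the predictions really do survive, the CHSH correlations of a singlet pair, computed with the same ordinary linear algebra, still reach 2√2 > 2. The settings below are the standard optimal angles.

```python
import numpy as np

# Singlet state of two qubits: (|01> - |10>) / sqrt(2)
singlet = np.array([0, 1, -1, 0]) / np.sqrt(2)

def spin(theta):
    """Spin observable along angle theta in the x-z plane."""
    return np.array([[np.cos(theta), np.sin(theta)],
                     [np.sin(theta), -np.cos(theta)]])

def E(a, b):
    """Correlation <A(a) tensor B(b)> in the singlet state."""
    return singlet @ np.kron(spin(a), spin(b)) @ singlet

# Standard CHSH settings, each pair pi/4 apart.
a, a2, b, b2 = 0.0, np.pi / 2, np.pi / 4, -np.pi / 4
S = E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)

print(abs(S), 2 * np.sqrt(2))  # |S| = 2*sqrt(2) > 2: the inequality is violated
```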
What changes is the explanation.
Instead of saying “this is just how quantum mechanics works,” we can now say:
These rules exist because real detectors have to make irreversible choices.
That’s a simpler story. And in physics, simpler stories that still explain everything usually mean we’ve understood something real.