For more than a century, physics textbooks have explained entropy as a counting problem. Count the “microstates,” take the logarithm, multiply by Boltzmann’s constant — and you get entropy. But this picture has always left a strange gap: why should counting possibilities tell you anything about how the world actually moves?
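
For readers who want the symbols, that textbook recipe is Boltzmann's formula,

$$ S = k_B \ln W, $$

where W counts the microstates compatible with what we can observe macroscopically.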

The Ticks-Per-Bit (TPB) framework fills that gap by flipping the logic on its head. Instead of treating entropy as a static inventory of microstates, TPB treats it as a dynamic efficiency measure:

How many tiny, irreducible “ticks” does it take for a system to make one bit of noticeable change?

A gas full of molecules colliding wildly has low TPB — it generates measurable change quickly, so it has high entropy.
A crystal, vibrating but barely shifting, has high TPB — it takes many microscopic events to change anything, so it has low entropy.

In other words:
Entropy measures how good a system is at turning microscopic motion into macroscopic difference.
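
Schematically (this is a reading of the gas and crystal examples above, not an equation quoted from the paper), the claim is an inverse relationship between entropy and ticks-per-bit:

$$ S \;\sim\; \frac{\text{bits of macroscopic change}}{\text{microscopic ticks}} \;=\; \frac{1}{\text{TPB}}. $$

The colliding gas has a small TPB and a large entropy; the vibrating crystal has a large TPB and a small one.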

This might sound abstract, but it suddenly makes a lot of familiar things intuitive:

  • Hot objects have higher entropy because their microscopic motion is more dynamically “productive.”
  • Information theory fits perfectly because distinguishing a symbol is a physical process with a real tick-cost.
  • Even black holes make sense: near the horizon, ticks pile up so dramatically that the “distinguishability budget” concentrates on the surface — the origin of the famous area law.
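
For reference, the area law in that last bullet is the standard Bekenstein–Hawking result (textbook physics, not something new to TPB):

$$ S_{\text{BH}} = \frac{k_B\, A}{4\, \ell_P^{2}}, \qquad \ell_P^{2} = \frac{\hbar G}{c^{3}}, $$

that is, a black hole's entropy grows with its horizon area A rather than with the volume it encloses.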

But the most striking feature of TPB is that entropy becomes testable in a new way.
The paper predicts that, in supercooled liquids and glass-formers, viscosity should vary linearly with configurational entropy, rather than following the curved relationship older theories predict. That's a clear, measurable signature experimentalists can check.
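
How might someone check that? Below is a minimal sketch, not taken from the paper: it assumes you have measured configurational entropy and log-viscosity for a single glass-former, fits both a straight line and a curved (reciprocal, Adam-Gibbs-style) model to the same points, and compares the residuals. The axis choice, function names, and data are hypothetical stand-ins.

```python
# Illustrative sketch only: compare a straight-line fit against a curved
# (reciprocal) fit for log-viscosity vs. configurational entropy.
# The axis choice and the numbers below are hypothetical, not from the paper.
import numpy as np
from scipy.optimize import curve_fit

def linear(x, a, b):
    # Straight-line model (one reading of the paper's "straight line" prediction).
    return a + b * x

def reciprocal(x, a, b):
    # Curved model (y = a + b/x), a schematic stand-in for older,
    # Adam-Gibbs-like predictions.
    return a + b / x

def compare_fits(s_conf, log_eta):
    """Fit both models and return each one's residual sum of squares."""
    results = {}
    for name, model in [("linear", linear), ("reciprocal", reciprocal)]:
        params, _ = curve_fit(model, s_conf, log_eta)
        residuals = log_eta - model(s_conf, *params)
        results[name] = float(np.sum(residuals ** 2))
    return results

# Hypothetical data: replace with measured configurational entropy and
# log10(viscosity) for a real supercooled liquid.
s_conf = np.array([0.8, 1.0, 1.3, 1.7, 2.2, 2.8])
log_eta = np.array([12.1, 10.4, 8.0, 5.1, 1.4, -3.0])
print(compare_fits(s_conf, log_eta))
```

Whichever model leaves the smaller residual describes the data better; the TPB prediction is that the straight line should win.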

If this prediction holds — and early evidence hints that it might — then TPB provides the first mechanistic, dynamical foundation for entropy that unifies thermodynamics, information theory, quantum mechanics, and the strange world of glassy materials.

In short:
Maybe entropy was never really about counting. Maybe it was always about motion.
