Entropy is one of the most important concepts in physics. It governs heat engines, chemical reactions, black holes, and even the arrow of time itself. Yet despite its importance, the way entropy is usually explained has always felt strangely unsatisfying. We are told that entropy measures “disorder,” or sometimes that it measures how much information we lack about a system.

But these explanations raise a deeper question: what is entropy physically?

Disorder is not something we can weigh or measure directly. And ignorance — what we happen to know or not know — cannot be a fundamental property of the universe. Physics should describe what is happening in the world itself, not what observers happen to know about it.

This new paper proposes a different answer.

The central idea is simple but powerful: entropy is the physical cost of making a record.


The Universe as a Ledger of Distinctions

Every time something happens in the universe that cannot be undone — a photon is detected, a molecule collides, a measurement outcome is recorded — a distinction has been permanently written into the world.

In this framework, these irreversible distinctions are called committed bits.

A committed bit is not just any physical event. It must satisfy four conditions:

• it must be distinguishable from alternatives
• it must remain stable long enough to influence the future
• it must be impossible to undo without further cost
• and it must be capable of affecting downstream events

When those conditions are satisfied, a new entry has effectively been written into the universe’s causal ledger.

Entropy, in this view, is simply the running total of those entries.

Every committed bit adds a fixed cost — the minimum thermodynamic cost identified by Landauer’s principle — equal to k_B ln 2 per bit. The total entropy of a system is therefore just the accumulated cost of its irreversible history.

Instead of thinking of entropy as disorder, we can think of it as the accounting system of physical reality.
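The bookkeeping described above can be sketched in a few lines of Python. This is an illustrative toy, not code from the paper: it simply tallies the Landauer cost k_B ln 2 for each committed bit.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (CODATA exact value)

def ledger_entropy(committed_bits: int) -> float:
    """Entropy as a running total of Landauer costs: S = N * k_B * ln 2."""
    return committed_bits * K_B * math.log(2)

# Committing a single irreversible bit costs k_B ln 2:
one_bit = ledger_entropy(1)

# A gigabyte (8e9 bits) written irreversibly into the ledger:
gigabyte = ledger_entropy(8 * 10**9)
```

Even a gigabyte of committed records carries an entropy cost of only about 10⁻¹³ J/K, which is why this accounting is invisible in everyday thermodynamics.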


A New Way to Understand the Arrow of Time

One of the deepest puzzles in physics is why time seems to flow in only one direction.

The standard explanation says that the universe started in a very special low-entropy state and has been drifting toward higher entropy ever since. But this explanation has always felt incomplete. It assumes that time already exists and then explains entropy increasing within it.

The approach taken in this paper turns the logic around.

In the BCB–TPB framework, the ordering of committed records is the fundamental structure. The sequence of irreversible distinctions defines what “after” means.

Time is not assumed as a background parameter. Instead, time emerges from the accumulation of committed events.

If a system generates many committed distinctions, its ledger grows rapidly and it experiences dense sequences of clock ticks. If a system generates almost none — such as a system near equilibrium — its ledger barely grows and its temporal progression effectively slows.

In this way, the familiar flow of time becomes a derived quantity rather than a fundamental one.
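A deliberately schematic toy model (my own illustration, not the paper’s formalism) makes the contrast concrete: if elapsed "clock ticks" are identified with cumulative committed distinctions, a far-from-equilibrium system ages rapidly while a near-equilibrium one barely ages at all.

```python
def elapsed_ticks(commit_rate: float, duration: float) -> float:
    """Toy model: experienced 'ticks' = committed bits accumulated.

    commit_rate -- committed distinctions per unit of an external
                   bookkeeping parameter (a hypothetical stand-in,
                   since the framework treats time as derived)
    duration    -- span of that bookkeeping parameter
    """
    return commit_rate * duration

# A system generating many irreversible records:
busy = elapsed_ticks(commit_rate=1e6, duration=1.0)

# A system near equilibrium, committing almost nothing:
quiet = elapsed_ticks(commit_rate=1e-3, duration=1.0)
```

In this caricature, `busy` experiences a billion times more "ticks" than `quiet` over the same bookkeeping interval, mirroring the claim that temporal progression slows as ledger growth stalls.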


Recovering All of Thermodynamics

A new definition of entropy would be meaningless if it failed to reproduce the established results of thermodynamics. One of the most important achievements of this paper is that all of the standard formulas emerge automatically from the committed-bit definition.

From this single definition one can derive:

• Boltzmann entropy, S = k_B ln Ω
• Shannon information entropy
• Landauer’s principle for information erasure
• the second law of thermodynamics

These results are not inserted as assumptions. They arise as direct consequences of the physical cost of record formation.

In other words, the framework does not replace thermodynamics — it explains why thermodynamics works.
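The consistency between the first two results is easy to verify numerically. The sketch below (a standard textbook check, not code from the paper) shows that for a uniform ensemble of Ω microstates, Boltzmann’s S = k_B ln Ω equals the Shannon entropy in bits multiplied by the per-bit cost k_B ln 2.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(omega: int) -> float:
    """S = k_B ln Ω for Ω equally likely microstates."""
    return K_B * math.log(omega)

def shannon_bits(probs) -> float:
    """Shannon entropy H = -Σ p log2(p), measured in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

omega = 1024
uniform = [1 / omega] * omega

s_boltzmann = boltzmann_entropy(omega)          # k_B ln 1024
s_from_shannon = shannon_bits(uniform) * K_B * math.log(2)  # 10 bits * k_B ln 2
```

The two values agree exactly, which is the sense in which thermodynamic entropy is "information priced at k_B ln 2 per bit."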


Finite Distinguishability: A New Physical Effect

The paper also incorporates an idea called finite distinguishability.

Standard statistical mechanics assumes that all mathematically distinct microstates of a system are physically distinguishable in principle. But if making distinctions requires committing records, and record capacity is finite, then this assumption cannot always hold.

When the available record capacity of a region begins to saturate — for example near a black hole horizon or at Planck-scale densities — not all theoretical microstates remain physically resolvable.

The result is a suppression of the effective number of distinguishable states:

Ω_FD = Ω^(1 − α)

where α measures how much of the available commitment capacity has been consumed.

Under ordinary conditions α is extremely small, so standard statistical mechanics remains an excellent approximation. But near extreme physical limits the correction becomes significant and may produce measurable deviations.
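The suppression formula is simple enough to compute directly. The sketch below (an illustration of the stated relation, with modest numbers so the arithmetic stays representable) shows that the finite-distinguishability entropy is just the standard Boltzmann entropy scaled by (1 − α).

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def effective_microstates(omega: float, alpha: float) -> float:
    """Ω_FD = Ω^(1 - α): fewer resolvable states as capacity saturates."""
    return omega ** (1 - alpha)

def fd_entropy(omega: float, alpha: float) -> float:
    """S_FD = k_B ln Ω_FD = (1 - α) * k_B ln Ω."""
    return K_B * math.log(effective_microstates(omega, alpha))

# alpha ~ 0: ordinary conditions, standard statistical mechanics recovered.
s_ordinary = fd_entropy(1000, alpha=1e-9)

# alpha = 0.5: half the commitment capacity consumed, entropy halved.
s_saturated = fd_entropy(1000, alpha=0.5)
```

Because S_FD = (1 − α) k_B ln Ω, the correction is a uniform rescaling: invisible when α is tiny, but a large suppression near saturation.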


Why Entropy Appears Everywhere in Physics

Entropy appears across an astonishing range of physical domains:

• thermodynamics
• information theory
• chemical reactions
• quantum measurement
• black hole physics
• cosmology

For decades this ubiquity has been mysterious. Why should the same quantity govern such different phenomena?

The committed-bit framework offers a simple answer: all of these processes involve the formation of irreversible records.

A photon detection, a chemical reaction, a memory write in a computer, and the growth of a black hole horizon are all examples of the same fundamental act: a distinction being written permanently into the causal structure of the universe.

Entropy is the cost of writing that distinction.


From Entropy to Geometry

One of the most intriguing aspects of the framework is that it links thermodynamics to spacetime geometry.

In the companion VERSF spacetime paper, the density of committed records determines the spacetime volume element through the Jeffreys distinguishability density of the record ensemble.

In simple terms, regions where the universe is writing records more densely possess a larger local spacetime measure.

When that density varies from place to place, the geometry of spacetime bends.

This means gravity can be interpreted as the geometric response of spacetime to variations in causal record formation.

Entropy and geometry, in this picture, are two different ways of describing the same underlying physical process.


Why This Paper Is Central to the VERSF Programme

Within the broader VERSF framework, this paper plays a special role.

Many parts of the programme explore how spacetime, gravity, and cosmological structure might emerge from deeper informational principles. But those developments require a clear definition of the physical quantity that links information to physics.

This paper provides that definition.

By identifying entropy with the irreversible cost of record formation, it creates a bridge connecting thermodynamics, information theory, temporal emergence, and spacetime geometry.

In effect, it supplies the thermodynamic backbone of the entire framework.


A Different Way to Think About Entropy

The traditional description of entropy as disorder has always been somewhat metaphorical. The committed-bit framework replaces that metaphor with something concrete.

Entropy is not disorder.

Entropy is the cost of writing history.

Every irreversible event adds another entry to the universe’s ledger, and the growth of that ledger is what we experience as the arrow of time.

