Most physics textbooks begin by assuming that the speed of light is a universal constant and that time is a fundamental background dimension. Einstein took the invariance of the speed of light as a postulate and built special relativity from it. But this paper asks a deeper question: why must such a speed exist at all? Instead of starting with spacetime geometry, the framework begins with something more primitive — irreversible records. The central idea is simple but radical: time is not something that exists in the background; it emerges whenever a distinction is made that cannot be undone. A perfectly frictionless pendulum swinging back and forth does not create time in any operational sense. It merely oscillates. Time advances only when something irreversible happens — when information becomes permanently recorded.
From this perspective, space and time are defined operationally. Space is the minimum separation required for two independent records to remain distinguishable. Time is the minimum interval required to produce one irreversible commitment. The speed of light then emerges as the ratio between these two fundamental quanta: the smallest meaningful unit of space divided by the smallest meaningful unit of time. The paper shows that once you assume locality (information spreads step-by-step), reversibility at the microscopic level, irreversibility at the record level, and consistency under coarse-graining, a finite invariant speed is unavoidable. Even more remarkably, all observers must agree on this speed. The Lorentz transformations of special relativity arise automatically from the requirement that observers preserve the causal ordering of records. In this view, spacetime geometry is not assumed — it emerges from the logic of stable facts.
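The logic of an invariant speed as a ratio of quanta can be sketched numerically. The values of the two quanta below are hypothetical placeholders (the paper's actual scales are derived elsewhere); the point of the sketch is that once an invariant speed exists, relativistic velocity addition follows, and no composition of sub-light speeds ever exceeds it.

```python
# Illustrative sketch, not the paper's derivation: an invariant speed as the
# ratio of a minimal spatial quantum to a minimal temporal quantum.
# 'ell' and 'tau' are hypothetical placeholder values.
ell = 1.0e-6             # assumed minimal distinguishable separation (m)
tau = ell / 299_792_458  # assumed minimal commitment interval (s)
c = ell / tau            # invariant speed emerges as the ratio of the quanta

def add_velocities(u, v, c=c):
    """Relativistic velocity addition, the composition law any invariant
    speed forces: sub-c speeds compose to another sub-c speed."""
    return (u + v) / (1 + u * v / c**2)

u = 0.9 * c
v = 0.9 * c
w = add_velocities(u, v)
print(f"invariant speed c = {c:.0f} m/s")
print(f"0.9c composed with 0.9c -> {w / c:.4f} c")  # remains below c
```

Naive addition would give 1.8c; the invariant-speed composition law yields about 0.9945c instead, which is the sense in which all observers can agree on a single limiting speed.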
One of the most striking consequences of this framework is thermodynamic. If time is created only when irreversible records are made, then the entropy cost of keeping time should scale with the number of committed bits, not with every microscopic oscillation. A cesium atomic clock oscillates over nine billion times per second, but it does not create nine billion irreversible events per second. Instead, it produces a small number of stable records that encode the accumulated oscillations. The information content of “9,192,631,770 cycles occurred” is only about 33 bits. That means the thermodynamic cost of keeping one second of time should be suppressed by a factor of nearly 3 × 10^8 (roughly a billion, to order of magnitude) compared to the naive picture in which every oscillation is irreversible. This is not a philosophical claim — it is a concrete, testable prediction.
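The arithmetic behind this suppression factor is easy to check. The sketch below uses Landauer's bound (committing one bit costs at least k_B T ln 2 of dissipated energy); the room-temperature value is assumed for illustration.

```python
import math

# Entropy-cost sketch using Landauer's bound: k_B * T * ln(2) per committed bit.
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K (assumed for illustration)

cycles = 9_192_631_770              # cesium hyperfine cycles per SI second
bits_to_record = math.log2(cycles)  # information in "N cycles occurred" (~33.1 bits)

naive_cost = cycles * k_B * T * math.log(2)           # every cycle irreversible
record_cost = bits_to_record * k_B * T * math.log(2)  # only the count committed

print(f"bits needed to record one second: {bits_to_record:.1f}")
print(f"suppression factor: {cycles / bits_to_record:.2e}")
```

The suppression factor comes out near 2.8 × 10^8: recording the accumulated count, rather than each oscillation, cuts the minimum thermodynamic cost by over eight orders of magnitude.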
The framework also yields a measurable scaling law. The size of a “commitment cell” — the smallest region capable of supporting an independent stable record — depends on temperature. Specifically, it scales inversely with temperature: colder environments require larger regions to stabilize facts, while hotter environments permit faster and smaller commitments. At room temperature, this predicts commitment scales on the order of tens of micrometers and fundamental irreversible bit rates in the picosecond regime. In ultracold Bose–Einstein condensates, where the healing length can be tuned experimentally, the same logic predicts femtosecond-scale commitment cadences. These are not abstract speculations; they are experimentally accessible regimes.
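One way to reproduce the quoted orders of magnitude is shown below. The identifications are my assumption, not formulas quoted from the paper: the thermal wavelength hc/(k_B T) for the commitment-cell size, the thermal time h/(k_B T) for the commitment cadence, and, in a condensate, the light-crossing time of the healing length.

```python
# Hedged sketch: assumed identifications that reproduce the stated scales.
h = 6.62607015e-34    # Planck constant, J s
c = 299_792_458.0     # speed of light, m/s
k_B = 1.380649e-23    # Boltzmann constant, J/K

T = 300.0                    # room temperature, K
ell = h * c / (k_B * T)      # thermal wavelength: ~4.8e-5 m, tens of micrometers
tau = h / (k_B * T)          # thermal time: ~1.6e-13 s, picosecond regime

xi = 1.0e-6                  # assumed BEC healing length, m (experimentally tunable)
tau_bec = xi / c             # light-crossing time: ~3.3e-15 s, femtosecond cadence

print(f"room temperature: cell ~ {ell:.1e} m, cadence ~ {tau:.1e} s")
print(f"BEC (healing length {xi:g} m): cadence ~ {tau_bec:.1e} s")
```

With these identifications, a room-temperature environment gives cells of roughly 48 micrometers and cadences of roughly 0.16 picoseconds, and a micrometer-scale healing length gives a few-femtosecond cadence, matching the regimes described above.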
The deepest conceptual result of the paper is the separation between geometry and the arrow of time. The geometric structure of spacetime — the light cone and the invariant speed — arises from reversible dynamics. The arrow of time arises from irreversible commitments. These two structures are logically independent, yet they must agree for reality to be operationally coherent. That agreement is the non-trivial achievement of the framework. It suggests that spacetime may not be the most fundamental layer of physics. The more primitive layer may be the structure of distinguishability itself — the conditions under which facts can exist at all.
General Reader Summary
If this framework turns out to be right, it would change the way we think about the foundations of physics. For over a century, we’ve treated space and time — and especially the speed of light — as basic ingredients of reality. Einstein assumed the speed of light is the same for everyone and built relativity from that starting point. This work suggests something more radical: that the speed limit of the universe might not be a mysterious built-in feature of spacetime at all, but a consequence of something deeper — the rules that make stable facts possible. In this view, space and time are not the stage on which physics happens; they emerge from the way irreversible events create distinguishable records.
That would be a profound shift. It would mean that the structure of spacetime is not the foundation of reality, but a consequence of the logic of information and irreversibility. The light cone — the boundary between what can and cannot influence what — would arise because information can only spread locally and records can only form under certain stability conditions. Even the “flow” of time would no longer be something built into the universe from the start, but something that appears whenever distinctions are permanently made. Instead of asking “why does spacetime have this geometry?”, we would ask “what must reality be like for facts to exist at all?” If that direction of explanation is correct, it would reshape how we understand the relationship between relativity, thermodynamics, and measurement — and redefine what we consider to be truly fundamental in physics.