The Problem
When physicists calculate how much energy empty space should contain, they get a number that’s wrong by a factor of 10¹²⁰. That’s not a small error — it’s a one followed by 120 zeros. The predicted energy would be so enormous that the universe would have collapsed into black holes instantly. Obviously it didn’t. This has been called the worst prediction in the history of science, and it’s been an open problem for decades.
The observed energy of empty space — what astronomers call “dark energy” — is tiny but not zero. It’s causing the universe’s expansion to slowly accelerate. Nobody has been able to explain why it has the value it does.
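The size of the mismatch can be checked with a back-of-envelope calculation (this is the standard naive estimate, not the papers' calculation): cutting quantum field modes off at the Planck scale predicts a vacuum energy density of order the Planck density, c⁷/(ℏG²), while the observed dark-energy density is roughly 6×10⁻¹⁰ J/m³. The exact exponent of the ratio depends on conventions, which is why quoted values range from 10¹²⁰ to about 10¹²³.

```python
import math

# Naive Planck-cutoff vacuum energy density vs. the observed dark-energy
# density. This reproduces the standard order-of-magnitude mismatch; it is
# not the papers' derivation.
c    = 2.998e8      # speed of light, m/s
hbar = 1.055e-34    # reduced Planck constant, J*s
G    = 6.674e-11    # Newton's constant, m^3 kg^-1 s^-2

rho_planck   = c**7 / (hbar * G**2)   # naive predicted density, J/m^3
rho_observed = 6e-10                  # approximate observed density, J/m^3

ratio = rho_planck / rho_observed
print(f"predicted: {rho_planck:.2e} J/m^3")
print(f"observed:  {rho_observed:.2e} J/m^3")
print(f"mismatch:  ~10^{math.log10(ratio):.0f}")
```

Running this gives a ratio of roughly 10¹²³, in the usually quoted 10¹²⁰-plus range.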
The Key Insight
You can’t measure distance with one point. You need two points to define any length or relationship. The papers argue this isn’t just practical — it’s fundamental. The smallest meaningful piece of geometry isn’t a point, it’s a connection between two points.
This means the smallest unit of space is twice the Planck length, not the Planck length itself, which is usually treated as the theoretical minimum scale.
What the Four Papers Do
Paper 1 (Two-Planck Principle) shows that this simple insight, combined with one physical requirement — that empty space can’t collapse into black holes — leads to a specific prediction. Space has a natural “grain size” of about 100 micrometers (roughly a hair’s width). This grain size determines how much energy empty space contains. The prediction matches observation to within 20%.
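The claimed grain size can be sanity-checked dimensionally without the papers' machinery: if the vacuum energy density is set by a single length scale L, the natural estimate is ρ ~ ℏc/L⁴, and solving for L at the observed density lands near 100 micrometers. This scaling argument is a standard dimensional estimate, not the papers' actual derivation.

```python
# Dimensional sanity check: the length scale L implied by the observed
# dark-energy density under the standard estimate rho ~ hbar*c / L^4.
c    = 2.998e8      # speed of light, m/s
hbar = 1.055e-34    # reduced Planck constant, J*s

rho_observed = 6e-10                      # observed dark-energy density, J/m^3
L = (hbar * c / rho_observed) ** 0.25     # implied length scale, m

print(f"L = {L * 1e6:.0f} micrometers")   # roughly 85 micrometers
```

This falls within a few tens of percent of the 100-micrometer grain size quoted above, consistent with the stated ~20% agreement.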
Paper 2 (Relational Geometry) answers the skeptic who says “you just picked numbers to get the right answer.” It shows that the key inputs aren’t choices — they’re forced by the geometry. The factor of two comes from what relations are. The saturation condition comes from stability. The number seven (which appears in the calculation) comes from counting how many independent conditions a triangular piece of space must satisfy.
Paper 3 (Structural Closure) proves those claims rigorously. It verifies the number seven using two completely independent mathematical methods. It proves that space must fill to saturation — it’s not an assumption but an inevitable outcome. It shows that triangular building blocks are the minimal structure that works.
Paper 4 (Microphysical Foundations) rigorously derives the microphysical inputs behind the framework's prediction of the coherence scale, the grain size introduced in Paper 1, showing that the scale emerges through dimensional transmutation rather than being inserted by hand.
What Makes This Different
Other approaches do one of three things:
- Give up and say it's random (string landscape)
- Set an upper bound but don't predict the value (anthropic arguments)
- Assume something that's really just Λ in disguise (circular reasoning)
This framework takes one measured input — how fast the universe expands at different times — and derives the energy density of empty space. No fitting to the answer. The structure is tight enough that if the microphysics gave a different grain size, the whole thing would fail. It doesn’t.
The Bottom Line
The papers claim to have solved a 120-order-of-magnitude discrepancy using geometry. The only thing you have to measure is the expansion history. Everything else — including why empty space has exactly the energy density it does — follows from the mathematics of relations.
Many important and sophisticated approaches have tried to explain the cosmological constant, including ideas based on quantum field theory, symmetry principles, vacuum-energy cancellations, holography, emergent gravity, and multiverse or anthropic reasoning. These efforts are often deep and mathematically rich, but they typically either leave the value of the cosmological constant as an adjustable parameter, treat it as an integration constant, or rely on statistical or environmental selection arguments.
To our knowledge, no current approach derives the cosmological constant from a specific, countable set of tiny geometric building rules. What distinguishes the present approach is the claim that the observed scale arises directly from such a set of underlying geometric constraints, with the value emerging through dimensional transmutation rather than being tuned to match observations.