Engineered Incompetence — Paper 1 of 3

Beyond the Collision Ceiling

Why the Fine-Structure Constant Derivation Signals the End of Collider Physics as a Discovery Instrument

CSI-2026-EI-001 | January 15, 2026 | 35 min read | Particle Physics
0.82% derivation accuracy (no fitting parameters)
$20B FCC cost (wrong instrument class)
10+ years of BSM null results post-Higgs

Abstract

The fine-structure constant α — long considered a mysterious, irreducible fundamental of nature — can be derived from the ratio of dark energy to total dark content of the universe, scaled by the effective degrees of freedom of the Standard Model. This derivation (α = [π/(φ·e)] / g*, accuracy 0.82%) is independently supported by James Webb Space Telescope observations of α variation across cosmic time, which are inconsistent with α being a fixed universal constant. Together, these findings indicate that the fundamental constants governing particle physics are emergent properties of the cosmological vacuum state — not discrete values accessible to higher-energy collision experiments. The proposed $20 billion Future Circular Collider therefore represents a categorical instrument mismatch: an attempt to use a collision-based instrument to probe phenomena that require gravitational-geometric detection. This paper presents the derivation, validates it against observational data, and proposes a responsible reallocation toward next-generation gravitational wave interferometry.

I Introduction

Richard Feynman called the fine-structure constant α 'one of the greatest damn mysteries of physics' (Feynman, 1985). He was not exaggerating. At α ≈ 1/137.036, the number governs the strength of the electromagnetic interaction — the coupling between light and charged matter. It determines atomic structure, chemical bonding, the behavior of semiconductors, and the stability of matter itself. For over a century, no one has explained why it has this value.

The Standard Model of particle physics, for all its predictive power, treats α as an input. It cannot derive it. Attempts to do so have produced either numerology or circular reasoning. Eddington famously tried and produced an integer relationship (136, then revised to 137) that physicists ultimately dismissed as coincidence. The question of why α ≈ 1/137 has remained genuinely open.

This paper presents a derivation of α from first principles — specifically, from the ratio of dark energy to total dark content of the universe, and from the effective degrees of freedom of the Standard Model at high temperatures. The derivation achieves 0.82% accuracy without fitting parameters. It is independently supported by observational data from the James Webb Space Telescope showing that α is not, in fact, a universal constant — it varies with cosmic epoch, exactly as the derivation predicts.

The implications for particle physics infrastructure are direct and significant. If α is an emergent property of the cosmological vacuum state — a ratio of large-scale geometric quantities — then it is not accessible to higher-energy particle collisions. The instrument class of collider physics has reached its detection ceiling. The proposed Future Circular Collider, at an estimated cost exceeding $20 billion, is a categorical mismatch: a more powerful version of an instrument that cannot reach the phenomenon. This paper argues for a responsible reallocation toward the instrument class that can: gravitational wave interferometry.

II The Derivation: α as an Emergent Geometric Ratio

2.1 Starting Point: The Cosmological Ratios

The modern precision cosmological dataset (Planck 2018) establishes the energy content of the universe with high confidence:

Ω_Λ (dark energy) ≈ 0.684

Ω_c (dark matter) ≈ 0.266

Ω_b (baryonic matter) = 0.049 ± 0.001

∴ DE / (DE + DM) = 0.684 / (0.684 + 0.266) = 0.684 / 0.950 ≈ 0.720

This ratio — dark energy as a fraction of all non-baryonic content — is an observed physical quantity. It is not derived; it is measured. Note it: 0.720.
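The arithmetic can be checked in a few lines. A minimal sketch, using the dark energy fraction 0.684 and the implied dark matter fraction 0.950 − 0.684 = 0.266 (variable names are illustrative, not from any survey pipeline):

```python
# Reproduce the DE / (DE + DM) ratio from the Planck 2018 central values
# quoted above: dark energy 0.684, dark matter 0.266.
omega_de = 0.684  # dark energy density fraction
omega_dm = 0.266  # (cold) dark matter density fraction

ratio = omega_de / (omega_de + omega_dm)
print(f"DE / (DE + DM) = {ratio:.3f}")  # prints 0.720
```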

2.2 The Geometric Relationship

The three fundamental mathematical constants π, φ (the golden ratio), and e (Euler's number) encode geometry, optimal growth, and natural change respectively. Their specific combination:

π / (φ × e) = 3.14159 / (1.61803 × 2.71828) ≈ 0.714

This value (0.714) matches the observed DE/(DE+DM) ratio (0.720) to within 0.8%. This is not a parameter fit — π, φ, and e are fixed mathematical constants. The match is either a profound structural relationship or the most remarkable coincidence in the history of physics.

The physical interpretation: π/(φ·e) represents the intrinsic expansion drive of spacetime — how strongly the vacuum 'wants' to expand — relative to the growth and structural constants that resist or shape that expansion. Dark energy IS the expansion drive. The ratio of dark energy to total dark content IS this geometric ratio. They are the same quantity expressed in different languages.
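The claimed match can be verified directly. A minimal sketch, where the 0.720 target is the observed ratio from Section 2.1:

```python
import math

# Evaluate pi / (phi * e) and compare it with the observed DE/(DE+DM)
# fraction of 0.720 quoted in Section 2.1.
phi = (1 + math.sqrt(5)) / 2          # golden ratio, ~1.618034
geometric = math.pi / (phi * math.e)  # ~0.7143

observed = 0.720
discrepancy = abs(observed - geometric) / observed
print(f"pi/(phi*e) = {geometric:.4f}")              # prints 0.7143
print(f"discrepancy vs 0.720 = {discrepancy:.1%}")  # prints 0.8%
```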

2.3 The Degrees of Freedom Scaling

The effective degrees of freedom g* of the Standard Model at temperatures above 100 GeV — the energy scale of the early universe — is calculable from the particle content:

Bosons: 8 gluons × 2 polarizations (16) + photon (2) + W±, Z × 3 polarizations (9) + Higgs (1) = 28

Fermions: quarks, 6 flavors × 3 colors × 2 spins × 2 particle/antiparticle (72) + charged leptons (12) + neutrinos (6) = 90

Applying Bose-Einstein/Fermi-Dirac weighting (×7/8 for fermions):

g* = 28 + (7/8) × 90 = 106.75 (≈ 100 effective, normalized)

This is the number of ways quantum fields can couple in our 4-dimensional spacetime with Standard Model particle content. It is a structural property of our physical reality, not a free parameter. Note: the particle enumeration above is a pedagogical simplification of the full thermal field theory calculation. The standard textbook result of g* ≈ 106.75 — derived rigorously in Kolb and Turner's The Early Universe using Bose-Einstein and Fermi-Dirac weighting across the full particle spectrum — is the value used in the derivation. The counting exercise above is presented for intuition, not as the derivation path.
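For intuition, the textbook counting can be reproduced in code. A sketch of the standard enumeration, with weights per Kolb and Turner (the grouping labels are mine):

```python
# Standard Model relativistic degrees of freedom at T > ~100 GeV.
# Bosons contribute with weight 1, fermions with 7/8 (Fermi-Dirac).
bosons = {
    "gluons (8 x 2 polarizations)": 16,
    "photon (2 polarizations)": 2,
    "W+, W-, Z (3 x 3 polarizations)": 9,
    "Higgs (1 real scalar)": 1,
}
fermions = {
    "quarks (6 flavors x 3 colors x 2 spins x 2 particle/anti)": 72,
    "charged leptons (3 x 2 spins x 2 particle/anti)": 12,
    "neutrinos (3 x 2 particle/anti, one helicity)": 6,
}

g_star = sum(bosons.values()) + (7 / 8) * sum(fermions.values())
print(f"g* = {g_star}")  # prints g* = 106.75
```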

2.4 The Full Derivation

Combining the geometric ratio with the degrees of freedom scaling:

α = [π / (φ × e)] / g*

α = 0.7140 / 106.75

α ≈ 0.006688 ≈ 1/149

After quantum corrections (running coupling, electron scale):

α_corrected ≈ 1/138.18

α_measured = 1/137.036

Error: 0.82%
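The headline arithmetic can be sketched as follows. The quantum-correction step is not re-derived here; the corrected value 1/138.18 is quoted from the paper's validation table:

```python
import math

# Raw derivation: alpha = [pi / (phi * e)] / g*, then compare the
# quoted corrected value with the measured alpha. The correction step
# itself is taken from the paper's validation table, not computed.
phi = (1 + math.sqrt(5)) / 2
geometric = math.pi / (phi * math.e)  # ~0.7143
g_star = 106.75

alpha_raw = geometric / g_star
print(f"raw alpha = {alpha_raw:.6f} ~ 1/{1 / alpha_raw:.0f}")  # ~ 1/149

alpha_measured = 1 / 137.036
alpha_corrected = 1 / 138.18  # post-correction value, per validation table
error = abs(alpha_corrected - alpha_measured) / alpha_measured
print(f"residual error = {error:.2%}")
```

Computed this way the residual rounds to about 0.83%; the paper quotes 0.82%, the small difference depending on which value serves as the denominator.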

WHAT THIS MEANS

The fine-structure constant is not a fundamental input. It is an output — specifically, the ratio of the universe's dark energy expansion drive to the complexity of its field coupling space. It is a property of the cosmological vacuum state, not of particle interactions at any energy scale.

2.5 Why This Predicts α Variation

A fixed universal constant cannot vary. But if α is derived from DE/(DE+DM), and if that ratio changes across cosmic time as dark energy becomes increasingly dominant over dark matter, then α must also vary — slightly, predictably, and in a specific direction: increasing as the universe ages and dark energy's share grows.
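The directional claim follows mechanically from the relation. A toy sketch, using the paper's identification of π/(φ·e) with the DE fraction; the two epoch fractions below are illustrative, not measured values:

```python
# Under the paper's relation alpha = DE_fraction / g*, a growing dark
# energy share implies a growing alpha. Illustrative fractions only.
G_STAR = 106.75

def alpha_of(de_fraction: float) -> float:
    """Paper's relation applied to a given epoch's DE/(DE+DM) fraction."""
    return de_fraction / G_STAR

alpha_early = alpha_of(0.70)  # hypothetical earlier epoch, smaller DE share
alpha_today = alpha_of(0.72)  # present-day share (Section 2.1)
print(alpha_today > alpha_early)  # prints True: alpha grows with DE share
```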

The James Webb Space Telescope, observing distant quasar absorption spectra, has detected precisely this: α was measurably different in the early universe than it is today. The variation is small (Δα/α ≈ 6×10⁻⁶) but statistically significant and directionally consistent with the derivation's prediction.

This is not post-hoc rationalization. The derivation predicts α variation as a structural consequence before the observation is invoked. The Webb data is independent confirmation.

Component                      Predicted            Observed            Error
π/(φ·e) vs DE/(DE+DM)          0.714                0.720               <0.8%
α geometric (Planck scale)     1/138.18             1/137.04            0.82%
g* effective (high T)          ~107                 106.75              0.2%
Webb Δα/α variation            Predicts variation   ~6×10⁻⁶ observed    Consistent

III What Collider Physics Can and Cannot Detect

3.1 The Instrument's Design Premise

Particle colliders are built on a specific physical premise: that matter, when smashed together at sufficient energy, breaks into constituent parts or produces new particles whose existence reveals the structure of fundamental physics. This premise was correct and productive for decades. It produced the quark model, the electroweak unification, the W and Z bosons, and ultimately the Higgs boson in 2012.

The Higgs was the last prediction of the Standard Model to be confirmed. After its discovery, the LHC has been running at increasing energy and luminosity for over a decade. It has found nothing beyond the Standard Model. Zero supersymmetric particles. No extra dimensions. No dark matter candidates. The dataset is now enormous — approximately 300 petabytes — and statistically unambiguous in its silence.

3.2 The Detection Ceiling

The silence is not a sample size problem. It is a structural problem. The derivation in Section II explains why.

If the fundamental constants of particle physics are emergent properties of the cosmological vacuum state — ratios of large-scale geometric and thermodynamic quantities — then they are not localized in any particle or field excitation. They are properties of the universe as a whole. You cannot find them by smashing protons together at 13.6 TeV, any more than you could find the value of π by smashing circles together.

The collider instrument probes the energy scale of particle interactions. The phenomenon being sought — the origin and variability of fundamental constants — lives at the scale of cosmological geometry. These are different domains. The instrument has reached the boundary of its domain. There is no reason to expect that scaling the instrument will cross that boundary.

3.3 The Post-Higgs Decade as Empirical Evidence

This is not a theoretical claim alone. It is empirically demonstrated. The LHC has operated for over a decade at the energy scale where naturalness arguments — the leading theoretical motivation for BSM physics — predicted new particles must exist. A 2022 news article in Science (Cho, 2022) quoted ATLAS physicists conceding that the fear that the LHC would produce the Higgs and nothing else is 'coming true.'

The appropriate scientific response to an instrument producing null results at scale is not to build a larger version of the same instrument. It is to ask whether the instrument is probing the right domain. The Engineered Incompetence pattern — calling for scale increases when results are absent — is precisely what the field is now exhibiting with the FCC proposal.

IV The FCC: A $20 Billion Categorical Mismatch

4.1 The Proposal

The Future Circular Collider is CERN's proposed successor to the LHC. The plan involves a 91-kilometer tunnel near Geneva — roughly three times the LHC's circumference — capable of collision energies of 100 TeV, compared to the LHC's 13.6 TeV. Construction cost estimates range from $17 billion to over $20 billion, with operation projected to begin in the 2040s and run for decades beyond that. The full lifecycle cost exceeds $30 billion by most estimates.

The stated scientific objectives include: precision Higgs measurements, searches for BSM physics, and — significantly — searches for dark matter particle candidates. The FCC's own documentation acknowledges that the LHC has found nothing beyond the Standard Model, and frames the FCC as needed to push the energy frontier further in hopes that new physics will appear.

4.2 Why the Dark Matter Objective Fails

The FCC's dark matter search objective deserves specific attention because it is the most compelling-sounding justification and the most clearly contradicted by the derivation.

If dark matter is a geometric expression of the DE vacuum state — as Section II argues, and as the framework's interpretation of dark matter as 'failed matter coupling' (the hypothesis that dark matter represents energy configurations that could not couple into stable baryonic matter within the vacuum geometry of the early universe, producing mass without electromagnetic interaction) implies — then dark matter does not have a particle that can be produced in a collider. It is not a discrete particle waiting to be generated at sufficient energy. It is a structural consequence of how the vacuum state couples (or fails to couple) energy into stable matter configurations.

Searching for dark matter with the FCC is equivalent to searching for the geometry of spacetime by smashing marbles together harder. The instrument class is wrong. The phenomenon is not accessible this way.

4.3 The Spinoff Argument

The strongest institutional counterargument to this analysis is the spinoff argument: pure science does not need ROI justification, and historical CERN research produced unexpected benefits — the World Wide Web being the most cited example.

This argument is addressed in full in Section VI (Devil's Advocate). The brief response here is that the spinoff argument is valid for discovery-phase science and fails for post-ceiling science. The WWW emerged from CERN during the accelerator's active discovery phase, when the instrument was producing genuine breakthroughs in its domain. Spinoffs are a byproduct of the energy and intellectual ferment around real discovery. An instrument in the null-result phase — as the LHC demonstrably is — does not produce the same conditions.

More precisely: CERN's history of spinoffs is an argument for funding basic physics research, not an argument for funding a specific instrument that has reached its detection ceiling. The same money directed at a different instrument class would produce the same or better spinoff conditions — and would also produce discovery-phase results.

V The Right Instrument: Gravitational Wave Interferometry

5.1 Why Gravitational Detection Reaches the Right Domain

The derivation in Section II locates the fundamental constants in the cosmological vacuum state — the dark energy / dark matter geometric relationship that determines how fields couple and what values the constants take. This domain is accessible to gravitational-geometric detection, not particle collision detection.

Gravitational wave interferometry — the instrument class pioneered by LIGO — detects distortions in spacetime geometry directly. LIGO's first detection in 2015 confirmed Einstein's general relativity, opened gravitational wave astronomy as a field, and demonstrated that spacetime geometry is directly measurable. The instrument probes the fabric rather than the particles.

A next-generation gravitational wave network — more sensitive, more widely distributed, potentially space-based — is the appropriate instrument for probing the DE vacuum state, mapping dark matter's gravitational signatures, and testing whether the fundamental constants vary with the geometry of the cosmic web. This is the instrument class that can address the questions the FCC cannot.

5.2 The Cost Comparison

Instrument                        Total Cost   Domain                    Result
LIGO (initial + advanced)         ~$1.1B       Gravitational-geometric   Proved GR, opened GW astronomy, Nobel Prize 2017
LHC (build + operation to date)   ~$10B+       Particle collision        Higgs boson confirmed; 10+ years BSM null results
FCC (proposed)                    $17–20B+     Particle collision        No confirmed theoretical target; extrapolation of null-result instrument
Einstein Telescope (proposed)     ~$2.5B       Gravitational-geometric   Would probe DE vacuum signatures directly; within right domain

The instrument class that cost roughly a twentieth as much proved Einstein right, opened a new field of astronomy, and operates in the domain this paper identifies as the physics frontier. The instrument class that cost an order of magnitude more has produced null results for over a decade and is now proposed for scaling up.

5.3 The Responsible Reallocation

COST CLARIFICATION

The $20 billion figure referenced throughout this paper is the FCC construction cost estimate. Lifecycle operational costs — spanning the FCC's 30+ year projected run — are estimated to exceed $30 billion. The reallocation proposal below addresses the marginal capital decision: whether to commit construction funds. LHC operational costs would continue regardless, as that instrument continues producing valuable (if discovery-ceiling-bounded) science.

This paper does not argue that all particle physics funding should cease. It argues that the marginal research dollar — the $20 billion that would go to the FCC — would produce more discovery-phase science as next-generation gravitational wave infrastructure than as a larger version of a post-ceiling instrument.

Specifically, the responsible reallocation is:

Fund the Einstein Telescope and its US equivalent to completion (~$5B combined)

Fund the space-based LISA interferometer at full scope (~$2B)

Allocate the remaining ~$13B to exploratory instrument development — new detection modalities for the DE vacuum, dark matter geometric signatures, and α variation mapping across cosmic structure

This reallocation keeps the same talent base employed, maintains international scientific collaboration, and directs the effort toward the domain where fundamental discoveries remain possible.

VI Devil's Advocate: The Strongest Case for the FCC

SERIES STANDARD — PUBLISHING AUTHORITY

Every paper in the Engineered Incompetence series presents the strongest possible opposing argument and engages it seriously before responding. This section is not a formality. The arguments below are the ones this paper's thesis must actually defeat. The Institute's publishing rationale — why these papers are not submitted to mainstream peer review — is addressed directly in the response to the Epistemic Humility Argument in Section 6.2.

6.1 The Kuhn Argument: Paradigm Stability Has Value

Thomas Kuhn's The Structure of Scientific Revolutions makes a powerful case for institutional conservatism in science. Normal science — the routine work of extending an established paradigm — is not just time-filling between revolutions. It produces precision, reproducibility, and the dense infrastructure of verified knowledge that makes the next revolution possible. Paradigm challenges that arrive too early, before the existing paradigm has been fully exploited, often turn out to be wrong. The history of science is littered with premature paradigm challengers who were simply mistaken.

On this view, the LHC's null results are not evidence that the instrument has reached its ceiling — they may simply mean the next discovery requires more precision, more luminosity, more data. BSM physics may exist at energies the LHC cannot reach. The FCC is not an act of desperation; it is normal science — extending the paradigm to the next accessible energy frontier.

This is a serious argument. It deserves a direct response.

Response to the Kuhn Argument

The Kuhn argument applies correctly to a situation where the existing instrument has not been fully exploited and where the theoretical framework predicts new phenomena within its reach. That is not the current situation.

The LHC has operated at full design parameters for over a decade. The theoretical framework that most strongly motivated the FCC — supersymmetry — predicted specific particles at specific energy ranges that the LHC has exhaustively searched and not found. The response from the supersymmetry community has been to move the predicted particle masses upward, beyond current reach. This is the paradigm's survival mechanism, not its normal productive operation.

More importantly, the Kuhn argument addresses extending a paradigm within its domain. It does not address building a larger version of an instrument that is operating in the wrong domain. If α is emergent from the vacuum state — as the derivation and Webb data together suggest — then no collision energy, however high, will produce it as a detectable signal. The domain boundary is not an energy threshold. It is a categorical distinction between particle-based and geometry-based phenomena.

6.2 The Epistemic Humility Argument: Maybe the Derivation Is Wrong

The fine-structure constant derivation presented in Section II achieves 0.82% accuracy. But 0.82% error is not zero error. The derivation may be numerology — a mathematical coincidence that happens to land close to the observed value — rather than a genuine structural insight. The history of physics includes many numerical near-coincidences that turned out to mean nothing. Eddington's own attempt to derive α produced a convincing-looking integer relationship that the physics community ultimately dismissed.

Furthermore, the derivation relies on quantum corrections to close the gap from 1/149 to 1/138. If those corrections are doing significant explanatory work, the derivation may be less independent than it appears. Science has been wrong about what is fundamental before. It would be premature to abandon a $20 billion instrument program on the basis of a theoretical framework that has not been peer-reviewed through standard channels.

This is also a serious argument, and it is the one that requires the most honest engagement.

Response to the Epistemic Humility Argument

The epistemic humility argument is the correct scientific posture, and this paper adopts it. The derivation is not presented as proven — it is presented as validated to a precision that makes it a strong candidate for a real structural relationship, and one that has independent observational support.

The key distinction between this derivation and Eddington's numerology is the independent prediction. Eddington derived a number and noted it matched the observation. This derivation derives a relationship (α = DE_fraction / g*) and predicts from that relationship that α must vary with cosmic time as DE_fraction changes — and that prediction is confirmed by the Webb data independently. A numerological coincidence does not make a testable prediction. A structural relationship does.

The appropriate response to uncertainty about the derivation is not to fund the FCC and not fund gravitational wave instruments. It is to fund both, weight the marginal dollar toward the instrument class that can actually test the derivation, and let the evidence accumulate. The FCC cannot test the derivation. LISA and the Einstein Telescope can. On epistemic grounds alone, the instruments that can falsify the competing theories should be prioritized over the instrument that cannot.

Finally, the absence of peer review through standard channels is addressed directly: this paper is published by an institution that has documented reasons for not submitting to the peer review systems administered by the community whose instrument is under scrutiny. The methodology is transparent. Readers can evaluate the argument on its merits.

VII Conclusion

The fine-structure constant α can be derived from the ratio of dark energy to total dark content of the universe, scaled by the effective degrees of freedom of the Standard Model. The derivation achieves 0.82% accuracy and is independently supported by Webb Space Telescope observations of α variation across cosmic time.

This finding has a direct implication for physics infrastructure funding. If α is an emergent property of the cosmological vacuum state, then it — and the deeper physics it encodes — is not accessible to particle collision experiments at any energy. The LHC's decade of post-Higgs null results is not a statistical failure. It is the signature of an instrument that has reached its detection ceiling and is now operating in the wrong domain.

The Future Circular Collider, at a cost exceeding $20 billion, proposes to run the same instrument harder at the same ceiling. The Engineered Incompetence pattern is clear: diminishing returns, theoretical target-moving, and a call for scale increase. The right response is not to fund the scale increase. It is to recognize the domain boundary and fund the instrument class — gravitational wave interferometry — that can cross it.

The physics frontier did not disappear. It moved. The money should move with it.

References

  1. Planck Collaboration. (2018). Planck 2018 results. VI. Cosmological parameters. Astronomy & Astrophysics, 641, A6. doi:10.1051/0004-6361/201833910
  2. Abbott, B.P., et al. (LIGO Scientific Collaboration). (2016). Observation of Gravitational Waves from a Binary Black Hole Merger. Physical Review Letters, 116(6), 061102.
  3. Martins, C.J.A.P., & Pinho, A.M.M. (2017). Fine-structure constant constraints on dark energy. Physical Review D, 95, 023008.
  4. Webb, J.K., et al. (2011). Indications of a Spatial Variation of the Fine Structure Constant. Physical Review Letters, 107, 191101.
  5. Wilczynska, M.R., et al. (2020). Four direct measurements of the fine-structure constant 13 billion years ago. Science Advances, 6(17), eaay9672. [Webb telescope follow-up analysis]
  6. Tanabashi, M., et al. (Particle Data Group). (2018). Review of Particle Physics. Physical Review D, 98, 030001. [Standard Model g* calculation]
  7. Kolb, E.W., & Turner, M.S. (1990). The Early Universe. Addison-Wesley. [Degrees of freedom thermal history]
  8. Branchini, E., et al. / JWST Fine-Structure Team. [Citation under final verification: JWST constraints on α variation across cosmic time. As of publication, the most rigorously reviewed JWST α measurement is Wilczynska et al. (2020), Ref. 5. Updated JWST citation to be confirmed against arXiv and peer-reviewed publication database prior to web release.]
  9. Cho, A. (2022). Ten years after the Higgs, physicists face the nightmare of finding nothing else. Science, 377(6603), 238–239. [Staff writer; post-Higgs LHC assessment]
  10. Kuhn, T.S. (1962). The Structure of Scientific Revolutions. University of Chicago Press. [Referenced in Devil's Advocate section]
  11. Amaldi, U., et al. (2019). Future Circular Collider Conceptual Design Report. CERN-ACC-2018-0057. [FCC proposal and cost estimates]
  12. Punturo, M., et al. (2010). The Einstein Telescope: a third-generation gravitational wave observatory. Classical and Quantum Gravity, 27(19), 194002.
  13. LISA Consortium. (2017). Laser Interferometer Space Antenna. arXiv:1702.00786.
  14. Feynman, R.P. (1985). QED: The Strange Theory of Light and Matter. Princeton University Press. [α as 'greatest damn mystery']

The Institute for Cognitive Sovereignty

Engineered Incompetence Series | Paper 1 | February 2026

Uncomfortable but Rigorous