Iris Long
Underground Intelligence
In this underground world, colossal technological objects are beginning to construct a cosmos of their own. A subterranean realm known as the “fourth territory” houses many of the world’s largest underground laboratories, most dedicated to dark matter searches, earth sciences, and radioactive waste containment—a parallel universe of ceaseless computation beneath the earth’s surface.
Since the Future Circular Collider (FCC) project in Europe was suspended, both because of budgetary issues (the budget had already swelled to 1.5 times its original $10 billion estimate) and because differing tech policies and an increasingly fragmented Europe made it difficult to reach a consensus on investments in fundamental science, only a handful of global economies can still afford to construct and operate underground scientific mega-experiments such as accelerators and neutrino detectors. With dwindling funding for fundamental research, some ultrasensitive screening detectors from previous generations of scientific infrastructure have been repurposed for geological surveys, microbiology, environmental science, and even national defense; others have been opened as tourist attractions. The High-Luminosity Large Hadron Collider (HL-LHC), operational since 2029, has triggered a “computational crisis”: its integrated luminosity, proportional to the total number of particle collisions accumulated over a given period, has risen to ten times the LHC’s design value, surpassing the computational capacity currently available within the EU. That is to say, even if Europe were to reach a consensus on building the next-generation accelerator, the computing infrastructure within Europe could not support its operation. In China, however, a surplus of computational infrastructure built during the last cycle of AI development now powers data processing for underground experiments.
The United States has renewed federal funding for underground experiments, driven largely by the successful application of quantum computing to accelerator and particle physics, including detector-operation algorithms that enable rapid cross-validation between particle simulations run on quantum computers and experiments run on accelerators. China, meanwhile, has completed construction of the CEPC (Circular Electron Positron Collider), the world’s largest circular collider, buried more than a hundred meters underground. Known as another “Higgs Factory” (the FCC, had it been built, would have been one as well), the collider was ultimately sited at nodes overlapping China’s newly built Future Science Network (CNEI), maximizing its access to the network’s computational power. Notably, special optical fibers designed for quantum transmission, mass-produced five years ago, have been deployed extensively throughout this network. If we were to draw a heat map of high-energy equipment and megascale scientific facilities underground, we would see intense hotspots in China and the United States, with the rest of the world relatively mild or even cold. Consequently, discoveries about cosmic origins, dark matter, and the subatomic world can emerge only from countries with the requisite geography, energy, technology, and funding.
Because fundamental physics has a long return cycle, the developers of Silicon Valley’s latest algorithmic models show little interest in “backend” fields like foundational physics, leaving a substantial number of older-generation algorithms in use within scientific facilities. The latest AI models have learned to construct predictive vector spaces from minimal data samples, whereas the models used in particle physics still rely on the vast data sets typical of previous generations. These “small data” models cannot yet provide coherent explanations of particle physics.
Meanwhile, these underground laboratories generate immense amounts of data daily. Unlike the previous generation of accelerators, this generation is set to produce, over a decade-long experimental program, more than one million Higgs bosons, one hundred million W bosons, and nearly a trillion Z bosons, a process significantly aided by deep learning. Data gathered from particle accelerators and similar devices is characterized by its ultra-high dimensionality: each detector contains millions of sensor elements, and their signals must be combined to produce meaningful results. Algorithms have recently developed the capacity to estimate particle energies and positions from low-level raw data, assisting in rapid formula validation. Although the opacity of deep learning still renders its conclusions unreliable in the eyes of many physicists, these algorithms are forming an intelligence entirely different from human thought. The type and scale of the data used to nurture it bear no resemblance to anything human. It studies particles, starting from the most fundamental building blocks, to comprehend the universe’s essence; it has no concept of “human,” yet it has cultivated unique insights into the underlying laws that structure the cosmos, surpassing anything humans have realized. It not only attempts to verify string theory, supersymmetry models, or the Standard Model and its many extensions, but has also begun to propose particle models of its own: for the first time in history, we are witnessing a reversal in fundamental science.
Our pace of proposing physical models can no longer match the speed at which machines generate them. Here, an interesting inversion has occurred: in the past, large-scale technological apparatuses were built to repeatedly verify theoretical models that had already been proposed; now, these machines generate theoretical models of their own. The final frontier of technological geopolitics, its sublime, lies not in political or commercial victories but in the clash of competing methods for interpreting the universe. Each newly proposed fundamental model holds the potential to be transformed into energy development, economic applications, and even national defense capabilities. It is worth noting that elementary particles themselves do not constitute a resource whose flow can be restricted. China and the US are essentially on equal footing when it comes to algorithms, computational infrastructure, and the energy and labor conditions that support these facilities. These large-scale fundamental science facilities autonomously and equitably generate explanatory models of the universe. How those models will be applied, however, has become a wager on the future.
Another unsettling phenomenon is also emerging: many of these new models of the universe appear to be, to some extent, explainable. In other words, these underground facilities can not only precisely control the birth of particles but also reveal more than one way to explain them—suggesting that a “theory of everything” might not exist. Perhaps it never has.
Iris Long is a writer and independent curator based between London and Beijing.
Yingjing Xu
Donkey-Like Martian Rovers Are About to Enter Service!