The AI Bubble Isn’t What You Think (It’s Not About Hype, It’s About Physics)

By Jeremy Benjamin | 2nd December 2025 | Quantum, Data Centre, AI, Bubble

There has been a lot of discussion lately about whether we are in an “AI Bubble”.

We all feel it. Technology is advancing at a velocity that human biology simply wasn’t built to track. We are building systems of such complexity that “fragility” has become a feature, not a bug. Just look at the recent headlines: one day, a transformer fire at a substation takes down Heathrow; the next, a configuration error ripples through AWS, taking half the internet with it. These are the cracks in the dam.

Whenever the term “AI Bubble” is batted around, people usually picture a dot-com-style crash: a moment when users simply stop caring and the hype evaporates. But I don’t believe that’s where the danger lies.

I am an optimist: AI is ultimately a force for good. Governments and bad actors will abuse it—that is human nature—but the utility is undeniable. The bubble won’t burst because people stop using AI. It will burst because the economics of building it are about to hit a wall.

The bubble isn’t in the software. It’s in the concrete, the silicon, and the balance sheets. Bear with me—this isn’t as sexy as a stock market crash, but it’s far more critical.

The CapEx Cliff

Traditionally, infrastructure is a long game. You borrow vast sums to fund Capital Expenditure (CapEx), building assets that pay you back over decades.

  • Civil Infrastructure (Bridges, Railways): These are built to last 50 to 100 years.
  • Data Centre Shells: The physical buildings and power grids are typically depreciated over 20 to 30 years.

The deal with the banks is simple: the asset stays useful long enough to pay off the loan and generate a profit.

Here is the problem: AI is moving so fast that it is breaking this fundamental bargain. The infrastructure required to deploy AI, specifically the GPUs and TPUs inside those data centres, is ageing in dog years. While a building lasts decades, state-of-the-art AI chips are now effectively obsolete within 3 to 4 years.

We are pouring billions into infrastructure that depreciates faster than we can possibly monetise it. We are building cathedrals to house technology that will be junk before the paint is dry.
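To make the mismatch concrete, here is a back-of-envelope straight-line depreciation sketch in Python. The $1bn figures are illustrative assumptions, not real project numbers; the asset lives are the ones quoted above:

    # Straight-line depreciation: spread the cost evenly over the useful life.
    def annual_write_off(capex: float, useful_life_years: float) -> float:
        return capex / useful_life_years

    # Hypothetical $1bn spent on each layer of a data centre build-out.
    shell = annual_write_off(1_000_000_000, 25)  # building + power: ~25-year life
    chips = annual_write_off(1_000_000_000, 4)   # accelerators: ~4-year life

    print(f"Shell write-off: ${shell / 1e6:.0f}m per year")
    print(f"Chip write-off:  ${chips / 1e6:.0f}m per year")
    print(f"Per dollar invested, the silicon burns value {chips / shell:.1f}x faster.")

On these assumptions, a dollar in the racks must earn itself back roughly six times faster than a dollar in the walls. And if the chips are financed on a loan longer than four years, you are still servicing debt on hardware that has already been replaced.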

The Quantum Obsolescence

This brings us to the second trap: cooling. Modern AI chips run ferociously hot; a single flagship accelerator now dissipates 700 watts or more from a die a few centimetres across. Data centre designs have pivoted aggressively toward liquid cooling and massive HVAC systems to keep these silicon brains from melting.

But just as we master this, the next horizon is already visible: Quantum Computing.

I believe AI is accelerating the arrival of Quantum. But here is the kicker: quantum computers don’t just need to be “cool”. The superconducting designs leading the field must be held at millikelvin temperatures, a whisker above absolute zero. The infrastructure required for cryogenics is fundamentally different from the warm-water cooling loops used for today’s AI.
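How different? A rough thermodynamic sketch makes the gap visible. The Carnot limit sets the minimum work needed to pump one watt of heat from a cold stage up to room temperature; the temperatures below are representative assumptions, not vendor specifications:

    # Carnot limit: minimum input work per watt of heat removed from the cold
    # side, rejecting to room temperature. W/Q = (T_hot - T_cold) / T_cold.
    T_ROOM = 300.0  # kelvin

    def min_work_per_watt(t_cold_k: float) -> float:
        # If the loop runs hotter than the room, heat flows out for free.
        return max(0.0, (T_ROOM - t_cold_k) / t_cold_k)

    for label, t_cold in [("Warm-water AI loop (~45 C)", 318.0),
                          ("Chilled-water loop (~7 C)", 280.0),
                          ("Dilution fridge (~15 mK)", 0.015)]:
        print(f"{label:27} >= {min_work_per_watt(t_cold):9,.2f} W per W removed")

Even at this ideal limit, a millikelvin cold stage demands roughly twenty kilowatts of work for every watt it removes, and real dilution refrigerators fall orders of magnitude short of Carnot. A cryogenic hall is not a retrofit of a warm-water loop; it is a different building.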

As soon as our thirst for Generative AI is quenched, we will pivot to the exponential power of Quantum. The result? Vast fields of “AI-ready” data centres could become stranded assets—ultra-expensive facilities that are physically incapable of hosting the next generation of hardware without being gutted.

The Fix: Adaptability or Bust

How do we stop the bubble from bursting? We have to stop building static monuments and start building adaptive organisms.

Data centres need to be designed for transformation, not just optimisation. This requires a level of collaboration that currently doesn’t exist. We need the hyperscalers (Google, Microsoft, Amazon, Meta) to sit at the table with university researchers, cooling-system designers, and construction firms to agree on standards that allow facilities to evolve.

Right now, the secrecy required to protect Intellectual Property is strangling this collaboration. Everyone is building their own walled garden, unaware that the ground beneath them is shifting. If we don’t start sharing the blueprints for the future, we’re going to be left with a lot of very expensive, very empty buildings.
