AI Crash Report, Episode 2: The Physics of the Collapse
Even If the Money Were Real, the Physics Don’t Work
February 24, 2026

🎬 WATCH THE FULL EPISODE:
[▶ AI CRASH REPORT: THE PHYSICS OF THE COLLAPSE — Watch on YouTube]
In Episode 1 of the AI Crash Report, I showed you the financial architecture of the AI bubble—money moving in circles, investment capital cycling through cloud contracts and chip purchases to manufacture the appearance of explosive demand.
But here’s the thing that keeps me up at night: even if every dollar were real, even if trillions in genuine customer demand were waiting to be served, the AI industry still couldn’t deliver what it’s promising.
That’s the physics problem. And it’s insurmountable.
I’ve spent 35 years in Hollywood and on Madison Avenue learning how to manufacture perception. I know what it looks like when the story being told no longer matches the facts on the ground. The AI industry is telling a story about unlimited potential and inevitable transformation. The physics tell a different story entirely—one of hard constraints, impossible timelines, and bets on science fiction.
Act 1: The Energy Wall
Let me give you a sense of scale.
The proposed Stargate project, OpenAI's distributed network of data center campuses backed by SoftBank and Oracle, meant to power the next generation of AI, requires 10 gigawatts of electrical power. That's the output of 10 nuclear power plants. The power needs of a mid-sized country. As Patrick Boyle points out, that's equivalent to the electricity consumption of roughly 26 million European homes, or 8 to 10 million American homes. One project.
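Those home-equivalence figures check out on the back of an envelope. Here's the arithmetic, using rough average household consumption numbers (the per-home figures are my own approximations, not from the episode):

```python
# Sanity check: how many homes does 10 GW of continuous power correspond to?
# Assumed annual consumption averages (approximate, not from the episode):
#   US household:       ~10,700 kWh/year
#   European household:  ~3,700 kWh/year
HOURS_PER_YEAR = 8760

stargate_gw = 10
stargate_kwh_per_year = stargate_gw * 1e6 * HOURS_PER_YEAR  # GW -> kW -> kWh/yr

us_homes = stargate_kwh_per_year / 10_700
eu_homes = stargate_kwh_per_year / 3_700

print(f"US homes:       {us_homes / 1e6:.1f} million")   # ~8 million
print(f"European homes: {eu_homes / 1e6:.1f} million")   # ~24 million
```

Close enough to Boyle's figures that the scale is not in dispute: one project, the continuous draw of a mid-sized country.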
And let’s pause on the branding for a moment. Stargate. They’re making a warehouse full of overheating silicon sound like a multi-dimensional portal. Can you see what I’m talking about? The linguistic mirage engineering never stops.
Now, pop quiz: how many nuclear power plants have been built in the United States in the last 30 years?
Exactly one. Plant Vogtle in Georgia took over a decade to complete and went billions over budget. One plant in 30 years. The AI industry needs 10 for this single project, and they need them now, not a decade from now.
But it’s not just about generating the power—it’s about getting it where it’s needed. The grid interconnection queue in the United States, the waiting list to connect new facilities to the electrical grid, currently exceeds five years. Even if you had the power generation capacity, you couldn’t plug it in.
Ed Zitron drives this home: Oracle signed a deal to build 4.5 gigawatts of data center capacity for OpenAI, and meanwhile, Nvidia has shipped enough GPUs to require between 6 and 12 gigawatts of computing load. But the physical infrastructure to install and power those chips simply doesn’t exist.
Some companies are attempting workarounds. Patrick Boyle reports that Elon Musk’s xAI data center in Memphis is running gas turbines without proper permits, degrading local air quality and triggering formal EPA violations. When companies start violating environmental regulations to power their operations, it tells you something about how desperate the situation has become.
Act 2: The Depreciation Trap
The energy problem is compounded by something even more fundamental: what these data centers are actually made of.
Traditional data centers are mostly buildings and land—real estate. Banks understand how to finance real estate over 30 years. The asset holds value. AI data centers are different. According to Patrick Boyle, approximately 70% of their cost is chips. GPUs. And those chips wear out physically over 7 to 10 years, but become obsolete much sooner—usually in about three years, six years at the outside.
This creates what Boyle calls the financing problem. Banks will not accept rapidly depreciating GPUs as collateral for long-term loans. The asset will be worthless before the loan is repaid.
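A minimal sketch shows why the collateral math fails. The 3-year obsolescence window is Boyle's; the fleet cost, straight-line depreciation, and 10-year loan terms are hypothetical numbers I've chosen for illustration:

```python
# Straight-line depreciation of a GPU fleet vs. the balance on a long-term loan.
# Illustrative assumptions (not from the episode): $1B fleet, 3-year useful
# life to obsolescence, 10-year loan repaid in equal annual installments.
fleet_cost = 1_000_000_000
useful_life_years = 3      # Boyle's typical obsolescence window for GPUs
loan_term_years = 10

for year in range(1, loan_term_years + 1):
    collateral = max(0, fleet_cost * (1 - year / useful_life_years))
    loan_balance = fleet_cost * (1 - year / loan_term_years)
    print(f"Year {year:2d}: collateral ${collateral/1e6:5.0f}M "
          f"vs loan balance ${loan_balance/1e6:5.0f}M")
```

By year 3 the collateral is worth nothing while 70% of the loan is still outstanding. Whatever numbers you plug in, a 3-year asset cannot secure a 10-year loan, and that is the whole problem.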
Let that land. The core infrastructure of the so-called AI revolution can’t be financed through normal banking channels because the banks understand the assets are essentially disposable.
Ed Zitron draws a critical contrast with the dot-com bubble. When that bubble burst, the overbuilt infrastructure was fiber optic cable with a 15- to 24-year lifespan. Much of the dark fiber from the 1990s eventually lit up and became useful. GPUs don’t work that way. They’re consumables, not infrastructure. When the AI bubble bursts, the data centers won’t sit idle waiting for demand to catch up. They’ll be filled with obsolete chips that need to be replaced.
And here’s the number that should stop you cold: Ed Zitron predicts that at least 25% and possibly 50% or more of AI data centers will go unused. His reasoning? Excluding the hyperscalers who rent to themselves, the actual market for renting GPU capacity is less than $1 billion. But companies signed $178 billion in data center deals in the US alone in 2025. The gap between what’s being built and what the market can absorb isn’t a rounding error. It’s more than two orders of magnitude.
Act 3: What Is AI, Really?
There’s a reason the AI industry doesn’t like to talk about physics. They prefer to talk about magic—or at least what sounds like magic. You’ve heard the language: AI that thinks, AI that understands, AI that might become conscious. The implication is that we’re building something that transcends normal technological limits, something that justifies any level of investment because the payoff is infinite.
Cal Newport, an MIT PhD computer scientist, cuts through this mysticism with precision. Here’s what a large language model actually is: a vast table of numbers—parameters—that defines a sequence of mathematical operations. Once training is complete, those numbers are static. They do not change, learn, or update while the model is being used. The operation is matrix multiplication. Input numbers go in, get multiplied through layers of parameters, output numbers come out.
That’s it. It’s math. A lot more than nothing, but not magic.
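To make "a table of numbers plus matrix multiplication" concrete, here's a toy sketch. The shapes and the softmax step are the standard mechanics Newport describes; the vocabulary and the random weights are made up for illustration (a real model stacks many such layers and has billions of parameters, but the operations are the same):

```python
import numpy as np

# A toy "language model": an embedding table, one weight matrix, and an
# output projection. All three are frozen after training.
rng = np.random.default_rng(0)
vocab = ["the", "cat", "sat", "on", "mat"]      # hypothetical vocabulary
d = 8                                            # hidden dimension

E = rng.normal(size=(len(vocab), d))             # embedding table (static)
W = rng.normal(size=(d, d))                      # one layer of weights (static)
U = rng.normal(size=(d, len(vocab)))             # output projection (static)

def next_token_probs(token_id: int) -> np.ndarray:
    """Numbers in, matrix multiplies, numbers out. Nothing updates."""
    h = E[token_id] @ W                          # matrix multiplication
    logits = h @ U
    exp = np.exp(logits - logits.max())          # softmax over the vocabulary
    return exp / exp.sum()

probs = next_token_probs(vocab.index("cat"))
print({word: round(float(p), 3) for word, p in zip(vocab, probs)})
# E, W, and U never change at inference time: the same input always
# produces the same probability distribution over next tokens.
```

That determinism is the point. There is no mechanism in this loop for insight, wanting, or updating a belief. There is only the multiply.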
Newport is clear about what this architecture cannot do. The model cannot have spontaneous insights. It cannot have drives or wants. It does not have a world model—an internal representation of how reality works that it can use to plan and reason. It predicts the next token based on patterns in its training data. That prediction can be remarkably useful. It can even appear intelligent. But it’s not thinking in any meaningful sense.
This matters because the entire business case for AI depends on it becoming something more than a sophisticated autocomplete. Newport reports that efforts in 2025 to build agentic AI—programs that can plan and execute complex tasks autonomously—largely failed. The models lack the world models and planning capabilities required to operate reliably in open-ended environments.
This is crucial. The agents were supposed to be the breakthrough that justified the trillion-dollar valuations. Agents that could replace knowledge workers. Agents that could automate entire industries. The agents didn’t work anywhere near that level.
Watch for the pattern: when pre-training gains stalled, the industry pivoted to post-training efficiency. When that didn’t deliver, they pivoted to reasoning models. When reasoning models underperformed, they pivoted to agents. Now that agents have disappointed, they’ll pivot to something else. Each pivot is a retreat disguised as an advance. The destination keeps receding because the fundamental technology—matrix multiplication on static parameters—cannot do what’s being promised.
Act 4: The Fusion Fantasy
So how does the AI industry plan to solve the energy problem? They’re betting on nuclear fusion. Of course they are.
Sam Altman has personally invested $375 million in Helion Energy, a fusion startup, and serves as its chairman. Microsoft has signed a power purchase agreement to buy fusion-generated electricity from Helion starting in 2028.
Let me be clear about what this means: they’re planning to power data centers, within three years, with a technology that has never achieved commercial net energy gain. Nuclear fusion has been “20 years away” for the last 60 years. I grew up near the Lawrence Livermore Lab and remember hearing this promise when I was in elementary school.
It was only in late 2022 that, for the first time, a fusion reaction at the National Ignition Facility briefly yielded more energy than the laser pulse that triggered it (though far less than the total energy the facility drew from the grid). Exciting, but even the most optimistic projections from fusion researchers don’t anticipate commercial-scale power generation for at least a decade, and that assumes breakthroughs that haven’t happened yet.
Patrick Boyle puts it bluntly: relying on fusion to power data centers by 2028 is not a plan. It’s science fiction being used as a business strategy.
Here’s what the fusion bet actually accomplishes: pointing to a miraculous future solution helps executives avoid answering hard questions about how their current business model is incompatible with present energy realities. How will you power the data centers? Fusion. When will fusion be ready? Soon. What if it’s not?
The fusion fantasy isn’t a plan. It’s a deflection—or worse, a call for blind faith.
The Synthesis: Why This Matters for You
Here’s where all of this converges:
The AI industry needs 10 gigawatts of power for a single major project. The US has built one nuclear power plant in 30 years. Grid interconnection takes five years. The chips filling these data centers lose 60–70% of their value in three to six years—faster than the loans can be repaid. Banks won’t finance them. The technology itself is matrix multiplication, not magic, and the breakthrough agents that were supposed to justify everything failed to materialize. The response to all of this amounts to magical thinking: a bet on nuclear fusion by 2028.
In Episode 1, I showed you the financial structure—money moving in circles. In this episode, I’ve shown you the physical constraints—infrastructure that can’t be built on time, assets that depreciate too fast, technology that can’t deliver what’s promised.
The question becomes: when do these constraints become undeniable? When does the gap between promise and reality force a reckoning?
In the final episode of the AI Crash Report, I’ll examine the crash scenarios—what triggers the collapse, what the timeline might look like, and what the consequences could be for anyone invested in the broader market.
🎬 Watch the full episode with all source material linked in the description:
[▶ AI CRASH REPORT: THE PHYSICS OF THE COLLAPSE — Watch on YouTube]
I’m Julian Whatley. And now you see it.
If you found this analysis valuable, share it with someone who needs to see the physics behind the promises. Subscribe for Episode 3: the crash scenarios.