FrontierNews.ai

The AI Energy Crunch Is Forcing Tech Giants to Rethink Everything: Here's What Five Industry Leaders Say

The artificial intelligence boom is running into a wall, and it's not just about chips anymore. At the Milken Global Conference in Beverly Hills, five architects of the AI economy revealed that the industry faces cascading bottlenecks far deeper than most realize. Chip shortages will persist for the next two to five years, energy constraints are forcing companies to explore orbital data centers, and some researchers are questioning whether the foundational technology stack itself is sustainable.

Why Is the AI Industry Suddenly Hitting Physical Limits?

The problem starts with manufacturing. Christophe Fouquet, CEO of ASML, the Dutch company that produces the extreme ultraviolet lithography machines essential for modern chip production, stated plainly that despite massive acceleration in chip manufacturing capacity, "for the next two, three, maybe five years, the market will be supply limited." This means hyperscalers like Google, Microsoft, Amazon, and Meta simply won't receive all the chips they've ordered, no matter how much they're willing to pay.

The demand is staggering. Francis deSouza, Chief Operating Officer of Google Cloud, revealed that Google Cloud's revenue crossed $20 billion last quarter with 63% growth, while its backlog of committed but undelivered revenue nearly doubled in a single quarter, jumping from $250 billion to $460 billion. That gap between what companies want and what they can actually get is the new reality of the AI economy.

But chip scarcity is only the first constraint. Energy consumption is the bottleneck looming behind it. AI data centers alone accounted for 27 gigawatts, or 43%, of total corporate power procurement in 2025, and aggregate U.S. data center capital spending is projected to approach $500 billion in 2026. This explosive growth is testing grid limits and forcing tech giants to pursue clean, baseload power solutions aggressively.
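As a quick back-of-envelope check of those figures (an illustrative calculation, not one from the conference): if AI data centers' 27 gigawatts represented 43% of corporate power procurement, total corporate procurement in 2025 was on the order of 63 gigawatts.

```python
# Back-of-envelope check of the procurement figures quoted above.
ai_dc_gw = 27.0   # AI data center power procurement, 2025 (GW)
ai_share = 0.43   # stated share of total corporate procurement

total_gw = ai_dc_gw / ai_share
print(f"Implied total corporate procurement: {total_gw:.0f} GW")  # ~63 GW
```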

How Are Companies Responding to Energy Constraints?

DeSouza confirmed that Google is exploring data centers in space as a serious response to energy constraints. "You get access to more abundant energy," he noted, though the engineering challenges are formidable. Space is a vacuum, which eliminates convection cooling, leaving radiation as the only way to shed heat into the surrounding environment. This is a much slower and harder-to-engineer process than the air and liquid cooling systems that terrestrial data centers rely on today.
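The Stefan-Boltzmann law makes the scale of that challenge concrete: it bounds how much heat a radiator can shed per unit area. The sketch below is an illustrative estimate under simplified assumptions (a flat radiator at 300 K with emissivity 0.9, radiating from one side to deep space), not an engineering figure from Google:

```python
# Stefan-Boltzmann estimate of radiator area for an orbital data center.
# Illustrative assumptions: flat radiator at 300 K, emissivity 0.9,
# radiating from one side to a ~0 K background.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area_m2(heat_load_w: float, temp_k: float = 300.0,
                     emissivity: float = 0.9) -> float:
    """Area needed to radiate a given heat load at a given temperature."""
    flux_w_per_m2 = emissivity * SIGMA * temp_k ** 4  # ~413 W/m^2 at 300 K
    return heat_load_w / flux_w_per_m2

# Even a modest 1 MW facility needs roughly 2,400 m^2 of radiator area.
print(f"{radiator_area_m2(1e6):,.0f} m^2")
```

Raising the radiator temperature helps sharply (heat flux scales with the fourth power of temperature), but hotter radiators mean hotter chips, which is its own engineering trade-off.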

Google's broader strategy centers on efficiency through integration. By co-engineering its full AI stack, from custom TPU chips through to models and agents, the company achieves significantly better energy efficiency than competitors buying off-the-shelf components. "Running Gemini on TPUs is much more energy efficient than any other configuration," deSouza explained, because the chip designers know what the model will need before the silicon ships.

The clean energy sector is responding to this demand surge. Global investment in clean energy is projected to reach $3.8 trillion by 2030, driven by three converging forces: AI power demand, electrification of transportation, and expansion of sustainable aviation fuel capacity. However, a critical bottleneck remains: decades of underinvestment have left transmission and distribution networks strained, creating a key constraint on the pace and scale of the clean energy transition.

What Are the Emerging Bottlenecks in AI Infrastructure?

  • Chip Supply Constraints: Despite massive manufacturing acceleration, the market will remain supply-limited for two to five years, meaning demand far exceeds available capacity and will continue to do so.
  • Energy Demands Outpacing Grid Capacity: AI data centers consumed 43% of total corporate power procurement in 2025, and grid modernization has not kept pace with explosive growth in data center deployment.
  • Cooling and Heat Dissipation Challenges: Traditional air and liquid cooling systems are reaching their limits, forcing exploration of unconventional solutions like orbital data centers where radiation is the only cooling mechanism.
  • Real-World Data Collection Bottlenecks: For physical AI systems like autonomous vehicles and defense equipment, the constraint isn't computing power but real-world data that can only be gathered by deploying machines and observing what happens.

Qasar Younis, co-founder and CEO of Applied Intuition, a $15 billion physical AI company, highlighted a different bottleneck altogether. Applied Intuition builds autonomy systems for cars, trucks, drones, mining equipment, and defense vehicles. For these applications, silicon isn't the constraint; it's data. "You have to find it from the real world," Younis explained, noting that "there will be a long time before you can fully train models that run on the physical world synthetically."

Is the Current AI Architecture Fundamentally Flawed?

While the rest of the industry debates scale and inference efficiency within the large language model paradigm, some researchers are questioning whether the entire approach is sustainable. Eve Bodnia, a quantum physicist who left academia to challenge foundational AI architecture at her startup Logical Intelligence, is building something fundamentally different.

Her company uses energy-based models, a class of AI that doesn't predict the next token in a sequence but instead attempts to understand the rules underlying data, in a way she argues is closer to how the human brain actually works. "Language is a user interface between my brain and yours," Bodnia said. "The reasoning itself is not attached to any language." Her largest model runs to 200 million parameters, compared to the hundreds of billions in leading large language models, and she claims it runs thousands of times faster while being designed to update its knowledge as data changes rather than requiring retraining from scratch.

For domains like chip design and robotics where a system needs to grasp physical rules rather than linguistic patterns, Bodnia argues energy-based models are the more natural fit. "When you drive a car, you're not searching for patterns in any language. You look around you, understand the rules about the world around you, and make a decision," she explained. This perspective is likely to attract more attention as the AI field begins asking whether scale alone is sufficient.
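A toy sketch can make the general idea concrete. This is purely illustrative of how energy-based models frame inference, and is not Logical Intelligence's actual system: instead of predicting the next token, an EBM assigns an energy score to whole configurations and treats inference as descending toward a low-energy, rule-consistent state.

```python
import numpy as np

# Toy energy-based model (illustrative only, not Logical Intelligence's
# system). The "rule" encoded here: a valid state x = (a, b) must satisfy
# b = 2a. The energy function is zero exactly when the rule holds.

def energy(x: np.ndarray) -> float:
    a, b = x
    return (b - 2.0 * a) ** 2

def energy_grad(x: np.ndarray) -> np.ndarray:
    a, b = x
    r = b - 2.0 * a
    return np.array([-4.0 * r, 2.0 * r])

def infer(x0: np.ndarray, lr: float = 0.05, steps: int = 500) -> np.ndarray:
    """Inference = gradient descent on the energy landscape."""
    x = x0.astype(float).copy()
    for _ in range(steps):
        x -= lr * energy_grad(x)
    return x

x = infer(np.array([1.0, 5.0]))  # start from a rule-violating state
print(x, energy(x))              # converges to a state where b ≈ 2a
```

The point of the toy: the model never generates a sequence; it repairs a state until it satisfies the learned rule, which is the behavior Bodnia argues maps more naturally onto physical domains.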

Christophe Fouquet offered a sobering observation about the economics of this moment: "Nothing can be priceless." The industry is investing extraordinary amounts of capital, driven by strategic necessity, but more compute means more energy, and more energy has a price. The question is whether the current trajectory is economically and physically sustainable.

The convergence of these constraints, supply limitations, energy demands, and architectural questions suggests the AI economy is entering a new phase. The era of unlimited scaling may be ending, forcing companies to choose between efficiency, sustainability, and raw capability. How the industry responds to these hard physical limits will shape the next decade of artificial intelligence development.