The Space Data Center Race Has a Critical Blind Spot: Nobody's Talking About Power
The orbital data center boom is missing its most essential ingredient: a reliable power source. While Elon Musk predicts space will become the cheapest place to run artificial intelligence (AI) within 36 months and companies like LoneStar announce plans for lunar data centers, the industry has largely ignored a fundamental engineering challenge: nobody is seriously discussing where the electricity to power these ambitious projects will come from.
This oversight mirrors a crisis already unfolding on Earth. Data center power consumption is becoming the primary bottleneck limiting AI infrastructure growth. The International Energy Agency estimates that data center electricity consumption could double by 2030, reaching as much as 945 terawatt-hours annually, roughly the total electricity consumption of Japan today. Meanwhile, McKinsey forecasts that $6.7 trillion in data center investment will be required by 2030 to support the AI buildout, with hyperscalers such as Amazon, Microsoft, Alphabet, and Meta spending approximately $410 billion on data center infrastructure in 2025 alone.
Why Is Power the Real Limiting Factor for AI Data Centers?
The energy crisis is no longer theoretical. In 2026, the four largest tech companies are expected to spend between $650 billion and $700 billion on data center infrastructure, a roughly 60 percent increase in a single year. Gas turbines are sold out through 2030, and power purchase agreements for new nuclear capacity are being signed in blocks of hundreds of megawatts. The terrestrial electrical grid simply cannot keep up with demand.
This reality has forced hyperscalers to take matters into their own hands. Google recently made a major acquisition of Intersect Power, a renewable energy developer, to scale massive solar and storage portfolios. Microsoft and other tech giants are signing multi-gigawatt nuclear power purchase agreements directly with energy providers. The message is clear: waiting for the grid to expand is no longer an option.
For orbital data centers, the energy problem becomes even more complex. Space presents unique advantages for power generation that Earth-based facilities cannot match. A solar panel in orbit produces roughly five times as much electricity as the same panel on Earth, with no atmosphere, weather, or day-night cycle to interfere with energy production. There are no interconnection queues and no permitting delays. For a data center operator, this represents a fundamental economic advantage that completely changes the financial equation.
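The "roughly five times" figure is easy to sanity-check with a back-of-envelope calculation. The numbers below are illustrative assumptions, not figures from the article: the solar constant above the atmosphere, near-continuous sunlight in a favorable orbit, and a strong terrestrial site with a 25 percent capacity factor.

```python
# Back-of-envelope check of the ~5x orbital solar claim.
# All inputs are illustrative assumptions, not figures from the article.
SOLAR_CONSTANT_W_M2 = 1361      # irradiance above the atmosphere
ORBIT_AVAILABILITY = 0.99       # assumed near-continuous sunlight, rare eclipse
GROUND_PEAK_W_M2 = 1000         # standard test condition irradiance at ground
GROUND_CAPACITY_FACTOR = 0.25   # assumed excellent terrestrial solar site

HOURS_PER_YEAR = 8760

orbital_kwh_per_m2 = SOLAR_CONSTANT_W_M2 * ORBIT_AVAILABILITY * HOURS_PER_YEAR / 1000
ground_kwh_per_m2 = GROUND_PEAK_W_M2 * GROUND_CAPACITY_FACTOR * HOURS_PER_YEAR / 1000

ratio = orbital_kwh_per_m2 / ground_kwh_per_m2
print(f"orbital: {orbital_kwh_per_m2:,.0f} kWh per m^2 per year")
print(f"ground:  {ground_kwh_per_m2:,.0f} kWh per m^2 per year")
print(f"ratio:   {ratio:.1f}x")  # about 5.4x, consistent with the ~5x claim
```

Under less generous terrestrial assumptions the ratio climbs further, which is why the advantage survives even large error bars on the inputs.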
What Energy Technologies Are Being Developed for Space?
The good news is that the energy infrastructure needed for orbital data centers is already being developed, primarily for terrestrial clients. These technologies will eventually be adapted for space deployment. Three key energy solutions are emerging:
- Advanced Solar Cells: Silicon solar cells are approaching their physical efficiency ceiling of 29 to 30 percent. Perovskite-silicon tandem cells represent the next generation, with companies like Tandem PV working to bridge the gap between record-breaking laboratory efficiency and real-world reliability on Earth, laying the foundation for space qualification.
- Nuclear Radioisotope Power Systems: For environments where continuous solar power is impractical, such as lunar polar craters or high-inclination orbits, nuclear radioisotope power systems are not one option among many; they are the only viable option. Zeno Power is already proving the concept in seabed and lunar environments, with both commercial demand and NASA programs driving hardware advancements.
- Fusion Propulsion: The economic case for orbital infrastructure depends heavily on the cost of launching mass into space. Fusion propulsion has the potential to be far more efficient than chemical rockets, with specific impulse values orders of magnitude better than current technology. NASA and DARPA are already investing in fusion propulsion programs, and the same companies developing fusion reactors for Earth will build the orbital drives.
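The case for radioisotope power in permanently shadowed craters comes down to predictability: output decays smoothly with the isotope's half-life, so a decade of power can be sized at launch. A minimal sketch, using the well-known Pu-238 half-life of about 87.7 years and an assumed 500 W generator (illustrative, not a Zeno Power specification):

```python
# Radioisotope power decays predictably with the isotope's half-life,
# which is why it suits multi-year missions without sunlight.
# Assumed values: Pu-238 half-life ~87.7 years, 500 W electrical at launch.
HALF_LIFE_YEARS = 87.7
P0_WATTS = 500.0

def power_after(years: float) -> float:
    """Electrical output after `years`, ignoring converter degradation."""
    return P0_WATTS * 0.5 ** (years / HALF_LIFE_YEARS)

for t in (0, 5, 10, 20):
    print(f"year {t:>2}: {power_after(t):6.1f} W")
```

Even after 20 years the sketch retains the large majority of launch power, the property no solar-plus-battery system can match through a months-long lunar polar night.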
The practical implication is significant: reducing launch costs through fusion propulsion could make it far more economically feasible to build infrastructure in space, including data centers.
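The link between specific impulse and launch economics follows from the Tsiolkovsky rocket equation. The sketch below uses illustrative values not taken from the article: chemical propulsion at roughly 450 s of specific impulse versus a hypothetical fusion drive at 10,000 s, over a representative 6 km/s orbital transfer budget.

```python
# The Tsiolkovsky rocket equation shows why specific impulse (Isp) drives
# launch economics: higher Isp means far less of the vehicle is propellant.
# Isp and delta-v values are illustrative assumptions.
import math

G0 = 9.81  # standard gravity, m/s^2

def propellant_fraction(delta_v_m_s: float, isp_s: float) -> float:
    """Fraction of initial mass that must be propellant to achieve delta_v."""
    return 1.0 - math.exp(-delta_v_m_s / (isp_s * G0))

DV = 6000.0  # m/s, a representative orbital transfer budget

chem = propellant_fraction(DV, 450)       # chemical rocket
fusion = propellant_fraction(DV, 10_000)  # hypothetical fusion drive

print(f"chemical (Isp 450 s):    {chem:.1%} of mass is propellant")
print(f"fusion   (Isp 10,000 s): {fusion:.1%} of mass is propellant")
```

The remaining mass budget is what can be payload and structure, which is where the economics of hauling data center hardware into orbit are decided.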
How Should Investors and Companies Prepare for the Orbital Power Challenge?
The space industry's default approach treats energy as a downstream problem, something to address after the architecture is established. The same sequencing is already causing delays in terrestrial data center deployment. Fixing this requires deliberate action in two specific areas:
- For Investors: Examine your space portfolio for energy exposure, not as an environmental, social, and governance (ESG) line item, but as an operational dependency. The critical question is simple: if grid access disappears, which of your portfolio companies stop working? Energy infrastructure should be treated as a foundational position, not an opportunistic add-on. Companies solving next-generation power on Earth, such as advanced solar manufacturers and radioisotope power developers, are the same ones that will power the orbital economy.
- For Hyperscalers: Microsoft, Google, and Amazon already have the procurement infrastructure, legal templates, and internal mandates to sign multi-gigawatt nuclear power purchase agreements on Earth. The missing step is applying that same muscle to orbital energy systems. Concretely, hyperscalers should open a dedicated request for proposal (RFP) track for space power suppliers alongside existing data center energy procurement. Bringing radioisotope power companies, advanced solar manufacturers, and power management specialists into conversations currently limited to launch providers and satellite operators would immediately change the equation. Including a power density requirement in any orbital infrastructure RFP, equivalent to what ASHRAE standards do for terrestrial data centers, would bring the right companies to the table.
The budget exists. What is missing is the internal mandate to treat orbital power as a 2026 problem rather than a 2032 one.
Will Musk's 36-Month Timeline Actually Happen?
Elon Musk's prediction that space will become the cheapest place to run AI within 36 months will be proven right or wrong by one critical variable: whether the energy system is ready. The computing power will almost certainly be available. The open question is whether sufficient energy infrastructure will exist to power orbital data centers when they become necessary.
Meanwhile, on Earth, a startup called Soma Energy is tackling the immediate power crisis facing data centers. The company, founded by the team that built and operated energy systems at Amazon Web Services, recently raised $7 million in seed and pre-seed funding to deploy AI that optimizes existing grid capacity in real time. Rather than waiting for new power plants to be built, which takes five to ten years, Soma Energy's platform unlocks capacity already present in the grid but currently underutilized, delivering power to energy-intensive facilities in months rather than years.
"Building new generation and upgrading transmission operates on timelines of five to ten years and requires hundreds of billions in capital investment. This timeline is fundamentally incompatible with the pace of AI infrastructure deployment," said Villi Iltchev, Partner at Category Ventures.
The company is already optimizing two gigawatts of capacity for power-producing clients and is actively working with five data center customers. For data centers, Soma Energy connects on-site generation, storage, and load into a single control layer, transforming large facilities into flexible grid assets that can unlock additional capacity from existing infrastructure.
The convergence of these trends points to a clear conclusion: the future of AI infrastructure, whether on Earth or in orbit, depends less on computing power and more on energy innovation. Companies and investors that treat power as a foundational problem rather than an afterthought will be the ones that actually build the AI economy of the future.