The $2 Trillion Data Center Insurance Gap: Why AI's Power Boom Is Outpacing Wall Street's Safety Net
The insurance industry is struggling to keep pace with the explosive growth of AI data centers, with roughly one-third to one-half of the total project value at hyperscale campuses now sitting outside traditional insurance coverage. Annual investment in data centers could surpass $300 billion by 2027, according to S&P Global Ratings, yet the insurance market that has long underpinned large construction projects is hitting hard limits on how much risk it can absorb.
The problem is simple but staggering in scale: data center project values have climbed from a range of $1 billion to $2.5 billion just two years ago to between $5 billion and $25 billion today. That rapid escalation has broken a fundamental industry norm of insuring projects to their full replacement value. "You can't really buy $20 billion insurance on a $20 billion project," explained Sedat Kunt, national builders' risk practice leader at Marsh McLennan.
Why Are Data Centers So Hard to Insure?
Insurers have shifted from covering full project costs to using a metric called probable maximum loss, or PML, which estimates the largest likely loss from specific perils like fire or severe weather. For a $10 billion data center project, Kunt noted that his starting insurance limit would be $2.5 billion, with typical policy limits ranging between $1.5 billion and $3.5 billion across the industry.
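The arithmetic behind this shift can be sketched as follows. The 25 percent starting ratio comes from Kunt's $10 billion example, and the clamping range reflects the typical $1.5 billion to $3.5 billion policy limits cited above; actual underwriting is far more involved, so treat this as an illustration only.

```python
def pml_based_limit(project_value_bn: float,
                    starting_ratio: float = 0.25,
                    floor_bn: float = 1.5,
                    cap_bn: float = 3.5) -> float:
    """Estimate a builders' risk policy limit from probable maximum loss.

    Instead of insuring to full replacement value, the limit starts at a
    fraction of project value (Kunt's example: 25% of a $10B project),
    then is kept within the typical industry range of $1.5B-$3.5B.
    All figures are in billions of dollars.
    """
    raw_limit = project_value_bn * starting_ratio
    return max(floor_bn, min(cap_bn, raw_limit))

# Kunt's example: a $10B project yields a $2.5B starting limit
print(pml_based_limit(10.0))

# A $20B project is clamped to the typical $3.5B ceiling,
# leaving most of the project value outside this policy
print(pml_based_limit(20.0))
```

The gap the article describes falls directly out of this structure: as project values grow past $10 billion, the insured limit stays capped while the uncovered remainder balloons.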
The geographic concentration of these facilities amplifies the risk calculation. More than 40 percent of U.S. data center capacity is located in areas significantly exposed to tornado risk, with more than one-quarter exposed to substantial hail risk. These environmental factors directly influence how insurers model and price coverage, making traditional underwriting approaches inadequate for the scale of modern hyperscale campuses.
"Data centers are not just buildings, they are highly integrated systems where you have power, cooling, hardware and software that all depend on each other," explained Jimmy Keime, head of engineering and nuclear at Swiss Re.
The coverage gap is further complicated by how builders' risk insurance is traditionally structured. Most policies cover only the core and shell of a building. Once owners and tenants begin installing high-value computing hardware, that equipment is covered under separate operational property programs, creating fragmentation in the insurance landscape.
How Are Hyperscalers and Insurers Bridging the Coverage Gap?
- Layered Coverage Structures: Projects increasingly rely on multiple carriers participating in quota share arrangements, where insurers split the risk so no single carrier commits too much capacity to one project.
- Phasing Endorsements: Large, multi-phase campuses use endorsements that allow coverage to roll off completed portions while construction continues elsewhere, helping manage costs and complexity.
- Specialized Partnerships: Developers work with specialty insurers like FM Global, known for stringent building-design requirements, to secure higher limits and holistic coverage strategies that span both construction and operational phases.
- Calculated Risk Retention: Hyperscalers are increasingly retaining portions of the uninsured risk themselves, depending on their financing requirements and risk tolerance.
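The quota share arrangement in the first item above can be sketched as a simple proportional allocation: each carrier commits a fixed percentage of the policy limit, and premiums and losses are shared in the same proportions. The carrier names and share percentages below are illustrative, not drawn from any actual placement.

```python
def quota_share_split(policy_limit_bn: float,
                      shares: dict[str, float]) -> dict[str, float]:
    """Split a policy limit across carriers by quota share.

    Each carrier takes its agreed percentage of the limit, so no single
    insurer commits too much capacity to one project; losses and
    premiums are shared in the same proportions.
    """
    total = sum(shares.values())
    if abs(total - 1.0) > 1e-9:
        raise ValueError(f"shares must sum to 100%, got {total:.0%}")
    return {carrier: round(policy_limit_bn * pct, 4)
            for carrier, pct in shares.items()}

# Hypothetical three-carrier panel behind a $2.5B builders' risk limit
panel = {"Carrier A": 0.40, "Carrier B": 0.35, "Carrier C": 0.25}
print(quota_share_split(2.5, panel))
```

In practice these panels can involve many more participants, and layered (excess-of-loss) structures are often stacked on top of the quota share, but the proportional split is the core mechanism.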
Amy Iannone, insurance and risk management leader at DPR Construction, noted that the compressed timeline between preconstruction and groundbreaking has added urgency to these arrangements. "The time from preconstruction to construction has shortened substantially, so we develop strategies to get project-specific insurance programs built quickly," she stated.
What Does This Mean for the Global Insurance Market?
The scale of the opportunity is attracting significant capital. S&P Global Ratings estimates that roughly 11,000 data centers are currently in operation globally, representing a total insurable asset base of more than $2 trillion. The firm projects that rising demand could generate $10 billion in new insurance premiums in 2026 alone, roughly twice the annual premiums generated by the global aviation insurance market.
Swiss Re similarly estimates that the global premium pool tied to data centers could grow from about $10.6 billion today to $24.2 billion by 2030. However, S&P Global Ratings cautions that no single carrier can absorb these risks alone, making collaborative structures and alternative capital increasingly central to how the market functions.
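Swiss Re's projection implies a growth rate that can be checked with a quick calculation. Treating "today" as 2025 is an assumption on our part, since the article does not state the baseline year.

```python
# Implied compound annual growth rate of the data center premium pool,
# from Swiss Re's figures: ~$10.6B today to ~$24.2B by 2030.
# The 2025 baseline year is an assumption, not stated in the article.
start_bn, end_bn = 10.6, 24.2
years = 2030 - 2025
cagr = (end_bn / start_bn) ** (1 / years) - 1
print(f"Implied growth: {cagr:.1%} per year")
```

Under that assumption, the premium pool would need to compound at roughly 18 percent annually, which underscores why no single carrier can absorb the growth alone.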
"The biggest change we see right now is the size and scale of these data centers. Two years ago, maybe $1 billion to $2.5 billion in size. Now they seem to be anywhere between $5 billion to $25 billion," said Sedat Kunt.
The insurance gap has direct implications for capital formation and project financing. Lenders are pressing for limits that cover full construction costs, demands the market cannot meet at commercially viable rates. This mismatch is forcing a fundamental rethinking of how mega-infrastructure projects get financed and who bears the risk when catastrophic events occur.
Patricia Kwan, a primary analyst for S&P's data center insurance research, emphasized that the gap reflects both capacity limits and underwriting discipline. "The gap is really on those hyperscale data centers, and the gap will persist as long as they are that big and the insurance market is not providing those covers," she noted.
As AI infrastructure spending accelerates globally, the insurance industry faces a critical test: whether it can innovate fast enough to match the scale and complexity of the hyperscale data centers powering the next generation of artificial intelligence. The answer will shape not just insurance markets, but the pace at which AI infrastructure itself can be deployed.