The AI industry is burning through physical resources at a rate that would have seemed like science fiction five years ago, and the bill is only beginning to come due.
The Electricity Problem Nobody Is Pricing In
Training a frontier model is not cheap in any dimension. Estimates put the training compute for GPT-4 at roughly 50 gigawatt-hours, enough to power about 4,600 average U.S. homes for an entire year. That was 2023. Models have since gotten larger and more expensive to train, not smaller.
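The homes-powered comparison is easy to sanity-check. A minimal sketch, assuming the commonly cited EIA figure of roughly 10,800 kWh of electricity per average U.S. home per year (that figure is my assumption, not from the article):

```python
# Back-of-the-envelope check of the "4,600 homes for a year" claim.
TRAINING_KWH = 50_000_000        # 50 GWh, the cited GPT-4 training estimate
KWH_PER_HOME_PER_YEAR = 10_800   # assumed U.S. residential average

homes_for_a_year = TRAINING_KWH / KWH_PER_HOME_PER_YEAR
print(f"{homes_for_a_year:,.0f} homes")  # on the order of 4,600
```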
Inference scales the problem into a different category entirely. ChatGPT reportedly handles over 10 million queries per day. Each query consumes approximately 10x the electricity of a standard Google search. Multiply that across every AI product deployed at scale (Copilot, Gemini, Claude, Grok) and you are looking at a power draw that is restructuring national energy policy.
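To see what those per-query figures imply in aggregate, here is a rough sketch. The ~0.3 Wh per Google search is a widely cited estimate I am assuming here; the query volume and 10x multiplier come from the article:

```python
# Rough daily inference energy implied by the figures above, for one product.
QUERIES_PER_DAY = 10_000_000  # cited ChatGPT volume
WH_PER_SEARCH = 0.3           # assumed energy of one Google search
AI_MULTIPLIER = 10            # cited ratio of AI query to search

daily_mwh = QUERIES_PER_DAY * WH_PER_SEARCH * AI_MULTIPLIER / 1_000_000
print(f"~{daily_mwh:.0f} MWh/day")  # roughly 30 MWh/day
```

That is one product; the point of the paragraph is that the total across every deployed AI product is far larger.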
Google, Microsoft, Meta, and Amazon collectively committed over $320 billion in capital expenditure in 2025, the majority directed at data center buildout. Microsoft alone announced $80 billion for fiscal 2025. These are not software numbers. These are the capex profiles of utility companies and real estate developers. That shift in the nature of AI spending is not being reflected in how most investors think about the category.
Water Is the Constraint Nobody Talks About
Data centers use enormous volumes of water to cool their servers. Microsoft's water consumption grew 34% between 2021 and 2022, before the current AI buildout even hit full speed. Google's water use rose 20% year over year in 2023. One estimate from researchers at UC Riverside and UT Arlington found that ChatGPT consumes roughly 500 milliliters of water per 20 to 50 prompts, a number that compounds into millions of gallons per day at scale.
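How a half-liter per conversation compounds is simple arithmetic. A sketch using the midpoint of the cited 20-to-50-prompt range and a round 1 billion prompts per day as an illustrative industry-wide volume (both of those scaling choices are my assumptions, not figures from the article):

```python
# How the per-conversation water figure compounds with query volume.
LITERS_PER_CONVERSATION = 0.5        # cited ~500 mL estimate
PROMPTS_PER_CONVERSATION = 35        # midpoint of the cited 20-50 range
PROMPTS_PER_DAY = 1_000_000_000      # assumed illustrative industry volume
LITERS_PER_GALLON = 3.785

daily_gallons = (PROMPTS_PER_DAY / PROMPTS_PER_CONVERSATION
                 * LITERS_PER_CONVERSATION / LITERS_PER_GALLON)
print(f"~{daily_gallons / 1e6:.1f} million gallons/day")
```

Under those assumptions the figure lands in the low millions of gallons per day, consistent with the claim above.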
This is not an abstract environmental concern. It is a supply chain risk. Data centers are competing with agriculture, municipalities, and manufacturing for access to water in regions already under stress. Arizona, Georgia, and parts of the Pacific Northwest, all major data center hubs, are facing regulatory pushback and permitting delays specifically tied to AI infrastructure water use. That friction translates directly into timeline risk for anyone building or leasing capacity.
What This Actually Means for the Stack
- Nuclear is back. Microsoft signed a 20-year deal to restart Three Mile Island to power its Virginia data centers. Google struck a deal with Kairos Power to deploy small modular reactors. This is not a climate story; it is an AI infrastructure story.
- Data center real estate is now a strategic asset. Equinix, Digital Realty, and Iron Mountain are trading at premiums that reflect scarcity, not just cash flow. Land near power substations with water access has become a category of its own.
- Power purchase agreements (PPAs) are the new moat. Hyperscalers are locking up gigawatts of generation capacity years in advance. Startups relying on spot market power pricing are structurally disadvantaged as demand outstrips supply.
- Geography is becoming a competitive differentiator. Countries with abundant hydro power (Canada, Norway, Iceland) are suddenly attractive AI infrastructure destinations. Singapore has imposed a moratorium on new data centers. Ireland is rationing grid capacity.
- Cooling innovation is a real investment category. Liquid cooling, immersion cooling, and purpose-built thermal management startups are seeing serious capital inflows. The air-cooled data center is becoming economically obsolete as chip densities climb.
The Investment Angle Most VCs Are Missing
I've watched the AI investment narrative focus almost entirely on application-layer plays and model providers. Both are interesting. Neither is where the most durable infrastructure value is accumulating. The companies that own the physical substrate (power, cooling, land, grid interconnection rights) are building moats that software cannot replicate.
The International Energy Agency projects that data centers will consume 1,000 terawatt-hours globally by 2026, roughly the current electricity consumption of Japan. That figure will continue climbing as AI workloads intensify. Grid infrastructure was not built for this. Transmission capacity is a 10-to-15-year build cycle. The mismatch between demand and physical infrastructure is the defining constraint of AI scaling for the next decade.
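For context on how large 1,000 TWh is, a quick sketch against total global electricity consumption, which I am assuming to be roughly 29,000 TWh per year (that global figure is my assumption, not from the article):

```python
# Putting the 1,000 TWh projection in global context.
DATA_CENTER_TWH = 1_000  # cited IEA projection for 2026
GLOBAL_TWH = 29_000      # assumed rough global electricity consumption

share = DATA_CENTER_TWH / GLOBAL_TWH
print(f"~{share:.1%} of global electricity")  # ~3.4%
```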
The smart money is already moving. Blackstone, Brookfield, and KKR have all made multi-billion-dollar bets on data center real estate and power infrastructure in the last 18 months. When private equity at that scale is buying physical AI infrastructure, the signal is clear: the scarcity value of these assets is real and will compound.
The next AI bottleneck is not compute or talent. It is electrons and acres, and neither scales at software speed.
Stay current with VC and startup trends at Value Add VC. Originally published in the Trace Cohen newsletter.