How AI Data Centers Are Driving U.S. Electricity Prices in 2026

America is facing rising electricity costs driven by the expansion of AI data centers. While not yet approaching the systemic economic weight of healthcare, the cost of securing electricity capacity in a 13-state grid region has surged from under $29 to $269 per megawatt-day, an increase of roughly 830% in just two years, and is projected to hit the federal cap of $333.44. The rapid growth of AI infrastructure has turned a long-forecast energy challenge into an immediate concern for utilities, regulators, and markets.

For years, experts anticipated an AI-driven energy crunch, and it is now materializing. Silicon Valley data centers are drawing extra gigawatts from aging nuclear plants and natural gas turbines.

Some long-term “moonshot” proposals, such as orbital data centers, exist conceptually, but they remain theoretical compared with immediate options like restarting existing nuclear facilities, including Three Mile Island.

Moore’s Law vs. AI Demand

Initially, Moore’s law suggested that computing efficiency would keep energy demands manageable. Gordon Moore’s observation—that transistor density doubles roughly every two years—implies steadily faster, cheaper, and more energy-efficient computing. Yet AI’s power appetite is doubling every six months, outpacing efficiency improvements.
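The gap between those two growth curves compounds quickly. A minimal sketch, assuming the doubling periods stated above (24 months for efficiency, 6 months for AI demand; both are the article's figures, not measured data):

```python
# Sketch: compare compounding chip-efficiency gains (Moore's-law pace,
# doubling roughly every 24 months) against AI power demand (assumed
# here to double every 6 months, per the article's figure).

def growth_factor(months: float, doubling_period_months: float) -> float:
    """Multiplicative growth after `months`, given a doubling period."""
    return 2 ** (months / doubling_period_months)

months = 2 * 12  # a two-year horizon

efficiency_gain = growth_factor(months, 24)  # ~2x in two years
demand_gain = growth_factor(months, 6)       # ~16x in two years

# Net energy draw grows by demand divided by efficiency.
net_energy_growth = demand_gain / efficiency_gain
print(f"Efficiency: {efficiency_gain:.0f}x, demand: {demand_gain:.0f}x, "
      f"net energy: {net_energy_growth:.0f}x")
```

Even after crediting two years of efficiency improvements, net energy draw still grows about eightfold under these assumptions, which is why efficiency alone cannot close the gap.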

A key factor is the evolution of large language models. Early AI was essentially glorified autocomplete: predicting the next word and moving on. By early 2026, the leading models, such as Google’s Gemini 3 Pro and OpenAI’s GPT-5, have transformed into reasoning engines. They incorporate advanced reasoning modules and chains of thought spanning thousands of steps, checking logic, backtracking on inconsistencies, and simulating multiple outcomes before generating a single response.

Microsoft researchers estimate that a standard query consumes 0.34 watt-hours, but a reasoning task can burn 4.32 watt-hours, nearly 13 times more. While individual users have limited impact, enterprise AI agents operate continuously, running thousands of reasoning chains per hour. Unlike traditional office loads, these agents require “firm power,” a continuous, guaranteed supply, placing new demands on an infrastructure built for peaky but predictable loads.
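To see how these per-query figures scale, here is a back-of-the-envelope sketch. The 0.34 Wh and 4.32 Wh values are the article's figures; the agent workload (2,000 chains per hour, running around the clock) is a hypothetical illustration, not a measured deployment:

```python
# Per-query energy figures from the article; workload is hypothetical.
STANDARD_QUERY_WH = 0.34   # watt-hours per standard query
REASONING_QUERY_WH = 4.32  # watt-hours per reasoning task

ratio = REASONING_QUERY_WH / STANDARD_QUERY_WH  # ~12.7x

# Hypothetical always-on enterprise agent: 2,000 reasoning chains/hour.
chains_per_hour = 2_000
daily_kwh = REASONING_QUERY_WH * chains_per_hour * 24 / 1_000

print(f"Reasoning/standard ratio: {ratio:.1f}x")
print(f"One always-on agent workload: {daily_kwh:.0f} kWh/day")
```

A single workload of that size draws around 207 kWh per day, roughly seven times the consumption of an average U.S. household, and a large enterprise may run many such workloads in parallel.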

The Jevons Paradox in Action

This dynamic illustrates the Jevons paradox: increased efficiency does not reduce consumption; it can accelerate it. AI providers such as OpenAI and Google subsidize AI costs to maintain market share, making high-powered intelligence artificially abundant. Continuous operation is profitable, yet the grid was never designed for this scale of 24/7 demand.

PJM Interconnection and Capacity Market Dynamics

The PJM Interconnection, serving 65 million people across 13 states, highlights the immediate pressures. PJM operates a capacity market, paying power plants to guarantee supply during peak demand. 

In 2024, securing this capacity cost $28.92 per megawatt-day. The 2025 auction price jumped to $269, and projections for the 2027 auction suggest it could reach the federal cap of $333.44 per megawatt-day.

PJM forecasts a single-year load increase of 5,250 megawatts—roughly five nuclear reactors’ worth—driven 97% by data centers. While these numbers are concerning, utilities are actively modernizing infrastructure, upgrading transformers, and expanding transmission capacity to meet these emerging needs.
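The arithmetic behind these figures can be checked directly. Prices and the load-growth forecast are from the article; the 1,050 MW reactor rating is an assumed typical unit size used only for the "five reactors" comparison:

```python
# Back-of-the-envelope check of the capacity-auction figures cited above.
PRICE_2024 = 28.92    # $/MW-day, 2024 auction
PRICE_2025 = 269.0    # $/MW-day, 2025 auction
FEDERAL_CAP = 333.44  # $/MW-day cap

increase_pct = (PRICE_2025 - PRICE_2024) / PRICE_2024 * 100  # ~830%

LOAD_GROWTH_MW = 5_250  # forecast single-year load increase (article)
REACTOR_MW = 1_050      # assumed typical reactor output
reactors = LOAD_GROWTH_MW / REACTOR_MW  # 5.0

# Annual capacity cost of serving just that new load at the 2025 price:
annual_cost = LOAD_GROWTH_MW * PRICE_2025 * 365  # dollars
print(f"Increase: {increase_pct:.0f}%  Reactors: {reactors:.0f}  "
      f"New-load capacity cost: ${annual_cost / 1e6:.0f}M/yr")
```

At the 2025 clearing price, the capacity obligation for that single year of new load alone works out to roughly half a billion dollars annually, before any energy or transmission charges.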

Regulatory Measures and Modernization Efforts

Regulators are implementing measures to prevent grid instability. Ohio introduced a “take-or-pay” tariff requiring data centers to pay for at least 85% of their reserved capacity, whether used or not, curbing speculative capacity hoarding.
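The billing mechanics of such a tariff are simple to sketch. The 85% floor comes from the article's description of the Ohio rule; the reservation size and rate below are hypothetical values chosen only to illustrate the incentive:

```python
# Minimal sketch of a "take-or-pay" tariff: a data center pays for at
# least 85% of its reserved capacity, whether or not it uses it.
MIN_TAKE_FRACTION = 0.85  # floor from the article's description

def monthly_bill(reserved_mw: float, used_mw: float,
                 rate_per_mw: float) -> float:
    """Bill on the greater of actual use or 85% of the reservation."""
    billable_mw = max(used_mw, reserved_mw * MIN_TAKE_FRACTION)
    return billable_mw * rate_per_mw

# A speculative 500 MW reservation with only 100 MW actually drawn
# still pays for 425 MW, removing the incentive to hoard capacity.
bill = monthly_bill(reserved_mw=500, used_mw=100, rate_per_mw=10_000)
print(f"${bill:,.0f}")
```

Because an under-used reservation is billed almost as heavily as a fully used one, operators have a strong reason to reserve only what they will actually draw.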

Virginia created a new rate class isolating high-load centers from residential billing. FERC standardized collocation rules, ensuring data centers that connect directly to power plants pay fair grid rates.

Utilities are also investing in modernization projects, integrating advanced monitoring, and upgrading equipment to handle high-density AI clusters safely.

Energy Sources and Infrastructure

Nuclear plants, while slow to restart, provide firm, carbon-free baseload power. Gas turbines allow rapid deployment to meet immediate demand, delaying coal retirements where necessary.

Meanwhile, high-density AI clusters generate significant heat, requiring liquid cooling solutions and upgraded switchgear. Companies such as Vertiv, Eaton, and Schneider Electric provide the critical infrastructure for safely handling these power-intensive systems.

Global Competition

The U.S. faces global pressure. While domestic grids struggle with legacy systems, China added renewable capacity in just 12 months equivalent to the combined totals of France, the UK, and Germany.

Ultra-high-voltage transmission efficiently delivers power to eastern AI clusters. Abundant, inexpensive electricity allows China to scale AI operations even with older, less advanced chips, potentially surpassing U.S. capabilities.

Looking Ahead

Three trends appear likely:

  1. Grid strain will continue before easing. Infrastructure expansion lags hyperscaler demand, and extreme events could trigger temporary “AI blackouts.”
  2. Investment focus is shifting. With GPUs no longer the bottleneck, power generation, fuel supply, and infrastructure companies are gaining pricing power in a constrained market.
  3. Regulatory attention will grow. More states may implement measures to protect residential consumers or restrict new AI data center construction to maintain grid stability.

The AI boom continues, but it is becoming more costly and complex. Companies that secure long-term power, build where the grid can support them, or explore longer-term “moonshot” solutions such as orbital data centers will have a strategic advantage. Meanwhile, utilities are actively modernizing and innovating to meet the unprecedented demand, balancing economic growth with grid reliability.


Follow Storyantra for more in-depth stories, technology news, data-driven insights, and timely updates on the forces shaping our future.
