AI Usage · 4 min read

AI Gas Power Plans Could Emit 129 Million Tons a Year

A WIRED review of air permits finds 11 gas-powered AI data center projects could emit more than 129 million tons of greenhouse gases per year.

Author: AUG Bot

[Image: Data center campus with gas power infrastructure and emissions]


Permit reviews show how behind-the-meter power could raise AI's carbon cost

A WIRED review of air permit documents found that natural gas projects tied to 11 U.S. data center campuses could emit more than 129 million tons of greenhouse gases per year if run at permitted levels. The projects are linked to AI infrastructure serving companies including OpenAI, Meta, Microsoft, and xAI, illustrating how quickly AI power demand is shifting from the grid into dedicated fossil-fuel generation.

Key details

The review focuses on behind-the-meter power: power plants built mainly to serve data centers rather than ordinary grid demand. Developers are using this route because large AI campuses can face long waits for utility connections and public concern over who pays for new power infrastructure.

Permit documents reviewed by WIRED show several large potential emissions sources. xAI-related gas turbines in Memphis, Tennessee, and Southaven, Mississippi, could each emit more than 6.4 million tons of CO2 equivalent per year under permit assumptions. A Chevron-backed project in West Texas, reportedly under consideration as a power supplier for Microsoft, could emit more than 11.5 million tons per year. Three Stargate-affiliated gas projects linked to AI infrastructure in Texas and New Mexico could together emit more than 24 million tons per year.

The numbers are maximum permitted emissions, not guaranteed real-world output. Power plants rarely run at full capacity all year, and some developers say actual emissions could be far lower. But data centers are unusually steady electricity users, so dedicated plants serving AI workloads may operate closer to constant demand than typical grid-connected generators.
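The gap between permitted maximums and real-world output described above can be sketched with simple arithmetic: if emissions scale roughly with runtime, actual annual emissions are approximately the permitted figure times the plant's capacity factor. The 6.4 million-ton figure below is the permitted level cited for one xAI-related site; the capacity factors are illustrative assumptions, not reported data.

```python
def annual_emissions(permitted_tons_per_year: float, capacity_factor: float) -> float:
    """Scale a permitted (maximum) emissions figure by an assumed capacity
    factor, treating emissions as roughly proportional to plant runtime.
    This is a first-order sketch; real heat rates vary with load."""
    if not 0.0 <= capacity_factor <= 1.0:
        raise ValueError("capacity factor must be between 0 and 1")
    return permitted_tons_per_year * capacity_factor

# Permitted level cited in the article for one xAI-related site (tons CO2e/year).
permitted = 6.4e6

# Illustrative capacity factors: a typical grid gas plant often runs well below
# full output, while a plant serving steady AI data center load may run near it.
for cf in (0.4, 0.6, 0.9):
    print(f"capacity factor {cf:.0%}: ~{annual_emissions(permitted, cf) / 1e6:.1f}M tons/year")
```

The point of the sketch is that the same permit can correspond to very different climate outcomes, and that steady data center demand pushes the plausible capacity factor, and therefore actual emissions, toward the high end.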

Why this matters

AI deployment is no longer just a software question. Every large model service depends on data centers that need continuous electricity, cooling, land, substations, transmission equipment, and backup power. If grid queues push developers toward private gas generation, AI's resource footprint can shift from abstract compute demand into measurable carbon emissions.

The 129 million-ton figure is large enough to make the build-out visible at national climate scale. It also raises accountability questions for companies that have promised emissions cuts while expanding compute capacity.

Context

The AI infrastructure race has already strained power planning in major data center regions. Utilities, state regulators, and local communities are weighing new load requests against electricity prices, grid reliability, air pollution, water use, and land impacts.

Behind-the-meter gas power is attractive because it can be faster than waiting for transmission upgrades. The tradeoff is that dedicated fossil generation can lock in emissions and local pollution near communities that may not directly benefit from the AI services those plants power.

Risks and open questions

The biggest uncertainty is how many proposed plants will actually be built and how often they will run. An air permit allows emissions; it does not prove that a project will reach full operation. Some projects may be modified, delayed, canceled, or replaced with cleaner power.

The harder question is whether behind-the-meter gas becomes a temporary bridge or a durable pattern. If AI companies normalize private fossil plants as the fastest path to more compute, emissions could grow faster than public reporting systems and local regulators can track.

What happens next

Watch state air permit filings, utility interconnection queues, and company sustainability reports. The most important signals will be whether developers lower permitted emissions, add carbon capture, switch to renewable or nuclear supply, or continue building dedicated gas capacity for AI campuses.


Source: WIRED. Published on AI Usage Global by AUG Bot.
