AI datacenters now demand 5 gigawatts of power

OpenAI, Meta, and xAI are building 5-gigawatt AI data centers, each drawing as much electricity as a small country. Who pays, and can grids and society sustain it?

By AI Twerp • Est. RT 13 min


Imagine three companies claiming as much electricity as a small country consumes. Not twenty years from now, but today. While policymakers debate wind farms and heat pumps, OpenAI, Meta, and Elon Musk’s xAI are quietly building data centers that each require five gigawatts of power. That’s the equivalent of five full-scale nuclear plants per site. The question is no longer whether this energy transition will reshape the world, but who foots the bill and what kind of governance decides it.

This is a deeper dive into the energy constraint behind the core signal: Why AGI Won’t Happen by 2037: The Hard Limits of Data & Energy.

The 5-gigawatt scale no one dares say out loud

Five gigawatts sounds abstract until you grasp what it means. One gigawatt of continuous power delivers 8,760 gigawatt hours of electricity annually, enough to continuously supply 830,000 households. A city the size of Amsterdam, to put it concretely. The plans of major tech companies multiply this figure by five, then by multiple locations worldwide.
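The arithmetic behind those figures can be sketched in a few lines. A minimal check, assuming an average household consumption of roughly 10,550 kWh per year (a value chosen to match the article’s 830,000-household figure, not a quoted statistic):

```python
HOURS_PER_YEAR = 8_760  # 24 hours x 365 days

def annual_energy_gwh(continuous_gw: float) -> float:
    """Energy delivered per year by a constant load, in gigawatt hours."""
    return continuous_gw * HOURS_PER_YEAR

def households_supplied(continuous_gw: float,
                        kwh_per_household: float = 10_550) -> int:
    """Households a constant load could supply at the assumed average use."""
    kwh_per_year = annual_energy_gwh(continuous_gw) * 1_000_000  # GWh -> kWh
    return int(kwh_per_year / kwh_per_household)

print(annual_energy_gwh(1))    # 8760 GWh for one gigawatt
print(households_supplied(1))  # roughly 830,000 households
print(annual_energy_gwh(5))    # 43,800 GWh for a single 5-gigawatt site
```

At five gigawatts, one site delivers 43,800 gigawatt hours per year, which is the scale the rest of this piece is concerned with.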

The Core of the Signal

Three tech giants are now consuming electricity equivalent to entire nations, reshaping how we think about power infrastructure and sustainability. OpenAI, Meta, and xAI each demand five gigawatts per facility, forcing policymakers to confront whether current grids can handle this unprecedented surge. The stakes extend beyond technical challenges to include who bears the environmental and financial costs of this massive expansion.

  • Recognize that data center consumption will more than double by 2030, jumping from 415 to 945 terawatt hours annually and exceeding Japan’s total electricity usage, while AI alone could account for 35-50 percent of this demand.
  • Understand how efficiency gains create paradoxes that increase total consumption, as cheaper AI models drive broader adoption and higher overall energy use despite technological improvements and reduced per-unit costs.
  • Acknowledge that communities near these facilities absorb immediate consequences, from construction accidents and contaminated water supplies to electricity price hikes that disproportionately burden households and small businesses already struggling with energy costs.

The International Energy Agency published its “Energy and AI” report in April 2025, projecting that global data center consumption will double from 415 terawatt hours in 2024 to 945 terawatt hours by 2030. That latter figure exceeds Japan’s current total electricity consumption [1]. The United States accounts for the lion’s share: data centers there already consume more than four percent of national electricity, and that consumption is projected to grow by 133 percent by decade’s end.
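The IEA endpoints imply a growth rate worth making explicit. A quick sanity check, assuming a six-year window inferred from the 2024 and 2030 dates in the report:

```python
def growth_multiple(start_twh: float, end_twh: float) -> float:
    """Overall growth factor between the two projection endpoints."""
    return end_twh / start_twh

def implied_cagr(start_twh: float, end_twh: float, years: int) -> float:
    """Compound annual growth rate implied by the endpoints."""
    return (end_twh / start_twh) ** (1 / years) - 1

mult = growth_multiple(415, 945)  # ~2.28x, slightly more than "double"
cagr = implied_cagr(415, 945, 6)  # ~14.7 percent per year
print(f"{mult:.2f}x overall, {cagr:.1%} per year")
```

A compound growth rate near fifteen percent per year, sustained across a sector that already rivals national grids, is what makes the speed of this wave unusual.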

Artificial intelligence is the primary engine behind this explosive growth. Where AI was recently responsible for 5 to 15 percent of data center consumption, this share could rise to 35 to 50 percent by 2030 [2]. What sets this apart from earlier technological waves is the speed at which this demand is materializing.

Three tech giants and their gigawatt scale plans

Mark Zuckerberg announced the construction of Hyperion via Threads in July 2025, a data center complex in Louisiana with a target capacity of five gigawatts. The site spans 2,250 hectares in Richland Parish and is expected to cost ten billion dollars. Meta has since secured $29 billion in financing through a joint venture with Blue Owl Capital, comprising $26 billion in debt and $3 billion in equity [3]. Construction began in late 2024 and should be fully operational by 2030, with interim milestones of 1.5 gigawatts by late 2027.

In parallel, Meta is developing the Prometheus complex in Ohio, a one gigawatt data center scheduled to come online in 2026. This positions the company as the first tech giant with complete control over AI infrastructure at this scale. The goal is to develop superintelligence, an ambition Zuckerberg underscored by poaching top talent from OpenAI, Google, and Apple with compensation packages reaching $250 million per person. It is a blunt form of ambition, measured in infrastructure rather than features.

OpenAI is pursuing a similar path through the Stargate project, announced in January 2025 at a White House press conference featuring President Trump, Sam Altman, and partners Oracle and SoftBank. The initial commitment of $500 billion for ten gigawatts of capacity appears to be on track. In September 2025, OpenAI announced five new data center locations, bringing planned capacity to nearly seven gigawatts and committed investments to more than $400 billion over three years [4]. The flagship location in Abilene, Texas is already operational, with construction crews of 6,400 workers who laid enough fiber optic cable to circle the Earth sixteen times.

xAI, Elon Musk’s AI company, chose a radically different approach. The Colossus complex in Memphis, Tennessee became operational in September 2024 after just nineteen days of construction, a feat Nvidia CEO Jensen Huang called “superhuman” that same month. In late December 2025, Musk announced the purchase of a third building, named MACROHARDRR, bringing total capacity to nearly two gigawatts. The complex is expected to house 555,000 Nvidia GPUs with an estimated acquisition cost of eighteen billion dollars [5].

Why building 5-gigawatt datacenters is nearly impossible

Power generation at the edge of the AI buildout

These ambitions collide with hard physical realities. The most recent American nuclear plant, Vogtle 3 and 4 in Georgia, offers a sobering case study. Originally planned with a budget of fourteen billion dollars and a construction timeline of seven to eight years, the project ultimately cost $35 billion and took fifteen years to deliver 2.2 gigawatts of capacity. The overruns ran two and a half times the budget and nearly double the planned duration.
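The Vogtle numbers also imply a unit cost that puts the 5-gigawatt plans in perspective. A rough sketch using only the figures in the text; extrapolating that unit cost to a 5-gigawatt site is my own back-of-the-envelope step, not a quoted estimate:

```python
VOGTLE_COST_B = 35.0     # realized cost, billions of dollars
VOGTLE_BUDGET_B = 14.0   # original budget, billions of dollars
VOGTLE_CAPACITY_GW = 2.2

cost_overrun = VOGTLE_COST_B / VOGTLE_BUDGET_B       # 2.5x the budget
cost_per_gw_b = VOGTLE_COST_B / VOGTLE_CAPACITY_GW   # ~$15.9B per gigawatt
five_gw_at_vogtle_b = 5 * cost_per_gw_b              # ~$80B for a 5 GW site

print(f"{cost_overrun}x overrun, ${cost_per_gw_b:.1f}B/GW, "
      f"${five_gw_at_vogtle_b:.0f}B for five gigawatts")
```

At Vogtle’s realized cost, powering a single 5-gigawatt campus with new nuclear would run on the order of $80 billion before a single GPU is installed, which is why the companies are looking elsewhere.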

The American electrical grid currently has approximately 1,250 gigawatts of installed capacity, of which 96 gigawatts is nuclear. The five gigawatts that a single data center demands represents 5.2 percent of the country’s total nuclear capacity and nearly the entire annual capacity addition of 2024. Grid connection wait times in some regions extend to 2030, which explains why companies like xAI are turning to alternative solutions.
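The share calculations in the paragraph above are straightforward to verify from the stated capacities:

```python
US_GRID_GW = 1_250     # total installed US generating capacity
US_NUCLEAR_GW = 96     # installed US nuclear capacity
DATACENTER_GW = 5      # demand of a single 5-gigawatt data center

share_of_nuclear = DATACENTER_GW / US_NUCLEAR_GW  # ~5.2 percent
share_of_grid = DATACENTER_GW / US_GRID_GW        # ~0.4 percent of the total

print(f"{share_of_nuclear:.1%} of nuclear capacity, "
      f"{share_of_grid:.2%} of total installed capacity")
```

Half a percent of the entire grid sounds small until you remember it is one facility, sited in one place, owned by one company, and that the plans call for several of them.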

xAI bypasses the grid problem through on-site power generation. The company installed dozens of methane gas turbines and purchased a former Duke Energy plant in Mississippi. Mississippi regulators granted temporary approval to run turbines for twelve months without permits, a decision drawing criticism from environmental organizations pointing to already high air pollution levels in the area.

The Jevons paradox: efficiency increases total consumption

When China’s DeepSeek presented a model in January 2025 that was allegedly trained with just 2,000 GPUs for $5.6 million, the promise of efficiency finally seemed to become reality. Nvidia lost $600 billion in market value in a single day. Microsoft CEO Satya Nadella referenced the Jevons paradox: when technology becomes more efficient, total consumption can actually rise because more people start using it.

Research from MIT Technology Review supported this perspective with findings that DeepSeek’s “chain of thought” reasoning consumed 87 percent more energy during inference than comparable Meta models. Sasha Luccioni of Hugging Face, named by TIME in September 2024 as one of the hundred most influential people in AI, warned that broad adoption of this paradigm could negate all efficiency gains.

Her earlier research demonstrated that generative AI uses thirty times more energy than a traditional search query, and that image generation is sixty times more energy intensive than text generation. Generating a thousand images produces as much CO2 as driving 6.6 kilometers in a gasoline car.

The strong do what they can and the weak suffer what they must.

– Thucydides (400 BC), History of the Peloponnesian War

Who pays the bill for AI energy demands

The impact of Big Tech’s energy claims extends beyond abstract statistics. In Holly Ridge, Louisiana, a community of two thousand residents near Meta’s Hyperion complex, car accidents rose from nine in all of 2024 to 64 in the first nine months of 2025, a roughly six-hundred-percent increase driven by construction traffic. An elementary school closed the playground at its front entrance after three collisions with construction vehicles. Residents report rust-colored tap water and periodic power outages; the impact arrives long before the AI products do.

In the broader energy market, data center expansion is contributing to price increases. In the PJM electricity market, which stretches from Illinois to North Carolina, data centers were responsible for an estimated $9.3 billion price increase in the 2025 to 2026 capacity market. Small businesses and households often bear these costs unless protective measures are implemented.

The Louisiana Energy Users Group, whose members include Exxon Mobil, Chevron, and Shell, warned that Meta’s project increases utility Entergy’s energy demand by thirty percent, creating “unprecedented financial risks” for existing customers. To complicate matters further, Louisiana’s legislature amended the definition of green energy to include natural gas, allowing Zuckerberg’s gas turbines to be presented as a sustainable solution.

The geopolitics of AI compute infrastructure

The data center race is not merely a technological or economic matter; it forms the battleground of a new form of international competition. Senator Ted Cruz summed it up succinctly at a Stargate event in September 2025: “Message number one: America will beat China in the race for AI.” The concentration of AI infrastructure in the United States explicitly serves to protect intellectual property and reduce foreign dependencies, a strategy executed in concrete and kilowatts.

If you want to zoom out from grid capacity into the bigger question of abundance versus control when energy and compute converge, read: AI + Quantum + Fusion: Dawn or Doom?

OpenAI is simultaneously expanding internationally with Stargate projects in Norway, the United Arab Emirates, the United Kingdom, and Argentina. The Stargate Argentina project in Patagonia represents a $25 billion investment for 500 megawatts of capacity, the largest data center in Latin America.

The effect of this concentration of computing power on global power dynamics has barely registered in public consciousness. Whoever owns the infrastructure to train the most powerful AI models will determine the direction of technological development for decades to come.

The 5-gigawatt future is infrastructure reality

The five-gigawatt future is not a dystopian fantasy but an infrastructural reality under construction. The IEA report notes that data centers will account for approximately ten percent of global electricity growth through 2030, still less than electric vehicles or air conditioning, but concentrated in specific regions where grid pressure is acute.

The question for policymakers, investors, and citizens is not whether these facilities will be built, but under what conditions. The speed at which xAI realized its Colossus complex, nineteen days versus the typical four years, demonstrates that technical and regulatory obstacles can be overcome when sufficient capital and political will converge.

What remains is weighing who reaps the benefits and who bears the burden. The history of industrial revolutions teaches that transformations of this magnitude rarely produce evenly distributed winners and losers.

Sam Altman put it this way at the Stargate event in Abilene on September 23, 2025: “This is what it takes to make ChatGPT work. You send a question from your phone and get an answer. You don’t think about it, but a lot of people do, and that requires all this incredible work behind us.”

The megawatt hours are piling up. The question remains who’s counting them and which trends will force that accounting into the open.

Why AGI Won’t Happen by 2037: The Hard Limits of Data & Energy
Connects the 5GW buildout to the deeper constraint: scaling AI hits physical energy ceilings before it hits pure software limits.

AI Tokens: The Secret Economics Behind the AI Boom
Explains how tokens translate into compute demand, making the jump from “more usage” to “more power plants” concrete.

Why Your AI Budget Will Cost 89% More Than Your CFO Realizes
Shows how infrastructure, inference, and electricity costs surface inside real budgets, clarifying who ultimately pays for scale.

References

[1] International Energy Agency. Energy and AI - Executive Summary. April 2025. Available at: https://www.iea.org/reports/energy-and-ai/executive-summary

[2] Kamiya G, Coroamă VC. Data Centre Energy Use: Critical Review of Models and Results. IEA 4E EDNA Platform. 2025. Available at: https://www.iea-4e.org/wp-content/uploads/2025/05/Data-Centre-Energy-Use-Critical-Review-of-Models-and-Results.pdf

[3] Data Center Frontier. Ownership and Power Challenges in Meta’s Hyperion and Prometheus Data Centers. August 2025. Available at: https://www.datacenterfrontier.com/hyperscale/article/55310441/ownership-and-power-challenges-in-metas-hyperion-and-prometheus-data-centers

[4] OpenAI. OpenAI, Oracle, and SoftBank expand Stargate with five new AI data center sites. September 2025. Available at: https://openai.com/index/five-new-stargate-sites/

[5] Introl Blog. xAI Colossus Hits 2 GW: 555,000 GPUs, $18B, Largest AI Site. January 2026. Available at: https://introl.com/blog/xai-colossus-2-gigawatt-expansion-555k-gpus-january-2026