AI Data Centers


The physical infrastructure of the AI revolution — vast server farms consuming the electricity of mid-sized cities and the water of small towns — is being built faster than the grid can support, largely without public deliberation.


  • Data centers already consume approximately 1–1.5% of global electricity, and the IEA projects that consumption could roughly double between 2022 and 2026 as AI workloads surge.
  • A single large AI training run — such as those required for frontier models like GPT-4 — can consume tens of gigawatt-hours of electricity, equivalent to the annual consumption of thousands of American households (see the back-of-envelope sketch after this list).
  • Major tech companies have quietly delayed or drifted from their net-zero commitments as AI energy demand outpaces the buildout of renewable capacity; Microsoft's emissions rose roughly 30% between 2020 and 2024.
  • Data centers are increasingly sited in communities that have little political power to negotiate the terms of their arrival, trading tax breaks for jobs that rarely materialize at the promised scale.
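
To make the household comparison concrete, here is a minimal back-of-envelope sketch. Both inputs are assumptions: the ~50 GWh training figure is a widely cited independent estimate (OpenAI has not disclosed it), and ~10.5 MWh is an approximate annual electricity use for an average US household.

```python
# Back-of-envelope: one frontier training run expressed in
# household-years of electricity. Both inputs are rough public
# estimates, not disclosed figures.

TRAINING_RUN_GWH = 50.0        # widely cited estimate for a GPT-4-class run
HOUSEHOLD_MWH_PER_YEAR = 10.5  # approximate average US household

training_mwh = TRAINING_RUN_GWH * 1_000              # 1 GWh = 1,000 MWh
households = training_mwh / HOUSEHOLD_MWH_PER_YEAR

print(f"~{households:,.0f} households' annual electricity use")
# prints: ~4,762 households' annual electricity use
```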

A data center is a facility housing the computing infrastructure — servers, networking equipment, and storage — required to run digital services. AI has transformed the scale and energy intensity of data center construction. The 'hyperscale' data centers being built by Microsoft, Google, Amazon, and Meta to train and serve AI models are among the most energy-intensive buildings ever constructed. A single facility can draw 100–500 megawatts of electricity — enough to power a small city — and require hundreds of millions of gallons of water per year for cooling. The buildout is accelerating rapidly: Microsoft announced $80 billion in data center investment in 2025 alone, with roughly half concentrated in the United States.
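
To put a 100–500 megawatt draw in perspective, a rough annualization helps. The sketch below assumes a constant draw, which somewhat overstates real facilities, since utilization varies with load and cooling conditions.

```python
# Annualized energy for a hyperscale campus, assuming a constant
# draw. Real utilization varies, so treat these as upper bounds.

HOURS_PER_YEAR = 8_760

def annual_twh(draw_mw: float) -> float:
    """Annual consumption in terawatt-hours for a constant draw in MW."""
    return draw_mw * HOURS_PER_YEAR / 1_000_000

for draw_mw in (100, 300, 500):
    print(f"{draw_mw} MW continuous -> {annual_twh(draw_mw):.2f} TWh/year")
# 100 MW -> 0.88, 300 MW -> 2.63, 500 MW -> 4.38 TWh/year
```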

The computational demands of AI arise at two stages. The first is training: building a frontier model requires running calculations across tens of thousands of specialized chips continuously for weeks or months. Training GPT-4 is estimated to have consumed around 50 gigawatt-hours of electricity — though OpenAI has not disclosed this figure and independent estimates vary widely. The second stage is inference: every time someone uses an AI chatbot, an image generator, or an AI-assisted coding tool, energy is consumed. As AI becomes embedded in search engines, office software, and consumer applications used billions of times daily, inference costs are rapidly eclipsing training costs as the dominant energy load on data center infrastructure.
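
A crude crossover calculation illustrates why inference comes to dominate. The per-query and query-volume figures below are illustrative assumptions, not measurements; published per-query estimates vary by an order of magnitude.

```python
# How quickly cumulative inference energy overtakes a one-time
# training run, under assumed (not measured) per-query figures.

TRAINING_RUN_WH = 50e9    # ~50 GWh training estimate, in watt-hours
WH_PER_QUERY = 0.3        # assumed energy per chatbot query
QUERIES_PER_DAY = 1e9     # assumed global daily query volume

daily_inference_wh = WH_PER_QUERY * QUERIES_PER_DAY
days_to_match = TRAINING_RUN_WH / daily_inference_wh

print(f"inference matches the training run after ~{days_to_match:.0f} days")
# ~167 days at 0.3 Wh/query; ~17 days at 3 Wh/query
```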

The energy sources powering AI data centers are a significant and underreported issue. Tech companies have made sweeping public commitments to run on 100% renewable energy, but most use Renewable Energy Credits (RECs) — financial instruments that allow a company to claim renewable status by purchasing credits equivalent to the renewable energy it consumes, without actually consuming renewable electrons at the time power is drawn. When AI data centers are built in regions where the grid runs primarily on fossil fuels — as many in the American South and Midwest do — their actual carbon intensity is substantially higher than their public claims suggest.
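
The gap between claimed and physical emissions can be made precise using the GHG Protocol's two Scope 2 accounting methods: location-based (grid-average carbon intensity where the power is drawn) and market-based (consumption net of purchased RECs). All quantities in this sketch are illustrative assumptions, not figures from any company's reporting.

```python
# Location-based vs market-based Scope 2 accounting (GHG Protocol
# terminology). All quantities below are illustrative assumptions.

consumption_mwh = 2_000_000   # assumed annual draw of one large campus
grid_kg_per_mwh = 600         # assumed fossil-heavy regional grid intensity
rec_covered_mwh = 2_000_000   # RECs purchased to match 100% of consumption

# Location-based: emissions of the grid actually supplying the power.
location_based_t = consumption_mwh * grid_kg_per_mwh / 1_000

# Market-based: consumption net of RECs, which can zero out on paper.
market_based_t = max(consumption_mwh - rec_covered_mwh, 0) * grid_kg_per_mwh / 1_000

print(f"location-based: {location_based_t:,.0f} tCO2e")  # 1,200,000 tCO2e
print(f"market-based:   {market_based_t:,.0f} tCO2e")    # 0 tCO2e on paper
```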

The siting of data centers is a political economy story as much as a technical one. Municipalities and states compete aggressively to attract data center investment, offering property tax abatements, subsidized electricity rates, and streamlined permitting. The promised benefits — primarily tax revenue and local jobs — frequently fail to materialize at scale. A hyperscale data center may employ as few as 30–50 permanent workers while drawing enormous amounts of electricity from the local grid, straining infrastructure and raising costs for other ratepayers. The communities bearing the environmental costs — water consumption, heat island effects, grid instability — are often rural or lower-income communities with limited political leverage to negotiate on their own behalf.

The electricity demand from AI data centers is becoming a material problem for grid operators and U.S. climate goals. The IEA's 2024 Electricity report projected that data center electricity consumption in the United States could exceed 6% of total national demand by 2026 — up from roughly 4% in 2022. In some regional grids, new data center connections are already creating queue delays for other industrial customers and driving up wholesale electricity prices. PJM Interconnection, the grid operator covering all or parts of 13 eastern states and the District of Columbia and serving more than 65 million people, has warned that retiring fossil fuel plants combined with surging data center demand is creating reliability risks that may require burning more coal in the near term.
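
In absolute terms, that share shift is large. The sketch below assumes total US demand of roughly 4,000 TWh per year (a rounded figure; actual demand is somewhat higher) and simply converts the percentages; the IEA's own TWh estimates differ somewhat, but the order of magnitude holds.

```python
# What a 4% -> 6% share shift implies in absolute terms, assuming
# total US demand of ~4,000 TWh/year (rounded; actual is higher).

US_DEMAND_TWH = 4_000

dc_2022_twh = 0.04 * US_DEMAND_TWH   # ~160 TWh
dc_2026_twh = 0.06 * US_DEMAND_TWH   # ~240 TWh

print(f"implied growth: ~{dc_2026_twh - dc_2022_twh:.0f} TWh over four years")
# ~80 TWh -- on the order of a mid-sized US state's annual consumption
```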

The collision between AI energy demand and decarbonization commitments is increasingly visible and measurable. Microsoft's 2024 sustainability report showed its carbon emissions had increased 30% since 2020, attributing the rise directly to data center construction for AI. Google's 2024 environmental report showed a 48% increase in greenhouse gas emissions since 2019. Both companies have 2030 net-zero targets they are now moving further from, not closer to. The AI-driven emissions increase is real and measurable — it is not offset by renewable energy credits in any physical sense, and treating it as such is a form of accounting fiction.

Nuclear energy has re-entered the conversation as a potential solution to data center electricity demand. Microsoft signed a deal in 2024 to restart the Three Mile Island nuclear plant in Pennsylvania specifically to power its data centers. Google announced agreements for power from advanced nuclear reactors. This has accelerated investment in nuclear startups, particularly those working on small modular reactors (SMRs). Whether SMRs will deliver power at the scale and timeline AI companies require remains deeply uncertain — most estimates place commercial SMR deployment in the 2030s, not the 2020s — making nuclear a speculative answer to an immediate problem.

The data center buildout represents a massive private infrastructure investment being made without meaningful public deliberation about its costs, location, or long-term implications. Decisions about where to site gigawatts of new computing infrastructure — with generational consequences for local water supplies, grid stability, and carbon emissions — are being made by corporate real estate teams optimizing for electricity prices and tax incentives, not by democratic processes weighing community interests. The relevant governance frameworks either don't exist or lack enforcement power. This is not an argument against AI infrastructure; it is an argument that infrastructure with such broad public consequences requires public accountability structures that are currently absent.


Sources & Further Reading

  1. Electricity 2024: Analysis and Forecast to 2026, International Energy Agency (2024)
  2. Data Centres and Data Transmission Networks, International Energy Agency (2023)
  3. Microsoft 2024 Environmental Sustainability Report, Microsoft (2024)
  4. Energy and Policy Considerations for Deep Learning in NLP, Strubell et al., arXiv (2019)
  5. What Is a Small Modular Reactor? U.S. Department of Energy (2023)