Topic: Economic Dynamism
Published: Apr 23, 2025
Contributors: Rachel Lomasky, Lynne Kiesling
Large Language Model Training Cluster. (Shutterstock)

Why the AI Revolution Will Require Massive Energy Resources

Summary
Training large-scale models requires enormous computation, fueled by multimodal datasets comprising text, audio, video, images, and sensor data that often exceed a petabyte.

The rapid rise of generative AI has triggered a sharp escalation in data center electricity consumption, with profound implications for national energy use, system planning, and climate goals. Data centers have long been critical infrastructure for digital services, but their energy demand is now accelerating due to the emergence of compute-intensive AI workloads.

Data center electricity use began climbing after plateauing around 60 terawatt-hours (TWh) annually from 2014 to 2016—roughly 1.5 percent of total U.S. electricity consumption. By 2018, it had reached 76 TWh (1.9 percent of national demand), driven by growing server installations and an architectural shift toward AI-optimized hardware, particularly Graphics Processing Units (GPUs). This upward trend has since intensified. By 2023, U.S. data center electricity consumption had surged to an estimated 176 TWh, representing 4.4 percent of total U.S. demand, roughly equivalent to the annual electricity use of the entire state of New York.

This growth shows no signs of slowing. The U.S. Department of Energy projects that by 2028, annual electricity demand from data centers could reach between 325 TWh and 580 TWh, or 6.7 percent to 12 percent of projected national consumption. Forecasts from firms such as Boston Consulting Group and S&P Global similarly place 2030 data center electricity use between 5 percent and 10 percent of total U.S. demand. The range of estimates reflects uncertainty in how quickly AI technologies will be adopted and how widely compute-intensive applications will scale.

At the heart of this demand surge is generative AI. Training large-scale models requires enormous computation, fueled by multimodal datasets comprising text, audio, video, images, and sensor data that often exceed a petabyte. While data acquisition and storage carry energy costs, the training process itself is far more energy-intensive, with requirements that scale with the model's size, the complexity of its architecture, and the degree of refinement. Training is a one-time event per model but demands vast amounts of power, time, and hardware resources.
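To make the scale of training concrete, a common rule of thumb estimates training compute at roughly 6 floating-point operations per model parameter per training token. The sketch below turns that into an energy figure; the model size, token count, GPU throughput, power draw, and utilization are all illustrative assumptions, not figures from this article.

```python
# Back-of-the-envelope training energy estimate.
# Rule of thumb: training compute ~ 6 FLOPs per parameter per token.
# All default values below are illustrative assumptions.

def training_energy_mwh(params, tokens, gpu_flops=1e15,
                        gpu_power_w=700, utilization=0.4):
    """Estimate training energy in megawatt-hours.

    params       -- model parameter count
    tokens       -- number of training tokens
    gpu_flops    -- assumed peak throughput per GPU (FLOP/s)
    gpu_power_w  -- assumed power draw per GPU (watts)
    utilization  -- assumed fraction of peak throughput achieved
    """
    total_flops = 6 * params * tokens
    gpu_seconds = total_flops / (gpu_flops * utilization)
    joules = gpu_seconds * gpu_power_w
    return joules / 3.6e9  # joules -> MWh

# Hypothetical example: a 70B-parameter model trained on 2T tokens.
print(round(training_energy_mwh(70e9, 2e12)), "MWh")
```

Under these assumed numbers the run lands in the hundreds of megawatt-hours; real training runs vary widely with hardware, parallelism overhead, and cooling.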

After training, models are used for inference, generating outputs in response to user queries. Each inference consumes far less energy than training, but because these systems are queried millions of times daily, their cumulative energy use becomes substantial. More complex outputs, such as videos or high-resolution images, increase the burden.
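The cumulative effect of inference is simple arithmetic: a small per-query cost multiplied by an enormous query volume. The per-query energy and daily volume below are assumptions for illustration, not measured values.

```python
# Cumulative inference energy (illustrative assumptions only).
ENERGY_PER_QUERY_WH = 0.3       # assumed watt-hours per text query
QUERIES_PER_DAY = 100_000_000   # assumed daily query volume

daily_mwh = ENERGY_PER_QUERY_WH * QUERIES_PER_DAY / 1e6  # Wh -> MWh
annual_gwh = daily_mwh * 365 / 1000                      # MWh -> GWh

print(f"{daily_mwh:.1f} MWh/day, {annual_gwh:.2f} GWh/year")
```

Even at a fraction of a watt-hour per query, this hypothetical service draws gigawatt-hours annually; heavier outputs such as video generation multiply the per-query figure.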

Generative AI workloads depend heavily on specialized chip architectures: GPUs and Tensor Processing Units (TPUs). These chips are optimized for the matrix operations at the core of AI computation. While they are more efficient than general-purpose CPUs for such tasks, they also draw significantly more power and generate more heat. As a result, they require constant and often intensive cooling, which in turn demands additional electricity and, in many cases, fresh water. Marginal improvements in chip design, such as more compact transistor layouts and power-aware software, have improved performance per watt. Similarly, advances in cooling, ranging from more efficient fans and heatsinks to liquid cooling and immersion systems, help reduce waste heat. However, these innovations have not yet offset the exponential growth in demand.

One promising way to mitigate energy use is to reduce the computational intensity of the algorithms themselves. Smaller, specialized models can be trained with less data, lower numerical precision, and fewer iterations, making them faster and less costly. Techniques like transfer learning, where a pre-trained model is adapted for a new task, and federated learning, where training is distributed across edge devices rather than centralized, can also conserve energy and reduce data transfer loads.
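The savings from using a smaller, adapted model instead of pretraining a large one from scratch can be sketched with the same 6-FLOPs-per-parameter-per-token rule of thumb used above. The model sizes and token counts here are hypothetical.

```python
# Illustrative comparison: pretraining from scratch vs. adapting a
# smaller model (hypothetical sizes; 6 FLOPs/param/token rule of thumb).

def training_flops(params, tokens):
    return 6 * params * tokens

# Pretraining a large model on a web-scale corpus:
pretrain = training_flops(70e9, 2e12)   # 70B params, 2T tokens

# Fine-tuning a small model on a task-specific dataset:
finetune = training_flops(7e9, 1e9)     # 7B params, 1B tokens

print(f"compute ratio: {pretrain / finetune:,.0f}x")
```

Under these assumptions the adapted small model uses four orders of magnitude less training compute, which is why transfer learning and compact specialized models are attractive levers for energy reduction.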

Still, overall energy demand continues to rise—a textbook example of the Jevons Paradox, where efficiency gains lower costs but stimulate greater total consumption. Yet generative AI may also produce net energy savings in other sectors. For example, dynamic routing algorithms can optimize delivery truck routes based on real-time traffic and weather data, reducing fuel use. Similar gains are possible in building HVAC control, precision agriculture, and industrial automation. Thus, while AI’s direct energy footprint is growing rapidly, its broader potential to improve energy efficiency while increasing economic productivity may partially offset these impacts.

Lynne Kiesling is Director of the Institute for Regulatory Law & Economics at Northwestern Pritzker's Center on Law, Business, and Economics, and a Research Professor at the University of Colorado Denver.

Rachel Lomasky is Chief Data Scientist at Flux, a company that helps organizations practice responsible AI.
