
By Dan Gearino | Inside Climate News
This article originally appeared on Inside Climate News, a nonprofit, non-partisan news organization that covers climate, energy and the environment.
The U.S. electricity grid can handle much of the projected increase in demand from artificial intelligence data centers by becoming more flexible, which can reduce the need to build new power plants, a new analysis concludes.
Flexibility, in this case, means that grid operators would work with their largest customers so those customers can reduce their electricity use at times of highest demand.
“The United States right now is in a very consequential economic race around artificial intelligence, and this paper suggests that these new mega loads can be added relatively quickly, if they’re able to embrace some degree of flexibility,” said Tyler Norris, a fellow at the Nicholas School of the Environment at Duke University.
He is lead author of the report, Rethinking Load Growth: Assessing the Potential for Integration of Large Flexible Loads in U.S. Power Systems, issued today by Duke’s Nicholas Institute for Energy, Environment & Sustainability.
The report finds that the country’s regional grids have substantial headroom in generating capacity available to serve large new users. The key is for grid operators and large customers to work together to make short-term reductions in demand at times when the electricity supply is tightest.
The potential benefits are huge. The report finds that the country could add 76 gigawatts of new electricity demand, the equivalent of about 10 percent of the country's peak demand, with existing grid resources if these new users were able to ramp down for 0.25 percent of the time that they're ordinarily active.
During the times that the large customers are ramping down, they could switch to short-term power sources they have onsite, transfer computing work to other locations for a few hours or pause work.
Users wouldn’t have to ramp down for very long; the average duration would be less than two hours. That flexibility would lower costs across the electricity system because far fewer new power plants would need to be built.
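As a rough back-of-envelope check, here is a short sketch using only the figures quoted above; the Duke report may define the 0.25 percent threshold differently (for example, weighting by energy rather than hours), so these numbers are illustrative, not the report's own methodology.

```python
# Back-of-envelope check of the figures quoted in this article (illustrative only).

HOURS_PER_YEAR = 8760            # 365 days x 24 hours
curtailment_share = 0.0025       # "0.25 percent of the time"
new_flexible_load_gw = 76        # added demand cited in the report
share_of_us_peak = 0.10          # "about 10 percent of the country's peak demand"

hours_curtailed = HOURS_PER_YEAR * curtailment_share
implied_us_peak_gw = new_flexible_load_gw / share_of_us_peak

print(f"Implied curtailment: about {hours_curtailed:.0f} hours per year")
print(f"Implied U.S. peak demand: roughly {implied_us_peak_gw:.0f} GW")
# -> about 22 hours per year, and a national peak on the order of 760 GW
```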
Leading data center companies are already exploring ways that they can alter their electricity demand to respond to the availability of power and other market conditions, Norris said. The paper urges companies and grid operators to be proactive in making arrangements during the planning of new facilities.
The backdrop for this research is that power grid planners are anxious about how they’ll meet the needs of new data centers and anticipated growth in demand from manufacturing, air conditioning and electric vehicles, among others.
At the same time, power companies are almost gleeful at the idea of rising demand for their product, which is inspiring talk of a building boom for new power plants and delays in retiring old ones. Consumer and environmental advocates have concerns that a wave of new power plant construction will lead to big increases in utility costs for vulnerable customers and in greenhouse gas emissions.
What hasn’t gotten enough discussion is how to manage growth in electricity demand in a way that emphasizes efficiency and flexibility, said Costa Samaras, director of the Scott Institute for Energy Innovation at Carnegie Mellon University.
“This work from Duke starts to quantify the real benefit of a smarter, more efficient, more responsive electricity system from the demand side,” he said, speaking about an advance copy of the report. “We need to make sure that the electricity grid can handle all of these new loads, and we need to make sure that emissions go down instead of going up, and we need to make sure that people can afford electricity, and flexibility is a no-brainer way to move toward those goals.”
It’s not a new idea for companies to agree to reduce their energy use when called upon. This is called “demand response,” and it can take many forms. What’s new, or at least recent, is applying the principles of demand response to AI data centers.
Data center companies and grid operators both have strong incentives to figure out how to reduce electricity demand at key times, because the result would be a grid that costs less to operate.
The challenges are mostly ones of implementation, which hinges on the interplay between regulators, grid operators and large electricity users.
The Duke report says that an agreement on how this flexibility would work can be written into the basic contracts that companies sign as part of getting access to the grid. The result could be standardized methods for managing electricity demand.
“A lot of this is contractual innovation,” Norris said, acknowledging that it may sound boring when put that way, but it “could be so consequential.”