AI’s power play: Meeting the growing energy demands of data centers

A keynote panel discusses growing electricity demand at POWERGEN International on February 11, 2025. From left to right: Lon Huber, Duke Energy; Mary Faulk, Salt River Project; Marc Spieler, NVIDIA; Michelle De Blasi, Esq., environmental and energy attorney; Blair Loftis, Terracon Inc.

For decades, power providers in the United States could count on electricity demand remaining more or less constant. When a new large load was connected to the grid, it generally fell within the margin of error of a given system. Not anymore.

“We had 1.2 gigawatts (GW) on just one site,” shared Duke Energy’s senior vice president of pricing and customer solutions, Lon Huber. “That’s outside the bounds of that model.”

The proliferation of data centers and artificial intelligence (AI), increased onshoring of manufacturing, and global electrification trends pose dilemmas for utilities, forcing them to pivot from their traditional modus operandi. The pillar of reliability they once leaned on is listing; playing it safe is no longer an option.

The International Energy Agency predicts five percent load growth by 2030; other projections paint an even more dire picture, calling for 80 GW of new generation (or more) to satisfy demand. Meanwhile, tech giants like Amazon, Google, and Meta are pursuing ambitious decarbonization initiatives, further complicating matters, especially since construction timelines for data centers, generation sources, and transmission infrastructure don’t exactly jibe.

So how do we get there from here?

It’s conceivable that AI is both part of the problem and part of the solution.

the magic of artificial intelligence

NVIDIA’s Marc Spieler, who led off POWERGEN International’s keynote program Tuesday, simplified artificial intelligence for the event’s audience in a panel discussion.

“AI turns data and electrons into knowledge,” he stated. Of course, that doesn’t happen magically. “The amount of compute (computational power) required to do this is significant.”

Although AI isn’t new, it hasn’t been commercially viable until recently, he argued, pointing to OpenAI’s ChatGPT as the advancement that kicked down the door for establishing markets around AI. Spieler sees the technology as a means for utilities to solve problems without hiring new people, creating virtual “experts” who don’t retire or demand paid time off.

“This is the next industrial revolution: the manufacturing of knowledge,” Spieler predicted.

He thinks AI can be leveraged to help keep electricity prices low while improving operational efficiency across sectors, increasing production, lowering costs, decreasing risk, expediting outcomes (making better decisions, faster), reducing carbon footprint, and improving the customer and employee experience.

forecasting and planning

Inside the PJM control room. Courtesy: PJM

One specific way AI can aid power providers is by assisting with load forecasting. Instead of assuming a new load would fall within the noise of its models, utilities like Duke Energy now keep close track of anything over 20 megawatts (MW) coming online.

“We need to catalog a pipeline of large loads and know load ramps by month,” Duke’s Huber told POWERGEN International’s attendees. “You’ve got to protect the general body of customers, and I think we have.”

To do that, Huber says it’s all about sound forecasting and supporting it with contracts that allow the utility to stagger how it integrates load.

The Salt River Project (SRP), the largest utility in Arizona, is also reevaluating its forecasting in pursuit of good data: to know what to build and when. Maricopa County, home to the Phoenix metro area, has grown into the second-largest data center hub in the country, behind only “Data Center Alley” in northern Virginia.

“If you look at the pipeline SRP has, we have about 40 large data customers that are under some portion of contract,” said SRP’s director of integrated system planning and support Mary Faulk. If all of them are built out, that will equate to more than 10,000 MW of new load coming online in SRP’s system by 2030.

Water scarcity and extreme temperatures might seem to make Arizona a tough place to build data centers, but the pros appear to outweigh the cons. Arizona isn’t prone to natural disasters, and SRP operates a relatively new system with a diverse energy mix while maintaining top-five reliability nationally. Even the heat isn’t as much of a problem as one might expect. The utility partnered with the University of Arizona to determine how much more air-cooled data centers would cost compared to water-cooled ones and found only a roughly 10% increase in energy use for air-cooled facilities, a more than reasonable tradeoff considering water consumption is cut by roughly two-thirds.
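That tradeoff is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below is purely illustrative: the function name and baseline numbers are hypothetical, not SRP or University of Arizona data; only the two ratios (about 10% more energy, about two-thirds less water for air cooling) come from the figures cited above.

```python
def cooling_tradeoff(baseline_energy_mwh: float, baseline_water_gal: float):
    """Estimate annual energy and water use for a water-cooled design
    (the baseline) versus an air-cooled design, using the rough ratios
    cited in the article: ~10% energy penalty, ~2/3 water savings."""
    water_cooled = {
        "energy_mwh": baseline_energy_mwh,
        "water_gal": baseline_water_gal,
    }
    air_cooled = {
        "energy_mwh": baseline_energy_mwh * 1.10,  # ~10% more energy
        "water_gal": baseline_water_gal / 3.0,     # ~two-thirds less water
    }
    return water_cooled, air_cooled

# Illustrative baseline: 100,000 MWh and 90 million gallons per year.
wc, ac = cooling_tradeoff(100_000, 90_000_000)
# Air cooling lands near 110,000 MWh but only ~30 million gallons.
```

For a utility in a desert, trading a single-digit energy increase for a two-thirds reduction in water draw is the kind of arithmetic that makes the siting decision.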

Build it (a place ripe for data center development) and they (big tech companies) will come. But how much power will they actually require? SRP needs to figure it out.

“We know historically we’ve seen large customers underperform,” Faulk cautioned, but SRP cannot plan around that. Instead, the utility must ensure more than enough generation capacity, no small ask. To do that, SRP is exploring paired solar and storage in addition to long-duration energy storage (LDES) options like pumped hydro.

“That’s another long-term asset we’re investing in,” Faulk confirmed. “It helps us firm up that generation.”

complicating climate pledges

Inside a Microsoft data center. Courtesy: Microsoft

Decarbonization initiatives, both at the corporate and state levels, are further complicating matters. Amazon has pledged to be net-zero by 2040; Google wants clean energy matching for every hour of the day by 2030; Microsoft wants to be carbon-negative by 2030 and erase its carbon footprint entirely by 2050.

“This isn’t a new challenge; utilities have been working toward decarbonization for several years,” SRP’s Faulk shared. The utility, aligned with Arizona’s state goal, wants to claim zero net emissions by 2050, but it will (pretty quickly!) have to double the amount of electric infrastructure in its system just to meet short-term load growth projections.

“It really compounds the challenge,” she admitted. “The question is how much load growth? How much decarbonization?”

Like many utilities, SRP sowed the seeds of carbon neutrality several years ago when it started approving swaths of solar and battery energy storage projects. Those are starting to come online, yet the utility’s last integrated resource plan (IRP) showed a need for more firm generation. For Faulk and her team, that means adding natural gas while working toward retiring more than a dozen coal-fired peaker plants.

Sierra Estrella Energy Storage. Source: Salt River Project.

“It is going to be a challenge,” she confessed to the crowd at POWERGEN International, noting reliability and sustainability don’t always go hand-in-hand.

Akin to SRP, Duke Energy is taking an “all of the above” approach to meet demand. Queues in most regional transmission networks are jammed up with mostly renewable energy projects, limiting how quickly utilities can bring them online. While a data center can be constructed in two years or less, it often takes five years or longer for generation assets to progress from approvals to pumping out electrons. And if transmission infrastructure is needed? Good luck: you could be waiting a decade to fight the (metaphorical) fire that’s already at your door.

“Nuclear is the best alignment to those goals,” Duke’s Huber pointed out. “But how long does it take to build a nuclear plant?”

The answer is eight years in the absolute best-case scenario, he contends, but even if somebody decides to go that route to meet demand, they’ll be tying up $20 billion or more for close to a decade. Only a handful of companies in the world can do that- some already are.

“We have customers whose market caps are larger than all utilities combined,” Huber observed. Those companies can therefore take on more risk than utilities can. If a utility committed to building a nuclear plant and the slightest thing went awry, the fallout could be enough to kill off all future development, or even bankrupt the company. Many utilities are publicly traded, making them accountable to communities and investors in ways other companies are not.

“Is there something that works for all parties?” Huber asked rhetorically. “I have faith that all the pieces can come together, especially with some indications from the new administration regarding ‘energy dominance.’”

“There has to be this multi-party partnership,” he suggested. “Our new rate structures will help enable that.”

Duke runs 11 nuclear units at six sites, totaling about 12 GW of capacity. That makes parts of its service territory stand out like a big, shiny nugget of opportunity for prospectors in our modern energy gold rush.

“If a data center comes to me in North Carolina, just by plugging into the grid they’re at 50% renewable already because of the nuclear,” Huber explained, which helps meet the aforementioned climate goals of various parties that also need to keep the lights on.

“It brings stability in prices as well,” he added. “Swings in natural gas (prices) can happen, and when they happen, it’s extremely painful.”

But perhaps a little pain is worth it in the long run.

the great compromise

Credit: David J / Creative Commons

The bridge to meeting renewable energy goals while maintaining reliability appears to be natural gas: not exactly carbon-neutral, but not as bad for the environment as other fossil fuels and comparatively easier and faster to get running.

“As supply is constrained, everyone is moving toward gas,” observed Michelle De Blasi, an environmental, natural resources, and energy attorney who runs her own practice. “There are still companies integrating renewables, but the difficulty is intermittency, even with battery storage.”

Of course, adding gas to a state’s energy mix is easier in some places (like Arizona) than it is in others (California, for example). That said, colocating generation with data centers will likely prove to be the fastest way to get them online, and some industry leaders believe those power-hungry facilities could actually position themselves as resources for the communities they’re in.

can data centers be flexible resources?

If a data center is colocated with generation, it could theoretically be relied on as a resource to its utility in times of dire need. But data centers that send power back to the grid would fall under FERC regulation, and that’s not something tech companies are going to line up for.

“We’re working on it,” said Michelle De Blasi. “It’s very, very difficult.”

“This is an area where innovation has to occur,” recommended Duke’s Huber, who wondered aloud if utilities could speed up deployment for data center clients if they promise to provide 50-100 hours of power to support grid reliability. “There’s got to be a little flexibility there,” he implored.

EPRI, NVIDIA, and a cavalcade of hyperscalers have teamed up on the DCFlex initiative, which plans to deploy five to ten large-scale flexibility hubs, each a living laboratory demonstrating strategies for integrating data centers with the grid under various conditions so that successful approaches can be widely adopted and replicated. The type of operations conducted at a given data center might make a difference, too. A military drone hub is probably less likely to volunteer to go offline than a site that can shift cloud or AI workloads around to be flexible with its power consumption.

“I believe this will be a thing. Data centers will be a flexible resource,” predicted NVIDIA’s Spieler, who said he’s excited by the potential of partnerships that can bring projects online faster. “I tell people all the time data centers are kind of like the grid, there’s a baseline and a peak load. We have to plan for the peaks but we learn that the numbers are smaller than that on a regular basis… This will change how we provide access to power to these major data centers.”
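Spieler’s baseline-versus-peak point can be illustrated with a toy calculation. If a site is interconnected at its peak rating but typically runs closer to its baseline, the gap is capacity it could, in principle, offer back as flexibility. This is a minimal sketch under that assumption; the function name and the load numbers are hypothetical, not drawn from DCFlex or NVIDIA.

```python
def flexible_headroom_mw(hourly_load_mw: list[float], interconnect_mw: float) -> float:
    """Average MW a data center could offer as grid flexibility:
    its interconnection rating minus its average actual load."""
    avg_load = sum(hourly_load_mw) / len(hourly_load_mw)
    return interconnect_mw - avg_load

# Hypothetical site rated at 500 MW that averages 380 MW over four hours:
headroom = flexible_headroom_mw([350, 400, 390, 380], 500)  # 120.0 MW
```

The same logic, run over a year of metered data instead of four made-up hours, is roughly what a utility would need before counting a data center as a dispatchable resource, which is why the planning-for-peaks habit Spieler describes leaves so much capacity on the table.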

ways to move faster

Construction on the SunZia Transmission and Wind project in New Mexico. Courtesy: Business Wire

Huber shared a pointed anecdote with POWERGEN International attendees to illustrate just how long building infrastructure can take. “When I was in college at the University of Arizona, they announced the SunZia Wind and Transmission project,” he recalled. It still isn’t finished.

“We’ve got to be able to build again in this country,” Huber asserted, advocating for siting and permitting reform to simplify things. “It shouldn’t take 10+ years for transmission.”

Endless litigation complicates even the most no-brainer projects, he added, like when an environmental group sued the Florida Public Service Commission over community solar projects to support a virtual power plant (VPP). It went all the way to the Florida Supreme Court.

“Everywhere you go, there’s litigation, and it’s endless, the risks that it causes to projects,” Huber lamented. “Do you build a project when there’s a Supreme Court case?”

“You can’t have nice things anymore because even a group you’d think would support clean energy goes and sues,” he said.

Huber admits there must be due process for siting and permitting, but hanging up projects for years over fine details is not an effective way to build things. He hopes the Trump Administration will speed things up.

“We work with all administrations,” he noted. “It’s not a one administration thing, but the rules around it like permitting, siting, and litigation strategies can all be reformed so at least that part of it doesn’t create unnecessary problems for developers.”

“We try to avoid legislation,” offered De Blasi. “It’s very difficult when people use laws as weapons to stop projects for all sorts of reasons and agendas. They’re needed, but there needs to be bipartisan support for reform. Leave your political affiliations at the door: the second you make energy political, we all lose. We have to get past that and figure out what these solutions are.”

NIMBYism (Not In My Backyard-ism) is rampant in some parts of the United States, making community engagement essential to the success of a project.

“Nobody wants transmission anywhere near them, but everyone needs transmission,” Faulk pointed out. SRP tries to assuage concerned citizens by bringing them into the stakeholder engagement process. It often doesn’t change their minds, Faulk admits, but the education helps.

Originally published in Factor This Power Engineering.
