Utilities are embracing AI, but grid applications remain in the ‘sandbox’ — for now

MISO control room (Courtesy: MISO)

For PG&E, deploying artificial intelligence is a bit like working in a “sandbox”—a controlled space where new tools can be safely tested, refined, and adjusted to meet the demanding standards of the utility industry. It’s a cautious approach from one of the most innovative utilities in the U.S. While most utilities now recognize that machine learning and AI will bring significant changes to the industry, there remains a hesitancy to unleash the tools beyond back-office tasks and onto power grid operations.

Norma Grubb, PG&E’s director of enterprise AI and data science, describes the approach as essential in an industry where even minor errors can have major impacts on infrastructure and customer trust.

“Innovation is a top priority, but safety comes first,” Grubb told POWERGRID International, highlighting the utility’s commitment to rolling out AI in measured stages. For now, PG&E has focused on using AI in back-office tasks, such as processing regulatory documents and managing internal communications—tasks that improve efficiency without directly affecting customer-facing operations.

Technology leaders are sprinting to serve seemingly every industry with AI-powered tools. The utility industry is a complicated customer, however: it cannot afford to "fail fast" when safety and reliability are at stake, and every deployment draws regulatory scrutiny.

Jason Strautman, vice president for data science and analytics engineering for Oracle’s utility division, told POWERGRID that deploying AI for utilities presents unique challenges because reliability is non-negotiable. “This is very different from other industries,” Strautman said, acknowledging that utilities face high stakes with every AI deployment.

In 2011, PG&E introduced the AI-powered Opower platform (acquired by Oracle in 2016), providing residential customers with personalized energy reports. A version specifically tailored for solar users was launched in 2023. Through the Solar Home Energy Reports, PG&E’s solar customers receive insights into their solar production, grid usage, and consumption patterns, helping them make more informed choices about energy use.

The Opower platform uses kWh readings from smart meter data to detect certain customer-owned appliances, generate a load profile, and automatically tailor each customer’s experience. In addition to alerting customers of bill assistance eligibility, Opower can also share rate plan recommendations, including off-peak electric vehicle charging. PG&E achieved a 36% higher open rate from EV customers who received Home Energy Reports compared with traditional communications.
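
To make the mechanics concrete, the sketch below shows one simplified way interval kWh data can surface an appliance signature. It is not Oracle’s algorithm; the readings, thresholds, and the looks_like_ev_charging heuristic are all illustrative assumptions.

```python
# Illustrative sketch (not Oracle's implementation): flag likely EV-charging
# households from hourly kWh interval data so they can be targeted with
# off-peak rate recommendations.
from statistics import mean

# Hypothetical 24 hourly kWh readings for one customer-day
# (numerically equal to average kW over each hour).
hourly_kwh = [0.4, 0.3, 3.1, 3.2, 3.0, 0.5, 0.6, 0.8, 0.7, 0.6, 0.5, 0.6,
              0.7, 0.8, 0.9, 1.0, 1.2, 1.5, 1.8, 1.6, 1.3, 1.0, 0.7, 0.5]

def looks_like_ev_charging(profile, jump_kw=2.0, min_hours=2):
    """Heuristic: a sustained overnight block well above baseline load
    is a common signature of Level 2 EV charging."""
    baseline = mean(sorted(profile)[:6])   # quietest six hours of the day
    overnight = profile[0:6]               # midnight to 6 a.m.
    elevated = [h for h in overnight if h - baseline >= jump_kw]
    return len(elevated) >= min_hours

if looks_like_ev_charging(hourly_kwh):
    print("Candidate for an off-peak EV charging rate recommendation")
```

A real disaggregation model would weigh far more features, such as seasonality, weather, and ramp shape, across many days of data, but the principle is the same: a sustained block of load well above a customer’s baseline is a strong hint of a specific appliance.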

Beyond customer engagement, Oracle is also bringing new AI insights to grid applications. Utilities can use the Oracle platform to track EV growth in their territory, confirm that customers are mapped to the correct transformer in their system of record, and identify overloaded transformers. As these insights become more mission-critical, building trust with utility clients means ensuring human oversight and transparency in every AI application. “You need to have that human in the loop at all times,” Strautman emphasized, pointing out that human involvement is essential to adapting AI insights to the complex, real-world conditions of utility operations.

Beyond customer engagement, utilities can use the Opower platform to track EV growth in their territory, confirm that customers are mapped to the correct transformer in their system of record, and identify overloaded transformers. (Courtesy: Oracle)
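
A minimal sketch of the transformer-loading idea follows, assuming only a meter-to-transformer mapping and per-meter peak demand. The data, ratings, and threshold are hypothetical, not the Oracle platform’s actual logic.

```python
# Illustrative sketch: roll up peak meter demand to each transformer in the
# system of record and flag units loaded beyond their nameplate rating.
from collections import defaultdict

# Hypothetical meter readings: (meter_id, transformer_id, peak_kw)
readings = [
    ("M1", "T-104", 6.2), ("M2", "T-104", 7.8), ("M3", "T-104", 9.4),
    ("M4", "T-221", 4.1), ("M5", "T-221", 5.0),
]
# Hypothetical nameplate ratings in kVA (treated as kW at unity power factor).
ratings = {"T-104": 15.0, "T-221": 25.0}

load = defaultdict(float)
for _, transformer, peak_kw in readings:
    load[transformer] += peak_kw

for transformer, total in load.items():
    utilization = total / ratings[transformer]
    if utilization > 1.0:
        print(f"{transformer}: {total:.1f} kW on a {ratings[transformer]:.0f} kVA "
              f"unit ({utilization:.0%}) -- review for overload")
```

Summing individual meter peaks overstates true coincident demand, so a production tool would work from time-aligned interval data. The point here is simply how a correct meter-to-transformer mapping makes overload screening possible in the first place.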

Oracle’s approach is anchored in the idea of “human-centered AI.” Rather than replacing human decision-makers, Oracle’s tools are designed to complement them, particularly in high-stakes applications like grid operations. “Imagine we’re trying to predict solar or EV usage and we get it wrong,” Strautman added. That kind of mistake could lead to outages or overloaded transformers.

Oracle is also aware of the caution utilities exercise when considering AI, especially as errors could disrupt essential services and erode customer trust. Strautman said that Oracle’s AI solutions are developed to incrementally build confidence over time.

Governance-first

PG&E’s “governance-first” approach is central to its strategy, as detailed in the utility’s Innovation Report and covered in POWERGRID International. This approach emphasizes strict oversight and incremental testing, a structured path that lets the utility experiment with AI in areas where it can drive efficiency without compromising safety. Given the high stakes of utility work, PG&E is clear about its priorities: AI applications need to be as reliable as the infrastructure they support.

The governance framework PG&E uses for AI also includes regulatory alignment, which is crucial in California’s tightly regulated utility sector. California’s data protection and safety requirements shape how PG&E deploys AI tools, ensuring that all applications adhere to data security, bias prevention, and compliance standards. Before any tool leaves the sandbox, it undergoes a comprehensive review to assess these factors, particularly since AI models frequently work with sensitive customer and operational data.

Data governance, it turns out, could be the most important stage in a utility’s AI journey, according to IBM’s global energy and utilities lead, Casey Werth. “It’s so boring,” Werth said, but without a solid data governance foundation, “none of this is possible.”

Werth spends many of his days crisscrossing the U.S. to meet with utility executives about their digitalization and AI ambitions. Most are still in the “figuring it out” stage, he said, with directional consensus and data quality being among the chief challenges.

“We’re a year into the real hype-cycle,” Werth said, “and my observation is that utilities are still grappling as to how to reach their dream outputs from AI tools.”

For now, Werth advises utilities to focus on low-risk AI applications that can provide value without the need for extensive data restructuring or large-scale transformation. He points out that machine learning models that analyze satellite imagery or AMI data disaggregation for asset mapping can offer valuable insights with relatively low risk.

A utility data ‘black hole’

IBM has partnered with Microsoft on the AI asset management platform Maximo and is making significant investments to further integrate its cloud-based AI and data platform, watsonx, across the company’s software applications. The AI goldmine for utility software providers, Werth believes, is asset health and maintenance.

Werth said many utilities have a backlog of asset failure data that’s poorly documented or inconsistently labeled, creating obstacles for predictive maintenance initiatives. This data is essential for AI-driven predictive maintenance, which relies on accurate historical records to detect patterns and predict when and where future failures might occur. However, without clean, standardized data, utilities face an uphill battle in moving to advanced predictive analytics for asset management.

Werth points to the specific example of root cause analysis for asset failures, which he describes as a “black hole” in most utilities’ data environments. Even when field teams document failures, the information is often incomplete or lacks standard terminology, making it difficult to analyze consistently. One utility Werth spoke with tried to standardize the process by providing drop-down menus with preset failure reasons; however, over 80% of responses were logged as “other.” This inconsistency poses a major barrier to effective AI implementation, as AI models need well-organized data to function accurately.

“Things break because of very poorly categorized data sets within utilities,” Werth said. “Quality data is a must-have to move to predictive analytics.”

To address this, IBM is working with utilities to develop streamlined data entry and organization processes, aiming to make data more usable for AI applications. For instance, IBM has suggested providing field teams with ruggedized devices equipped with AI-powered language models that allow them to document asset failures verbally. The approach could help standardize asset failure documentation by converting spoken language into structured data, making it easier to analyze over time and improve predictive maintenance outcomes.
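
As a rough illustration of the structuring step Werth describes, the sketch below maps a transcribed field note to a standardized failure code rather than letting it fall into the “other” bucket. A real system would use a language model for this; the keyword table and category codes here are invented for the example.

```python
# Illustrative sketch: map a field technician's transcribed free-text note to
# a standardized failure category. A production system would use a language
# model here; this keyword matcher is a simplified stand-in.
FAILURE_CATEGORIES = {
    "bushing": "BUSHING_FAILURE",
    "leak": "OIL_LEAK",
    "lightning": "LIGHTNING_STRIKE",
    "overload": "THERMAL_OVERLOAD",
    "corrosion": "CORROSION",
}

def categorize(note: str) -> str:
    text = note.lower()
    for keyword, code in FAILURE_CATEGORIES.items():
        if keyword in text:
            return code
    return "OTHER"  # the bucket Werth's example utility saw over 80% of the time

transcript = "Found oil leak around the lower bushing, gasket looks corroded."
print(categorize(transcript))  # -> BUSHING_FAILURE (first keyword match wins)
```

Even this crude mapping illustrates the payoff: once failure causes land in consistent categories, the historical record becomes usable by the predictive maintenance models described above.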

These AI use cases don’t directly touch grid operations, and the utility industry is unlikely to arrive at that stage for “probably quite some time,” Werth said. But the tools can influence long-term resource and system planning, climate adaptation, interconnection queue management, and asset management, with the potential to provide substantial value to utilities.

Werth said AI vendors have a responsibility to clearly communicate that value proposition to utility leaders, who often require broad organizational buy-in to advance new technologies. Successful AI deployment requires a close alignment between IT, which understands the technology, and business units, which understand the operational processes.

“In certain utilities, we get this perfect moment of teams working together. It’s finding that marriage of understanding of the technical requirements,” Werth said.
