by Andy Bane and Paul K. Bower, Ventyx
Organizations across the power industry are excited about big data’s potential to address broad industry challenges and open opportunities for improving operations. These range from aging assets managed by an aging work force, to better storm preparation and response, to enabling energy efficiency.
Big data pouring in from the smart grid, social media, customer calls and online forms, and streaming video and photography that provide real-world views of grid assets offers new opportunities for utilities to improve asset health, accelerate outage restoration and increase customer satisfaction.
Driving these solutions, however, requires more than merely managing big data. It requires mastering it to tie systems together in real time—bringing critical context to the volume and variety of data and making big data suitable for a wide range of uses within the organization.
Effectively harnessing big data is essential to leveraging it for business benefit, which means consolidating and then verifying it as an authoritative source of business-critical information trusted for use by employees and applications across an enterprise. Mastering big data means providing a utility with a single source of the truth to power improved decision-making to drive greater operational performance.
The Four V’s of Big Data: Volume, Velocity, Variety, Veracity
By itself, big data creates more challenges than solutions for utilities. Big data is not only about volume; it is also about the fast-growing variety of data and data sources, the increased velocity with which data is created and shared, and the trustworthiness, or veracity, of that data. Together, these characteristics make big data difficult to master, and big insights hard to gain, without considerable manipulation and analysis.
Volumes of data are flowing into utilities. For example, massive amounts of equipment-condition data are generated every second by smart sensors, monitors and intelligent devices across the grid and across day-to-day utility operations. External data such as weather, credit and market data is available from many sources, and consumer data, such as that generated by social media, is becoming increasingly important. Enterprise data is produced and consumed at a rapid rate by a growing number of applications, from enterprise asset management and mobile work force management to distribution management, human resources and customer-facing applications.
Much of this big data is unstructured, or at best semi-structured, including data pertaining to asset health conditions (e.g., photos, maps and experience-based expertise with specific assets). Much of a utility’s historical data can be unstructured as well, which presents challenges around turning this data into structured results that provide actionable insights. In addition, as grid systems tie increasingly into energy markets, utilities are having to process and manage complex events, which further complicates the data management challenge. Meanwhile, new technologies such as electric vehicles (EVs) and distributed energy resources will only complicate matters (e.g., how will EV-charging data be collected and transformed into insights?).
All of this data, new and old, must be mastered to generate actionable insight.
Harnessing Big Data With Master Data
The best way to master big data is by taking a master data approach, which provides a central source of business data used across workgroups, systems, applications and processes. Master data is consolidated, authoritative and trustworthy. It has been standardized, de-duplicated, cleansed and validated. With its distribution often centralized through a management hub, master data provides a single source of truth to all stakeholders inside and outside an enterprise.
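As a rough illustration of these steps, the sketch below standardizes, de-duplicates and validates asset records drawn from several source systems and merges them into one "golden" record per asset. The field names, source systems and survivorship rule are hypothetical assumptions for illustration, not any particular product's data model.

```python
# Minimal sketch of a master data consolidation step: standardize,
# de-duplicate and validate asset records from several source systems.
# Field names and rules are illustrative assumptions only.
from collections import defaultdict

def standardize(record):
    """Normalize formatting so records from different systems can be compared."""
    return {
        "asset_id": record.get("asset_id", "").strip().upper(),
        "asset_type": record.get("asset_type", "").strip().lower(),
        "substation": record.get("substation", "").strip().upper(),
        "install_year": record.get("install_year"),
        "source": record.get("source", "unknown"),
    }

def is_valid(record):
    """Basic validation: a master record needs an ID, a type and a location."""
    return bool(record["asset_id"] and record["asset_type"] and record["substation"])

def consolidate(records):
    """Merge duplicate records (same asset ID) into one golden record per asset."""
    golden = {}
    sources = defaultdict(set)
    for raw in records:
        rec = standardize(raw)
        if not is_valid(rec):
            continue  # a real hub would route this to a data-steward queue
        key = rec["asset_id"]
        sources[key].add(rec["source"])
        merged = golden.setdefault(key, rec)
        # Survivorship rule (illustrative): keep the first non-empty value seen.
        for field, value in rec.items():
            if merged.get(field) in (None, "") and value:
                merged[field] = value
    return golden, sources

if __name__ == "__main__":
    feeds = [
        {"asset_id": " tx-1001 ", "asset_type": "Transformer", "substation": "north", "source": "EAM"},
        {"asset_id": "TX-1001", "asset_type": "transformer", "substation": "NORTH",
         "install_year": 1998, "source": "GIS"},
        {"asset_id": "", "asset_type": "breaker", "substation": "NORTH", "source": "SCADA"},
    ]
    master, provenance = consolidate(feeds)
    print(master)       # one golden record for TX-1001
    print(provenance)   # which systems contributed to it
```

In a production hub, the survivorship rules, validation logic and stewardship workflow would be far richer, but the sequence of standardize, validate, match and merge is the same.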
The explosion of data utilities are tackling comes from operational technology sources such as sensors and information technology (IT) sources such as enterprise business applications. It all needs to be put into context and verified for accuracy. By ensuring veracity, master data enables utilities to use their data—big and conventional, operational technology and IT, transactional and interactional—in a just-in-time manner to better manage the grid. Master data also is pivotal to leveraging external data, such as social network streams and weather information, putting the data in context so it is useful for sentiment analysis and other analytics.
Master data is pivotal to successful convergence of operational technology and IT within a utility. With operational technology and IT in alignment, functional and physical views of equipment are brought together along with real-time data from operational sources to feed real-time and historical business analytic solutions that drive real-time and predictive visibility. This enterprise visibility is the key to making better operational and strategic decisions.
As a result, master data enables solutions that encompass multiple types of big data, allowing utilities to deploy broad strategies and solutions that span numerous disparate systems and business processes.
In turn, big data from these systems can be leveraged to improve planning for outages, predict equipment failures, respond to weather events and optimize the flow of energy and information across the network, mastering a data management life cycle in the process.
Exploring the Data Management Life Cycle
Mastering the management of data along this life cycle of planning, predicting, responding and optimizing operations requires mastering numerous types of big data. But if done successfully, the possibilities for improved operations and decision-making are just as numerous.
Plan: Improving Storm Planning, Outage Restoration
When utilities prepare for potentially disruptive events such as storms, planning and operating models are put into play using data from previous years combined with current information on critical assets across the network. Many things must be determined, including likely areas of damage, the best location for materials, personnel planning and crew placement, services to shelters, and how to interact with municipalities and affected customers.
Big data has a proportionally big role to play. For example, the increasing use of drones to map transmission and distribution assets and the environment in which they reside is extremely valuable in planning even though it unleashes a flood of unstructured mapping data that must be managed.
Big data also can be used predictively, including data on asset health and network reliability. With this data, solutions can model the impact of storms, earthquakes or other disasters on the health of the overall network, including its structures and connections, more accurately and effectively. Big data is there to be leveraged, but as it is taken into the analytic environment, it must be consolidated, matched and verified for quality and consistency so it is fit for use across end-user applications. Progressive-thinking utilities understand this need.
For instance, Computerworld.com recently reported that Reid Nuttall, chief information officer at OGE Energy Corp., said the company has set up an information “factory” and business analyst competency center to find ways to leverage the data generated from thousands of smart meters to drive business value, such as reducing peak demand.
Predict: Replacing Reactive With Predictive Solutions
Predictive solutions such as asset health management use real-time analytics to get ahead of potential or developing situations. These solutions combine real-time analytics with historical data, operating history and other information to flag assets that are trending toward failure so action can be taken before a failure occurs.
Big data enriches this area. For instance, asset health applications analyze real-time sensor data on an asset’s condition alongside the asset’s historical data and the expertise of those most familiar with maintaining it.
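As a simplified illustration of this kind of analysis, the sketch below fits a linear trend to recent sensor readings and flags assets projected to cross an assumed alarm threshold. The readings, baselines, threshold and projection horizon are hypothetical; real asset health models draw on much richer condition data and expertise-based rules.

```python
# Illustrative trend-based asset health flagging: compare recent sensor
# readings against a historical baseline and flag assets whose readings
# are trending toward an assumed alarm threshold. Values are hypothetical.
from statistics import mean

def linear_trend(values):
    """Slope of a least-squares line through equally spaced readings."""
    n = len(values)
    if n < 2:
        return 0.0
    xs = range(n)
    x_bar, y_bar = mean(xs), mean(values)
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, values))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

def flag_assets(readings_by_asset, baseline_by_asset, alarm_threshold, horizon=24):
    """Return assets projected to cross the alarm threshold within `horizon` intervals."""
    flagged = []
    for asset_id, readings in readings_by_asset.items():
        baseline = baseline_by_asset.get(asset_id, mean(readings))
        slope = linear_trend(readings)
        projected = readings[-1] + slope * horizon
        if readings[-1] > baseline and projected >= alarm_threshold:
            flagged.append((asset_id, round(projected, 1)))
    return flagged

if __name__ == "__main__":
    recent = {"TX-1001": [61, 63, 66, 70, 75], "TX-2002": [58, 58, 57, 58, 58]}
    baseline = {"TX-1001": 60, "TX-2002": 58}
    print(flag_assets(recent, baseline, alarm_threshold=95))
    # [('TX-1001', 159.0)] -> maintenance can be scheduled before failure
```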
American Electric Power Co. Inc. (AEP), one of the largest electric utilities in the United States, is implementing an asset health center to leverage big data in this way.
The solution eventually will be applied across all AEP transmission substations and integrate equipment-derived operational technology data with intelligent IT applications. Thus, it will bring together in a single system a range of disparate asset data, algorithms based on subject-matter expertise and analytic software to transform how AEP maintains its transmission infrastructure.
Big data also can enhance significantly the safety, accuracy and quality of actions carried out in the field.
Consider a field technician who is wearing a pair of safety goggles equipped with a heads-up display that provides information related to the equipment he’s looking at, including maintenance actions, procedures, notes, tags and even video.
Whether information is flowing into, out of or within an organization, master data management is at the core of transforming diverse big data into consistent information usable for predictive analytics, as well as in the field to put the results of analytics into action.
Respond: Reacting More Efficiently, Effectively to Events
The capture of unstructured data from a variety of sources also can accelerate and increase the effectiveness of a utility’s response to network events. This plays out in multiple ways, but here we focus on an outage management example. The analysis of huge amounts of weather data can help shape a utility’s response to fast-changing weather conditions. Valuable information about developing conditions and customer satisfaction after restoration efforts also can be harvested from customers via social network streams. Social media channels also can be used to broadcast or precisely target messages to customers regarding outage conditions and restoration efforts and time lines. Flyover data gathered from drones can be mapped against existing spatial data to show impassable roads, downed lines, flood areas and more to optimize restoration. When this is combined with network management and enterprise information technology systems, utilities can master big data to improve storm responsiveness significantly while automating many decisions typically reliant on multiple skilled workers.
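One simplified illustration of combining such feeds is sketched below: geotagged outage reports (from social streams or customer calls) are matched to the nearest feeder so restoration work can be prioritized. The coordinates, feeder names and nearest-point matching are assumptions for illustration; a real system would use the utility's geographic information system and network connectivity model.

```python
# Hypothetical sketch: map geotagged outage reports to the closest feeder
# from a spatial model and rank feeders by report volume. All locations
# and identifiers are illustrative assumptions.
from collections import Counter
from math import hypot

feeders = {                      # feeder id -> representative (lat, lon)
    "FDR-12": (35.10, -90.05),
    "FDR-37": (35.14, -90.02),
}

def nearest_feeder(lat, lon):
    """Match a report location to the closest feeder (flat-earth approximation)."""
    return min(feeders, key=lambda f: hypot(feeders[f][0] - lat, feeders[f][1] - lon))

def prioritize(reports):
    """Rank feeders by the number of outage reports mapped to them."""
    counts = Counter(nearest_feeder(lat, lon) for lat, lon in reports)
    return counts.most_common()

if __name__ == "__main__":
    geotagged_reports = [(35.101, -90.049), (35.099, -90.052), (35.141, -90.019)]
    print(prioritize(geotagged_reports))   # e.g., [('FDR-12', 2), ('FDR-37', 1)]
```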
This variety of information can be overwhelming unless utilities reconcile the information and transform the data for consumption by decision-makers and IT applications. CenterPoint Energy integrated business intelligence and analytics into its advanced distribution automation system. As a result, CenterPoint’s system enables complex data from disparate systems to be assimilated and presented to users in key functions from operations and customer service to resource and supply chain management to simplify decision-making.
Optimize: Streamlining Energy, Information Flow Across the Network
The effective handling of big data starts with demand response programs, where big data that has been mastered can be used to develop more accurate forecasts about customer-level loads for upcoming hours and days, as well as to help optimize the usage of demand response programs against current market conditions.
In smart meter analytics, data management routines can cull through 15-minute interval data, aggregate it to the substation level, and validate and further consolidate the information to make it usable from a control standpoint. Analysis of this data combined with customer profiling can help smooth the load profile to benefit customers. Beyond the meter, inside the customer facility or home, large volumes of home energy network data can be leveraged to design and analyze demand response and energy efficiency programs.
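As a minimal sketch of that aggregation step, the code below validates 15-minute interval reads and sums them by substation and interval. The meter-to-substation mapping and plausibility limits are illustrative assumptions, not a specific meter data management product's rules.

```python
# Minimal sketch: validate 15-minute smart meter interval reads and
# aggregate them to the substation level. Mapping and limits are
# illustrative assumptions.
from collections import defaultdict

meter_to_substation = {"M-001": "SUB-A", "M-002": "SUB-A", "M-003": "SUB-B"}

def is_valid_read(kwh):
    """Simple plausibility check for a 15-minute residential interval (kWh)."""
    return kwh is not None and 0.0 <= kwh <= 25.0

def aggregate_to_substation(interval_reads):
    """Sum validated interval reads per (substation, interval start time)."""
    totals = defaultdict(float)
    rejected = 0
    for meter_id, interval_start, kwh in interval_reads:
        if meter_id not in meter_to_substation or not is_valid_read(kwh):
            rejected += 1          # in practice: estimate or flag for review
            continue
        totals[(meter_to_substation[meter_id], interval_start)] += kwh
    return dict(totals), rejected

if __name__ == "__main__":
    reads = [
        ("M-001", "2014-06-01T13:00", 0.50),
        ("M-002", "2014-06-01T13:00", 0.25),
        ("M-003", "2014-06-01T13:00", None),   # missing read fails validation
    ]
    totals, rejected = aggregate_to_substation(reads)
    print(totals)    # {('SUB-A', '2014-06-01T13:00'): 0.75}
    print(rejected)  # 1
```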
A Fortune 500 utility with more than 20 years in direct load control programs recently implemented a full demand response program that empowers customers with rate structure options that allow them to control energy usage based on demand and hourly rates. It’s a big data play. Behind the curtains, the company upgraded its data management infrastructure with master data technology so the operational technology data created by smart thermostats, load control switches and the like is delivered in a trusted, usable form to its IT systems, including its customer information system (CIS).
Consequently, the CIS can deliver accurate daily and monthly forecasts, and the utility can provide the accurate information required to comply with market regulations.
Turning Big Data Into Big Value
Mastering the vast data available to utilities is more than a technology challenge. If it were only that, it would be solved by recent advances in big data processing (e.g., Hadoop) and storage. Big data is also a business challenge, one that can be addressed only by transforming big data into business insights.
Mastering big data is essential to bringing together the physical world of assets, personnel and customers with the world of systems, business processes and performance metrics. It is the glue for aligning operational technology and IT.
When a utility has mastered the four V’s of big data, it can harness the tide of big data to more effectively plan for and manage outages, better manage its asset base and capital expenditures, optimize the flow of energy and enable new levels of customer engagement.
Meaningful accomplishments are being made using big data, but mastering the data will enable utilities to drive a step change in efficiency, productivity, return on assets, network reliability, safety and customer satisfaction.
Andy Bane is executive vice president of product management and chief of strategy for Ventyx. He is responsible for overseeing product management and strategy and new product introduction. He has a degree from the University of Colorado.
Paul K. Bower is senior vice president and general manager for advanced business intelligence solutions at Ventyx, an ABB company. Prior to Ventyx, he was a co-founder and chief technology officer of Obvient Strategies, co-founder and vice president of technical services of CES, leader of the development team for a revenue and demand management system at Northwest Airlines, and a developer and system architect at Siemens.