Optimizing Your Grid of the Future Means Optimizing Your Data

Each month, UAI members gather to discuss the role of data analytics in grid optimization. This is a report-out from recent discussions.

It’s not 2008 anymore, but something is in the air when it comes to utilities making large investments in the electric grid. Earlier this month, PSEG announced that it would spend $14 billion to $17 billion on infrastructure over the next five years, with as much as $10 billion allocated to grid hardening and reliability efforts. According to a 2018 Black & Veatch survey, 39.7% of utilities are currently integrating connected technologies into the grid to serve reliability purposes.

The Difference is in the Data

Data is a strategic component of many of today’s grid modernization initiatives. The “grid of the future” is one that gathers and leverages data from across the network—smart meters, fault sensors, phasor measurement units (PMUs), smart inverters, and other IoT devices—to optimize grid management, improve reliability, integrate distributed energy resources (DER), and provide better service to customers. The goal for most utilities ramping up data-gathering activities is to leverage big data analytics to optimize their business.

AMI is becoming ubiquitous in North America. Data from smart meters is being used to make billing more efficient and to develop a better understanding of what is going on in the field. All sorts of other information can be gleaned from AMI data to drive predictive asset maintenance, identify areas of high risk and energy loss, and better engage customers.

For those looking to gain a better understanding of the grid, fault-sensing devices and PMUs give utilities a faster and more accurate picture of grid conditions, whether at the transmission, sub-transmission, or even distribution level.

Distributed generation (DG), EVs, and even residential storage are becoming commodities, which means utilities also need to better understand the last mile of the grid and how these assets function and perform. And this is not just at the residential level—many cities and large customers are adopting larger-scale DG, storage, and fleets of EVs that utilities can use proactively to improve reliability, or that can do just the opposite.

How Far Should They Go?

Utilities are gathering a lot of data from and about the grid, which is a great first step, but with all that data comes a lot of responsibility. Unfortunately, simply gathering data does not go very far; the right tools need to be in place for employees to perform analytics on that data. For grid optimization analytics, engineers need data from systems across the enterprise—everything from devices and SCADA to customer information systems (CIS) and geospatial systems.

That is why so many utilities today are focused on ways to proactively manage their data across the enterprise. This month, members of the Utility Analytics Institute Grid Optimization Analytics Advancement Group laid out a number of considerations and solutions for the challenges encountered in developing a successful data analytics strategy to support grid modernization.

Gathering and moving large amounts of data from systems that store grid data, such as AMI and SCADA, into a cloud-based Hadoop environment tends to produce a ripple effect across the enterprise. There are many other data sources, such as CIS, GIS, and IoT, that provide value in understanding the grid and should probably also be integrated.

Like pulling a loose thread on a sweater, these processes are not as discrete as one might think. While large data integration projects hold a lot of value in theory, there is a lot of complexity that can slow or halt the effort.
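As a concrete illustration of that ripple effect, the minimal PySpark sketch below assumes a Spark-based Hadoop environment and entirely hypothetical table paths and column names: the moment AMI interval reads are compared against SCADA events, a third source (a GIS meter-to-feeder mapping) has to come along just to line the two up.

```python
# Minimal sketch of the "ripple effect": joining AMI interval reads with SCADA
# events quickly requires a third source (a GIS meter-to-feeder mapping) just
# to relate the two. All paths, tables, and columns here are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("grid-data-integration-sketch").getOrCreate()

# AMI interval reads landed in the data lake (hypothetical location and schema)
ami = spark.read.parquet("s3://utility-lake/ami/interval_reads/")

# SCADA events, keyed by feeder rather than by meter
scada = spark.read.parquet("s3://utility-lake/scada/events/")

# GIS mapping is needed before AMI and SCADA data can even be compared
gis = spark.read.parquet("s3://utility-lake/gis/meter_to_feeder/")

# Hourly energy per feeder, derived from meter-level AMI reads
ami_by_feeder = (
    ami.join(gis, on="meter_id", how="inner")
       .withColumn("hour", F.date_trunc("hour", F.col("read_ts")))
       .groupBy("feeder_id", "hour")
       .agg(F.sum("kwh").alias("feeder_kwh"))
)

# Align SCADA events with the AMI-derived feeder load for the same hour
events_with_load = (
    scada.withColumn("hour", F.date_trunc("hour", F.col("event_ts")))
         .join(ami_by_feeder, on=["feeder_id", "hour"], how="left")
)

events_with_load.show(10)
```

Every additional join in a sketch like this is another source system to extract, stage, and keep in sync, which is exactly where the complexity builds up.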

Getting Data to Users

To be innovative and successfully integrate analytics into grid management, engineers and analysts need to access and merge diverse data for discovery and proof-of-concept ideas. One utility member reported that when they surveyed employees who used diverse data for analytics purposes, only 49% were able to spend most of their time performing the actual analytics—the remaining 51% spent most of their time accessing and cleaning data.

While many vendors claim that their ETL and “crawler” solutions provide better access to diverse data, many utilities find that they are still limited by the internal expertise needed to use such tools—creating a bottleneck in the best case, and altogether killing a utility’s appetite for advanced analytics in the worst case.

A utility can have sensors and IoT installed across the grid and behind the meter, but if grid management teams don’t know what to do with that data, it is rendered useless.

The engineers and analysts who manage the electric grid need tools that speak the same business language they do, simple as that. It also does not hurt for them to have some transparency into which data their peers are using, so they can build on previous work and lessons learned.

One member stated it loud and clear: “We want to expose data to the business, so we have to do so in a business-friendly manner.”
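What a “business-friendly manner” could look like in practice is sketched below, again assuming a Spark environment; the database, view, and column names are hypothetical. The idea is simply to publish a curated view that uses plain business terms, so engineers and analysts query “feeders” and “outages” rather than source-system codes.

```python
# Minimal sketch of exposing data in business-friendly terms: a curated view
# with plain-language column names on top of raw source tables. All database,
# table, view, and column names here are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("business-friendly-views-sketch").getOrCreate()

spark.sql("""
    CREATE OR REPLACE VIEW analytics.outage_events_business AS
    SELECT
        e.evt_id          AS outage_id,
        e.fdr_cd          AS feeder,
        g.substation_name AS substation,
        e.evt_ts          AS outage_start_time,
        e.rstr_ts         AS restoration_time,
        e.cust_cnt        AS customers_affected
    FROM raw_scada.outage_evt e
    JOIN raw_gis.feeder_master g
      ON e.fdr_cd = g.feeder_code
""")

# Analysts can now work in business terms rather than source-system codes
spark.sql("""
    SELECT feeder, COUNT(*) AS outages_last_year
    FROM analytics.outage_events_business
    WHERE outage_start_time >= date_sub(current_date(), 365)
    GROUP BY feeder
    ORDER BY outages_last_year DESC
""").show()
```

A catalog of curated views along these lines also provides the transparency mentioned above: peers can see which underlying data a view draws on and build on it rather than re-deriving it.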

