Growing Analytics to Meet Current and Future Needs: NYPA’s Approach

NYPA is committed to becoming an end-to-end digital utility. At UA Summit 2021, its leaders laid out how the company developed its data analytics strategy, processes, and architecture.

The New York Power Authority (NYPA) has taken a disciplined approach to developing analytics capabilities, and it shows. The company has done its homework, learning from the experience of other utilities and other industries. It helps that NYPA’s CEO Gil Quiniones has set a high bar for the company. “Our goal is to become the nation’s first end-to-end digital utility,” he notes. Personnel are on board, and the company is stepping up.

Membership in the Utility Analytics Institute has helped NYPA vet its data analytics architecture concepts. At May’s UA Summit keynote session, “Developing NYPA’s Future Grid – NYPA’s Data Analytics Journey,” the company’s vice president of digital transformation, Daniella Piper, and its vice president of product development, data management and enterprise architecture, Ron Carroll, laid out how the company developed its data analytics strategy, processes and architecture.

Here’s what session attendees learned:

Data governance and data integration architecture solve the “data problem.”

As utilities will attest, 75 to 80 percent of the work in an analytics project goes into preparing the data. For NYPA, the challenges included combining disparate data sources and formats, accessing data, ensuring data quality, and establishing traceability and lineage. Strong data governance is a priority for NYPA in achieving data readiness; that includes well-defined ownership, processes, business rules and data quality standards. The data integration architecture uses standardized frameworks to ingest, transform and publish data, with pre-built integrations for key applications.
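
To make the framework pattern concrete, here is a minimal sketch, in Python, of an ingest/validate/publish pipeline of the kind described. Every name in it (SensorReading, the validation rule, the print-based publish step) is illustrative; the session described the pattern, not an implementation.

```python
# Minimal sketch of a standardized ingest/validate/publish framework.
# All names here are illustrative, not NYPA's.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SensorReading:
    source: str        # originating application, e.g. a data historian
    asset_id: str
    timestamp: datetime
    value: float

def ingest(raw_rows: list[dict], source: str) -> list[SensorReading]:
    """Normalize disparate source formats into one canonical record type."""
    return [
        SensorReading(
            source=source,
            asset_id=str(row["asset_id"]),
            timestamp=datetime.fromisoformat(row["ts"]).astimezone(timezone.utc),
            value=float(row["value"]),
        )
        for row in raw_rows
    ]

def validate(readings: list[SensorReading]) -> list[SensorReading]:
    """Apply business rules; rejected records would be logged for lineage."""
    return [r for r in readings if r.asset_id and r.value == r.value]  # no blank IDs or NaNs

def publish(readings: list[SensorReading]) -> None:
    """Hand vetted records to the hub; stubbed out with a print here."""
    for r in readings:
        print(f"{r.source} -> hub: {r.asset_id} @ {r.timestamp:%Y-%m-%d %H:%M} = {r.value}")

# The same standardized pipeline is reused for every source system:
rows = [{"asset_id": "XFMR-12", "ts": "2021-05-01T14:00:00+00:00", "value": 61.7}]
publish(validate(ingest(rows, source="historian")))
```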

Use what works best, building only what you need.

NYPA deliberately draws on third parties as well as in-house resources. Where there is little in-house expertise, the company uses third-party analytics platforms; of note is NYPA’s use of GE for asset performance management. Company management also believes it is simpler and less expensive to use out-of-the-box business intelligence (BI) and analytics tools that come with existing applications (enterprise resource planning, enterprise asset management, data historians, etc.). Where multiple data sources and a high level of model customization are required, NYPA has built its own tools.

The analytics platform provides a place for everything.

The NYPA Analytics Platform (AP) is a one-stop shop – a place to access data and perform analytics. The platform consists of an Enterprise Data Hub (Hub) and an Analytics Area (AA). The Hub ingests, validates and transforms NYPA application data, third-party vendor data and “bring-your-own data (BYOD)”. The Hub was built to provide a “consistent framework for integration, metadata capture and data quality, all performed in one place.” With the right permissions, users can access Hub data and visualizations, and third-party analytics applications use vetted data from the Hub. A separate storage area holds sensitive data such as personally identifiable information (PII) and personal health information (PHI).
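
A hedged sketch of that intake pattern: validate incoming records and route sensitive fields to a separate, restricted store. The field names and in-memory “stores” below are assumptions for illustration, not NYPA’s design.

```python
# Hedged sketch of the hub intake pattern: split each record so sensitive
# fields land in a separate restricted store. Field names and the in-memory
# "stores" are illustrative only.
GENERAL_STORE: list[dict] = []      # vetted data, available with permissions
RESTRICTED_STORE: list[dict] = []   # PII/PHI, held apart under tighter access

SENSITIVE_FIELDS = {"ssn", "medical_notes"}

def ingest_record(record: dict) -> None:
    sensitive = {k: v for k, v in record.items() if k in SENSITIVE_FIELDS}
    general = {k: v for k, v in record.items() if k not in SENSITIVE_FIELDS}
    if sensitive:
        # Keep a back-reference so lineage survives without co-locating PII.
        RESTRICTED_STORE.append({"ref": general.get("id"), **sensitive})
    GENERAL_STORE.append(general)

ingest_record({"id": 1, "meter_kwh": 412.5, "ssn": "XXX-XX-XXXX"})
print(len(GENERAL_STORE), len(RESTRICTED_STORE))  # -> 1 1
```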

The AA is where the “magic happens.” Here, data scientists can perform data discovery and use the platform’s tools to develop, test and deploy new models. Capabilities include artificial intelligence, such as machine learning, and predictive and prescriptive analytics. One example involves current plans to move from descriptive to predictive analytics in order to give energy traders early warning of changes in location-based marginal pricing (LBMP) in the NYISO market.
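
NYPA did not share its models, but the descriptive-to-predictive shift can be illustrated with a toy example: rather than reporting past prices, a classifier estimates the probability that the next hour’s LBMP jumps. The synthetic price series, three-hour lag window and 10 percent spike threshold below are all assumptions.

```python
# Toy illustration of descriptive -> predictive: flag hours where the next
# LBMP value is likely to jump more than 10%. Data and thresholds are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
prices = 35 + np.abs(rng.normal(0, 8, 500))  # synthetic hourly LBMP ($/MWh)

lags = 3
# Features: the last `lags` hourly prices; label: >10% jump in the next hour.
X = np.column_stack([prices[i:len(prices) - lags + i] for i in range(lags)])
y = (prices[lags:] > 1.10 * prices[lags - 1:-1]).astype(int)

model = LogisticRegression().fit(X[:-100], y[:-100])   # train on all but last 100 hours
risk = model.predict_proba(X[-100:])[:, 1]             # spike probability per hour
print(f"hours flagged as spike risk: {(risk > 0.5).sum()} of 100")
```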

Two things are unique about NYPA’s approach. First, third-party vendor analytics platforms use data from the Hub AND return results to the AP, with the potential to develop new models using new data sources. Second, the AA imports vendor models and algorithms from third parties. According to Ron Carroll, this setup enables “speed to market, using an out-of-the-box model, and reduces data movement.” It would be interesting to learn whether this approach includes using open-source models.
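
One plausible mechanic for importing vendor models is a portable model format scored inside the platform, so vetted data never has to move to the vendor. The sketch below uses ONNX Runtime as an assumed interchange format, and the vendor_model.onnx file (with a batch of 10 float features as input) is a placeholder; the session did not name NYPA’s actual tooling.

```python
# Hedged sketch: score vetted hub data against a vendor-supplied model
# inside the platform, then keep the results locally. ONNX is an assumed
# interchange format; "vendor_model.onnx" is a placeholder file.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("vendor_model.onnx")    # model shipped by the vendor
input_name = session.get_inputs()[0].name

hub_batch = np.random.rand(32, 10).astype(np.float32)  # stand-in for vetted hub data
scores = session.run(None, {input_name: hub_batch})[0] # inference happens in-platform

# Results stay in the Analytics Platform instead of moving data to the vendor.
print(scores.shape)
```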

Well-curated data sets and models prepare NYPA for the future.

If not handled carefully, analytics projects can spawn data and models that are hard to find, redundant or contradictory. NYPA is focused on bringing in “data required to serve the analytics use-cases but one that enables data gravity over time through well-curated and managed datasets.”

It’s not just about making it easy to reuse datasets. The platform is built to let data scientists build and share models and algorithms. Data scientists will probably want to dig deeper to learn just how models are organized for easy access.
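
As one hypothetical arrangement, an experiment tracker such as MLflow can catalog models alongside their parameters and metrics so colleagues can find and reuse them. MLflow, the experiment name and the toy model below are assumptions, not NYPA’s stated tooling.

```python
# Hypothetical model catalog using MLflow; the tool and names are assumptions.
import mlflow
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.arange(10, dtype=float).reshape(-1, 1)
y = np.arange(10, dtype=float)
model = LinearRegression().fit(X, y)              # toy stand-in model

mlflow.set_experiment("lbmp-early-warning")       # illustrative experiment name
with mlflow.start_run(run_name="baseline"):
    mlflow.log_param("lags", 3)                   # how the model was built
    mlflow.log_metric("r2", model.score(X, y))    # how well it performs
    mlflow.sklearn.log_model(model, "model")      # the artifact itself, findable later
```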

From an overall perspective, NYPA is well-positioned to meet future challenges. The flexibility to keep adding use cases is essential to the company’s approach. Moving forward, NYPA also is setting its sights on edge computing to reduce bandwidth costs and response times by performing analytics locally.
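
The edge idea can be sketched simply: analyze readings on the device and transmit only a small summary or alarm upstream. The threshold and payload fields below are illustrative.

```python
# Hedged sketch of the edge idea: analyze readings locally and transmit
# only summaries or alarms, cutting bandwidth and response time.
# The threshold and payload fields are illustrative.
import statistics

def process_at_edge(readings: list[float], limit: float = 80.0) -> dict:
    """Run locally on the device; return a small payload for the cloud."""
    alarm = max(readings) > limit      # immediate local decision, no round trip
    return {
        "n": len(readings),
        "mean": round(statistics.fmean(readings), 2),
        "max": max(readings),
        "alarm": alarm,
    }

# 1,000 raw samples collapse into one small summary payload.
summary = process_at_edge([60 + i % 25 for i in range(1000)])
print(summary)  # {'n': 1000, 'mean': 72.0, 'max': 84, 'alarm': True}
```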
