PG&E’s Agile Asset Analytics Teamwork
Jen Owens of Pacific Gas & Electric’s Digital Catalyst Group discusses how agile product development teams at PG&E have developed tools to optimize asset management via data science and advanced analytics.
How did your agile product development team at PG&E get started?
“Find what we can do better with the data we have now” was the charge one of my leaders gave me when I took on my current role, developing a new electric asset management solution.
The value of that early charge, and of the agile approach we took, is clear looking back on the success of our work this year developing a system tool that enables advanced analytics for asset risk. Why? Because there will never be a shortage of problems to solve, analytics to be done, and data to be mined, but the greatest value comes from answering the most difficult questions early and often. I think back often to that leader’s charge and try to encourage others to focus on what we can do better now; it keeps us focused precisely where we need to be. The most difficult business questions always remain: “What has value?” and “What do we do first?” Answering those questions, and continually testing new hypotheses about the answers, gives our teams focus, momentum, and direction.
To develop the new asset-related system tool, we formed a line-of-business and IT partnership to tackle data and analytics as part of PG&E’s Digital Catalyst group. The Digital Catalyst approach combines design thinking, agile development, and co-location of business and IT resources to “Build the right thing, build the thing right, and build it together.” It is a groundbreaking approach for a traditional utility, and it allowed us to co-create the analytics, the O&M prioritization-management tools, and the foundational data sets that were needed.
PG&E’s new digital tools are key to its transformation into the utility of the future, because they will allow us to adapt to a changing energy landscape: incorporating more renewable generation, opening new market opportunities, and enabling customers to be more involved in their energy use.
As a result, we have moved our asset-related system tools from the “R” to the “D” in our Research & Development work, transforming findings from prior proof-of-concept phases into products that provide real operational improvements and value.
The tools we developed use data and analytics to enable better asset management actions, drawing on information that flows between PG&E assets, personnel, and processes. While the accomplishments specific to that project are valuable and important, it is also worthwhile to share a wider set of lessons learned along the way. These lessons can be applied to analytics work in all areas of a utility, whether asset-centric or not.
Let’s talk about the wider lessons as well as the asset-centric project. First, could you tell us more about the Digital Catalyst group and the agile development approach that PG&E took?
In 2016, based on what we had learned from numerous past implementations of technologies, products, and software systems, we saw how important it was to interact with our internal customers and build solutions together, with active involvement from the personnel who will use the product.
The agile development teams at PG&E have the innovative and energized feel of a start-up company. While some may attribute that partly to our proximity to Silicon Valley, there are many factors at play. For one thing, PG&E is a hotbed for utility industry innovation, with one in five of the electric vehicles in the U.S. registered in our territory and more private solar than any other utility in the country. In addition, PG&E serves a population of more than 15 million, draws 60% of that energy from sources that emit no greenhouse gases, and provides the energy infrastructure for two-thirds of California, the 6th largest economy in the world. Our customers lead the country in fast adoption of technology, and they expect no less from PG&E.
Agile product development teams have, in my view, brought something to PG&E that is more important than all these external demands: community. Fundamentally, value has been driven by the culture and sense of community that exists among our core team of data scientists, and by the relationships we develop with the internal customers our product development teams serve. These teams have brought energy and inventiveness to co-developing tools with our core users.
An organization chart cannot communicate the relationships that have been so vital in this work so far. It does not matter whether data scientists “reside” in a specific business unit (which some may have called a “silo” in the past), or whether they are on some part of a hub and spoke model on an organization chart. The reality, in terms of what really matters, is much softer. Ultimately what matters is that you create a group of peers who are really into the work they are doing, and are committed to sharing their passion with others.
There were some interesting references to machine learning in the documentation of your work at PG&E. How would you describe machine learning, and what challenges do you see in implementing it?
Machine learning, that is, training algorithms to make predictions or perform key actions based on data, offers the prospect of helping utilities address many concerns of our time. About half of PG&E’s 20,000-plus employees are involved in O&M and other work in the field.
Consider how much knowledge exists in our workforce, including employees with many decades of operational experience across assets and systems, and relationships built over 30 or more years. Consider the benefits we can gain by making deeper use of data and machine learning to build optimization models and other tools that inform new solutions. The expertise of Operations personnel is astounding: their ability to look at a situation and determine the right course of action is something they have learned to do very naturally over many years. It is a very high analytical bar for data scientists trying to develop systems of databases and algorithms that emulate this.
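To give a concrete sense of what that kind of machine learning involves, here is a minimal sketch, in Python with scikit-learn, of training a classifier on historical asset records to score failure risk. The data file, column names, and five-year failure label are illustrative assumptions, not PG&E’s actual model or schema.

```python
# Hypothetical sketch: training a failure-risk classifier on historical asset data.
# The CSV file and column names are illustrative assumptions, not a real schema.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Load historical asset records: age, loading, inspection findings, and a label
# indicating whether the asset later failed or required replacement.
assets = pd.read_csv("asset_history.csv")
features = ["age_years", "peak_loading_pct", "inspection_score", "vegetation_density"]
X = assets[features]
y = assets["failed_within_5yr"]

# Hold out a test set so the model is evaluated on assets it has not seen.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = GradientBoostingClassifier()
model.fit(X_train, y_train)

# Rank assets by predicted failure probability to inform inspection priorities.
risk_scores = model.predict_proba(X_test)[:, 1]
print("AUC:", roc_auc_score(y_test, risk_scores))
```

Even a simple model like this only becomes trustworthy after repeated review with the field experts whose judgment it is trying to approximate.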
Was the strong team culture associated with your agile approach of benefit for advanced analytics initiatives like machine learning?
It may seem surprising, but yes. Our experienced employees have such a depth of understanding, and it takes quite a long time for people to develop trust in the algorithms and the data when they are coming from a system where they trust each other and their own experience. Training algorithms to capture even a portion of that understanding requires an iterative process, involving long-term collaboration between our Operations experts and our data scientists.
This teamwork between data scientists and operations personnel is at the heart of ‘Digital Transformation’ – it’s where the numerical meets the personal. Collaborative work environments that bring together people with insights from different fields are something I value and enjoy, and this collaboration helped us tremendously in 2017 to visualize complex data and convert it into tools for action and decision-making.
Looking ahead, these teams and collaborative tools will help us address the continually accelerating pace of change. As we do so, one of the great remaining hurdles is getting our business stakeholders acclimated to that speed of change, and evolving business processes to make use of operational analytics.
As we continue to build our analytics community and tools, an added benefit is how it helps us attract top talent to come work with us, and to retain that talent.
Utilities often think of themselves as “stodgy,” as if we are like the slow-moving Eeyore character from Winnie the Pooh, but it doesn’t have to be that way – and within our agile teams at PG&E, nothing could be further from the truth. I have friends working at Google, Facebook and startups who say they wish they were doing something as important and interesting as what we are doing here at PG&E. While we are using advanced analytics to design solutions to our world’s energy problems that really matter, they are designing the next solution to deliver somebody their lunch. Utilities should feel bold about the value of this mission, and use it to their advantage in today’s competitive workplace and changing energy landscape.
And what would you like to share with us regarding details of the system tools developed as part of your asset-related project?
When I joined as lead for PG&E’s asset-related system tool project, we had already been through a pilot phase, so the challenge was to convert what was learned in that phase into real operational changes and value. The Proof of Concept included four asset types:
- Distribution poles
- Primary overhead conductor
- Distribution substation transformers
- Distribution breakers
It also included building a web application that unified asset data sets from multiple sources and provided tabular, graphical, and map-based visualization of the assets.
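As an illustration of what unifying asset data sets from multiple sources can involve, here is a minimal sketch in Python/pandas that joins a tabular asset registry, GIS attributes, and the latest inspection results into one record per asset. The file and column names are hypothetical, not PG&E’s actual source systems.

```python
# Hypothetical sketch of unifying asset data from multiple source systems into
# one view per asset; table and column names are illustrative assumptions.
import pandas as pd

# Tabular asset registry (e.g., from an asset management system).
registry = pd.read_csv("asset_registry.csv")    # asset_id, asset_type, install_date
# Geospatial attributes exported from GIS for map-based visualization.
gis = pd.read_csv("asset_gis.csv")              # asset_id, latitude, longitude, circuit_id
# Inspection history, reduced to the most recent record per asset.
inspections = pd.read_csv("inspections.csv")    # asset_id, inspected_on, condition_score
latest = (inspections.sort_values("inspected_on")
                     .groupby("asset_id", as_index=False)
                     .last())

# Left-join onto the registry so every asset appears once,
# even when a source system is missing data for it.
unified = (registry.merge(gis, on="asset_id", how="left")
                   .merge(latest, on="asset_id", how="left"))

print(unified.head())
```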
We learned during the pilot that the asset-related solutions available on the market could satisfy only some of the objectives we had for the visualization and support system we were after. So we decided that, to build the capabilities we wanted, we needed a strong focus on the underlying data and on the analytic capabilities. In 2016, PG&E’s new Digital Catalyst organization launched an award-winning mobile product, Asset Inspection.
As I mentioned, the Digital Catalyst approach combines design thinking, agile development, and co-location of business and IT resources to “Build the right thing, build the thing right, and build it together.” Because it is such a groundbreaking approach for a traditional utility, we used it to partner Electric line-of-business people with Data & Analytics IT folks and co-create the analytics and the foundational asset-related system tools and data that were needed.
In early 2017, we joined forces with PG&E’s new Digital Catalyst organization. As part of our project we built algorithms at both an individual and an aggregate asset level (substation, circuit, sub-circuit, or asset type), with the ability to perform a basic level of “what-if” analysis (e.g., weighting scenario-related factors). The system also demonstrated how algorithms can be modified for “what-if” analysis (using the R language), and provided the ability to prepare user-defined reports and queries and export the results.
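To illustrate the idea of weighting scenario-related factors, here is a minimal sketch of a “what-if” risk roll-up. The project’s own algorithms were written in R; for consistency this illustration uses Python, and the factor names, weights, and grouping columns are assumptions rather than the production logic.

```python
# Hypothetical sketch of a weighted "what-if" risk roll-up; factor names, weights,
# and grouping columns are illustrative assumptions (the project itself used R).
import pandas as pd

def score_scenario(assets: pd.DataFrame, weights: dict) -> pd.DataFrame:
    """Compute per-asset risk as a weighted sum of factors, then aggregate by circuit."""
    assets = assets.copy()
    assets["risk"] = sum(assets[factor] * weight for factor, weight in weights.items())
    return (assets.groupby("circuit_id", as_index=False)["risk"]
                  .sum()
                  .sort_values("risk", ascending=False))

assets = pd.DataFrame({
    "asset_id": [1, 2, 3, 4],
    "circuit_id": ["A", "A", "B", "B"],
    "age_factor": [0.9, 0.4, 0.7, 0.2],
    "condition_factor": [0.8, 0.3, 0.5, 0.6],
})

# Re-run the same roll-up under two weighting scenarios to compare how priorities shift.
baseline = score_scenario(assets, {"age_factor": 0.5, "condition_factor": 0.5})
age_heavy = score_scenario(assets, {"age_factor": 0.8, "condition_factor": 0.2})
print(baseline, age_heavy, sep="\n")
```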
By co-creating the solution with the Data & Analytics IT folks, we are now able to use data and analytics to make better asset management decisions, and to target inspections, replacements, and other O&M activities based on key priorities.
Our team co-location and agile development led to the rapid establishment and use of PG&E’s new cloud-based analytics platform, including data labs and analytics capabilities on cloud infrastructure that support data science notebooks using Python, Apache Spark, and rapid application development. Data science runs that once took days, or were not even possible on desktops, can now be run in seconds. Such speed is crucial to the agile approach, which requires fast iteration and learning to develop ever-improving models and products over time.
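For a sense of the kind of workload that benefits from that platform, here is a minimal PySpark sketch that rolls individual asset risk scores up to substation and circuit level across a large data set; the storage path, table layout, and column names are illustrative assumptions, not PG&E’s actual environment.

```python
# Hypothetical sketch of a large-scale roll-up that moves from desktop to a cloud
# Spark cluster; the storage path and column names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("asset-risk-rollup").getOrCreate()

# Read the unified asset data set (e.g., Parquet files landed in cloud storage).
assets = spark.read.parquet("s3://analytics-lab/unified_assets/")

# Aggregate individual asset risk scores up to substation and circuit level.
rollup = (assets.groupBy("substation_id", "circuit_id")
                .agg(F.sum("risk_score").alias("total_risk"),
                     F.count("*").alias("asset_count"))
                .orderBy(F.desc("total_risk")))

rollup.show(20)
spark.stop()
```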
Thanks for a very interesting discussion Jen!
You’re welcome!
At PG&E, Jen Owens leads development of internal software, data analytics and data science for Electric Asset Management.
In her role as a member of PG&E’s Digital Catalyst group, Jen is leading the transition from traditional methods and software to new cloud infrastructure, data science and sensor data analytics.
Jen Owens has a BS from MIT, is a Certified Scrum Product Owner (CSPO), and holds a Master’s Degree in Renewable Energy and Grid Integration, which she studied at Carl von Ossietzky Universität Oldenburg (Germany) and University of Zaragoza (Spain). Her thesis was focused on Grid-Scale Energy Storage at the Centre for Hydrogen and Fuel Cell Research at the University of Birmingham (UK).