Like many utilities, Rappahannock Electric Cooperative (REC) purchases more power than it sells. Some of the discrepancy is attributed to system losses — commonly known as technical losses — due to energy dissipated in conductors, transformers and other equipment on transmission and distribution systems. But could some portion of the losses be attributed to other factors, such as theft? Could these system losses be reduced through better processes and system improvements?

These questions led to a detailed data analytics project aimed at providing answers to these longstanding challenges. The goal was to gain an accurate and in-depth understanding that could help better manage costs for members of this Fredericksburg, Virginia-based electric cooperative.

Building a Model

A digital model was crucial to understanding REC's complex transmission and distribution system. Large volumes of data were needed so the model could simulate load flows that accurately represented the relationships among loading, voltage and losses within the system.

Detailed analyses of loss reports, voltage values and precise loading ranges were required for each system node. As regularly refreshed data became available, it was uploaded to visualization programs to assist with validation methods and provide feedback that would lead to improvements and efficiencies.

The team used computational platforms to derive key data associations within the model. This was a necessary step to manage the massive volume of data being fed into the model, and it helped yield insights into critical relationships. This was especially helpful in instances when actual data from circuits and other sources was not available.

Engineering and Data Science

With a system-level model ready, the team turned to integrating several large and previously disjointed datasets. A full range of system operational and control data generated by the supervisory control and data acquisition (SCADA) system was a critical foundation. Additional data from advanced metering infrastructure (AMI) and meter data management (MDM) systems was then overlaid for a technical loss analysis that identified loading of the electrical infrastructure at specific periods. This provided a picture of the varying voltage levels and power flows across the distribution circuits systemwide.

The data science piece focused on providing the deeper insight needed to understand potential causes of system-level losses. It was understood, prior to the project, that several factors could be driving these losses. For this step, a deeper investigation was required, using advanced analytics algorithms designed to explore the relationships between the SCADA, AMI and MDM systems to detect whether losses were due to conditions such as impedance, or from unmetered loads or possible theft.

The models built for the engineering analysis to simulate the network helped the data scientists understand customer profiles and the different circuits that fed power to certain large commercial customers or residential areas. Based on the models, algorithms were built to help analyze correlations.

Purchase to Sales — Understanding the Difference

The essential question for the team was: What is driving usage jumps? If AMI readings exceeded average levels of kilowatt-hour usage, it was a clue to dig deeper. If it was a residential meter, could the spike have been caused by electric vehicle charging? If commercial or industrial meters showed unusual power consumption, other possible explanations were explored.
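The screening step described above can be sketched as a simple rule: flag any meter whose latest reading exceeds its own historical average by some multiple. The multiplier and the sample data below are assumptions for illustration, not REC's actual criteria.

```python
# Illustrative "usage jump" screen: flag meters whose latest kWh reading
# far exceeds their historical average. Threshold is a made-up assumption.
from statistics import mean

def flag_usage_jump(history_kwh, latest_kwh, multiplier=2.0):
    """Return True if the latest reading is a spike worth investigating."""
    return latest_kwh > multiplier * mean(history_kwh)

daily_history = [28, 31, 30, 29, 33, 30]  # typical residential daily kWh
print(flag_usage_jump(daily_history, 75))  # spike: EV charging? dig deeper
```

A flagged residential meter might then be checked against known explanations such as electric vehicle charging before theft is considered.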

The team reviewed billing data for wholesale energy purchased at the transmission delivery point and compared it with cleansed retail billing data. This comparison was a key step in quantifying how much energy REC was losing on its system. From a billing perspective, any purchased energy that was not sold was treated as a loss to REC.
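At its core, the purchase-to-sales comparison reduces to a small calculation. The sketch below uses hypothetical volumes, not REC's figures, to show the arithmetic: energy purchased minus energy billed yields the unaccounted total, which can then be split into technical and non-technical components by the deeper analysis.

```python
# Minimal sketch of the purchase-to-sales comparison. Figures are invented.

def estimate_losses(wholesale_kwh, retail_kwh):
    """Return absolute and percentage loss for a billing period."""
    loss_kwh = wholesale_kwh - retail_kwh
    loss_pct = 100.0 * loss_kwh / wholesale_kwh
    return loss_kwh, loss_pct

# Example: 10.5 GWh purchased at the delivery point, 10.0 GWh billed
loss_kwh, loss_pct = estimate_losses(10_500_000, 10_000_000)
print(f"Unaccounted energy: {loss_kwh:,} kWh ({loss_pct:.1f}%)")
```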

Accounting for Unmetered Loads

Data generated by the AMI and SCADA systems feed algorithms that identify locations where energy losses are occurring. The algorithm uses AMI data from each circuit, along with power flow estimates from the engineering technical study, to estimate the readings the SCADA systems are likely producing. These loading estimates are then refined to isolate the portions of energy load not accounted for by metering data. The estimates are cross-checked by other methods as well, including AMI-to-phase matching, AMI-to-transformer associations and other techniques aimed at approximating the source SCADA levels.
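The residual calculation implied above can be sketched for a single circuit and interval: subtract the metered AMI total and the engineering model's technical-loss estimate from the SCADA feeder reading, and what remains is energy unaccounted for by metering, whether unmetered load, theft or measurement error. The function name and numbers are hypothetical.

```python
# Hedged sketch of the per-circuit, per-interval residual. All values are
# illustrative assumptions, not REC data.

def unaccounted_kwh(scada_feeder_kwh, ami_sum_kwh, technical_loss_kwh):
    """Energy not explained by metering or modeled technical losses."""
    return scada_feeder_kwh - ami_sum_kwh - technical_loss_kwh

residual = unaccounted_kwh(scada_feeder_kwh=1_200.0,
                           ami_sum_kwh=1_120.0,
                           technical_loss_kwh=55.0)
print(f"Unaccounted energy this interval: {residual:.1f} kWh")
```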

It is typical for utilities to have contracts and rates that bill customers on monthly or other periodic cycles based on total kilowatt-hours used, rather than on energy consumption measured hourly. For this project, that was a challenge because it required formulating a method to accurately estimate losses and loads for given periods of time.
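One common way to bridge that gap, sketched here under invented numbers, is to spread a customer's monthly billed kWh across intervals in proportion to a class load profile, so the result can be compared against hourly SCADA readings. The profile weights below are assumptions for illustration.

```python
# Minimal sketch: allocate a monthly kWh total across intervals using a
# hypothetical class load profile (relative weights, invented here).

def allocate_monthly_kwh(monthly_kwh, profile):
    """Split a monthly total into interval estimates weighted by a profile."""
    total = sum(profile)
    return [monthly_kwh * w / total for w in profile]

profile = [0.8, 0.6, 1.0, 1.4, 1.2]  # relative interval weights
estimates = allocate_monthly_kwh(500.0, profile)
print([round(e, 1) for e in estimates])
```

The allocated intervals always sum back to the billed total, so the method redistributes rather than invents energy.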

Actionable Insights

The project has led to the creation of digital dashboards providing data in near real time. These tools give REC actionable insights into how load and losses behave across its system, and they are aligning core systems for a precision approach to prioritizing and reducing losses in the future.

The project involved stepping back for a macro view of some basic questions, such as: What is the correlation between AMI and SCADA data? What does the line loss look like? Ultimately, the goal is to make all the puzzle pieces fit so a complete picture emerges.

The result was a detailed understanding of how and when losses were occurring, as well as a granular view of system performance at the delivery point, substation and circuit levels at various times, providing clues to help solve the puzzle.

Those calculations are driving insights that will sustain prudent management of wholesale power costs on behalf of the cooperative's member-owners.


Carol Bogacz is an associate data scientist at Burns & McDonnell. With more than 25 years of experience analyzing data, Carol specializes in utilizing advanced statistical methods to identify insights that enable data-driven decisions. She has experience applying machine learning and advanced analytics techniques to a variety of business problems such as investment planning, load forecasting, operational efficiency, and outage management.