Beyond outages: Using reliability data

A utility’s reliability statistics do more than describe how many outages it has had or how long customers have spent in the dark. By diving into the data — and making it actionable — public power utilities can identify trends, prioritize infrastructure investments, and improve customer relationships.

Identifying patterns

City Utilities of Springfield in Missouri takes a quarterly look at areas or customers that have been out of power either three times or for a total of 10 hours in a year. These are referred to as “3-10” customers.

“Playing around with the numbers, we have found pretty consistently that’s when people believe they have unreliable service,” said Brent McKinney, director of electric transmission and distribution at City Utilities of Springfield. “Many times, when people call in, it’s when people have had their third outage ... [However,] a lot of people are mad but don’t call in. This helps us to proactively reach out, instead of them being mad enough to finally call us.”

Once the utility pulls this data, Nathan Bruns, administrator of electric T&D reliability in Springfield, analyzes why these areas might be more problematic and then plans for special jobs to make improvements. Bruns also proactively contacts customers in those areas to let them know of the work being done and why.
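The “3-10” screen itself is simple to automate once outage records exist in any tabular form. A minimal sketch, assuming records have already been rolled up to one row per customer per outage with a duration field (the record layout and field names here are hypothetical, not City Utilities’ actual schema):

```python
from collections import defaultdict

def flag_3_10_customers(outages):
    """Flag "3-10" customers: three or more outages, or ten or more
    cumulative outage-hours, within the reporting year."""
    counts = defaultdict(int)
    hours = defaultdict(float)
    for rec in outages:  # one record per customer per outage
        counts[rec["customer_id"]] += 1
        hours[rec["customer_id"]] += rec["duration_hours"]
    return sorted(
        cid for cid in counts
        if counts[cid] >= 3 or hours[cid] >= 10.0
    )

# Illustrative records for a single reporting year
records = [
    {"customer_id": "A", "duration_hours": 0.5},
    {"customer_id": "A", "duration_hours": 1.0},
    {"customer_id": "A", "duration_hours": 0.2},   # third outage -> flagged
    {"customer_id": "B", "duration_hours": 11.0},  # over 10 hours -> flagged
    {"customer_id": "C", "duration_hours": 2.0},
]
print(flag_3_10_customers(records))  # ['A', 'B']
```

The same pass could run from a spreadsheet export; nothing about the screen requires smart meters or specialized software.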

Bruns has a background in geographic information systems, so he visually maps out the data to see if there are any hot spots or problem areas. “It helps to find those cluster areas a lot faster,” he said. 

“We do statistical analysis first, then allocate our manpower,” said McKinney. “The worst thing you can do is do maintenance on a line that doesn’t really need it. When we’re doing tree trimming or maintenance, we always look at the SAIDI or SAIFI of the feeders and [concentrate on] the worst first. We may find that finding the clusters and fixing the clusters will change the whole dynamics of the area. Find those hot spots and work on those first.”

Jackson Center Municipal Electric System in Ohio also looks for patterns to prioritize maintenance. “If we have a section of town where we have multiple issues with the line — fuses, multiple arrestors, etc. — that’s when we’ll take it from our infrastructure budget to get the whole line replaced,” said Bruce Metz, village administrator for Jackson Center.

“What we’ve found is the macro look at your reliability data about how you or your substations are performing is great, but the more you can narrow it down, the better,” said McKinney. “It’s the old 80/20 rule: 80 percent of your problems are coming from 20 percent of your system.”
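McKinney’s “worst first” triage is, in effect, a Pareto ranking: sort feeders by their contribution to system interruption minutes and work down the list. A minimal sketch, assuming per-feeder customer-minutes have already been pulled from the outage system (feeder names and figures below are illustrative, not Springfield’s data):

```python
def rank_feeders(feeder_minutes):
    """Rank feeders worst-first by customer-minutes of interruption,
    with each feeder's cumulative share of the system total."""
    total = sum(feeder_minutes.values())
    ranked = sorted(feeder_minutes.items(), key=lambda kv: kv[1], reverse=True)
    running = 0.0
    table = []
    for feeder, minutes in ranked:
        running += minutes
        table.append((feeder, minutes, running / total))
    return table

# Customer-minutes of interruption per feeder (illustrative numbers)
contribution = {"F1": 120.0, "F2": 30.0, "F3": 10.0, "F4": 300.0, "F5": 40.0}
for feeder, minutes, cum_share in rank_feeders(contribution):
    print(f"{feeder}: {minutes:6.1f} min  cumulative {cum_share:5.1%}")
```

With these made-up numbers, the top two of five feeders account for 84 percent of interruption minutes, which is the 80/20 pattern McKinney describes.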

City Utilities also maps out areas with more tree coverage or that tend to have more animal contact to try to get more wildlife protection or tree trimming in those areas. 

“Little stuff like that helps with the big picture. Really fixating on those smaller areas, taps with maybe only 30 houses on it, will really drive your numbers up,” said Bruns. “Those small outages can really make a big difference if you are a small company like us.”

Getting the right information

When it comes to making data actionable, sometimes it is a matter of how a utility can sift through the data it has to make sense of it. 

“Getting distracted by everything we could possibly collect about customers or equipment doesn’t help us be more reliable,” said Alex Hofmann, director of energy and environmental services at the American Public Power Association. “We need to understand what is the most useful in making the best investments in reliability and ensure that everyone has access and the ability to collect that information.”

The village of Jackson Center used to experience numerous momentary outages, said Metz. To find out more information about what was causing those momentary outages, and to plan and track improvements, the public power utility subscribed to the American Public Power Association’s eReliability Tracker service and hired a firm, Exacter, to take annual readings of equipment. 

“It’s actually very simple. Really, all we do is look at our list of outages, where they are at, and what causes them. Then what Exacter does is tell us where outages might be coming,” said Metz. “That has helped us prioritize our different pole lines that we are changing. We use it as a guideline to see which section, town, street is next to be upgraded.” 

This kind of data collection and analysis doesn’t require investment in sophisticated technology. 

“You don’t need [advanced metering infrastructure] to try and meet a certain expectation for your customers; you just have to have some way to siphon through the reporting of your outages, and take that information and put it to good use,” said Bruns.

“The hardest thing we had to do was work through the procedures, how were we going to look through the data,” said McKinney. “We started first with spreadsheets and kept going from there. Figuring out what you want is the hardest part. It is humbling, too.”

Hofmann noted the importance of having a data policy in place to help support trusted data collection. And whether data is entered manually or collected from smart meters, utilities should check data sets regularly for common errors — from rounding to how accurately meters are mapped to circuits and protective devices.

Collecting reliability data is also about putting it into context for a variety of stakeholders. “It is not easy to understand reliability data,” said Hofmann. “Even when boiled down, it is not easy for people to understand or to communicate. This is where benchmarking comes into play.” 

Hofmann gave the example of how a utility’s SAIFI can decrease but SAIDI remain the same, which will mean that CAIDI will increase. Explaining what drives these numbers can be helpful, because “people won’t understand that you can see an improving trend that looks like you are getting worse per customer,” said Hofmann. 
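The arithmetic behind Hofmann’s example follows from the standard definition CAIDI = SAIDI / SAIFI: average interruption minutes per interrupted customer. A short illustration with made-up numbers shows how fewer interruptions with the same total minutes pushes CAIDI up:

```python
def caidi(saidi, saifi):
    """CAIDI (average minutes per interrupted customer) = SAIDI / SAIFI."""
    return saidi / saifi

# Year 1: SAIDI 90 min, SAIFI 1.5 interruptions per customer
# Year 2: SAIFI improves to 1.0, but total minutes stay the same
print(caidi(90, 1.5))  # 60.0
print(caidi(90, 1.0))  # 90.0 -> CAIDI rises even as SAIFI improves
```

This is the counterintuitive trend Hofmann warns about: an improving SAIFI can make the per-customer duration metric look worse.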

Hofmann said that utilities doing benchmarking should seek data sources that cover the same information and criteria as the utility’s own data. He also advised that utilities benchmark against utilities of similar size or in the same region, as it is easier to draw comparisons across similar weather patterns or conditions. 

By looking at the data over time in the eReliability Tracker, Metz found that squirrels caused an average of about six outages per year prior to 2015. 

“Our council is very big on reliability,” said Metz. “They know that the closer we get to the [investor-owned utilities] in price, we have to do something to up the reliability.”

Metz also regularly gets together with leaders from other area utilities to compare data and identify trends. “We look at our own data, and if everyone is having the same issue, then we talk about what [each of us is] doing, [and] any other tracking on the issue,” he said.

Metz asked the group for advice on the latest wildlife protection strategies and got recommendations for effective squirrel guards. The utility installed squirrel guards and other devices on lines in 2015, and the average number of squirrel-related outages is now down to about 2.6 per year. 
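The before-and-after comparison Metz describes is a per-cause annual average split at the intervention year. A minimal sketch, assuming outage records reduce to (year, cause) pairs; the log below is fabricated to mirror the averages in the story, not Jackson Center’s actual data:

```python
from collections import defaultdict

def annual_cause_averages(outages, split_year):
    """Average outages per year by cause, before vs. from split_year on."""
    before, after = defaultdict(int), defaultdict(int)
    years_before, years_after = set(), set()
    for year, cause in outages:
        if year < split_year:
            before[cause] += 1
            years_before.add(year)
        else:
            after[cause] += 1
            years_after.add(year)
    return ({c: n / len(years_before) for c, n in before.items()},
            {c: n / len(years_after) for c, n in after.items()})

# (year, cause) pairs: 12 squirrel outages over 2 pre-guard years,
# 13 over 5 post-guard years (illustrative counts)
log = ([(2013, "squirrel")] * 6 + [(2014, "squirrel")] * 6
       + [(2015, "squirrel")] * 3 + [(2016, "squirrel")] * 3
       + [(2017, "squirrel")] * 3 + [(2018, "squirrel")] * 2
       + [(2019, "squirrel")] * 2)
pre, post = annual_cause_averages(log, split_year=2015)
print(pre["squirrel"], post["squirrel"])  # 6.0 2.6
```

The eReliability Tracker categorizes outages by cause, which is what makes this kind of split possible without any custom tooling.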

City Utilities of Springfield uses both internal and external benchmarks to measure the utility’s performance. Each quarter, the utility tracks the percentage of customers that fall into the 3-10 report, which it has tracked for about seven years. The public power utility also looks at the national averages for metrics including SAIDI and SAIFI and has a goal to be at 80 percent of the national average or better.

Improving customer service

Metz shared how Jackson Center’s outages, squirrel-related or not, have decreased, and how the process has helped to eliminate many of the momentary outages. “Every time we flickered, I would wait for another phone call,” he said. Now, Metz said, he doesn’t get these calls. 

Customer calls have also gone down for Bruns. In addition to the outreach to 3-10 customers, City Utilities also conducts customer outreach after certain events, such as a feeder lockout, to give customers an idea about what affected the power supply. 

“On an individual customer basis, I’ll tell them what caused the outage on one day or another, to try to help them understand the system and what we’re going to do to fix it, or what we can’t do,” said Bruns. “Giving them the high level of what caused an outage has really helped. I used to get 15 to 20 calls about a feeder lockout, and now it … may spur some comments, but it has helped educate and keep some people at bay. They know there are some things that we just can’t control.”

Bruns also attends town halls to educate customers about how the grid is organized, how outages get reported, how crews are dispatched, and the statistics the utility tracks. For meetings in specific neighborhoods, Bruns starts off by presenting the overall statistics for the neighborhood compared to the whole system, good or bad. He also uses data to tailor each presentation to focus on problems specific to the community, such as how vehicle crashes in areas with high-capacity streets can lead to outages. 

“Most people generally don’t want to talk math. They just want to know what’s going on. We just tell them that [outages are] unsatisfactory. We never tell them what our average SAIDI or SAIFI is,” said McKinney. “The worst thing you can tell a customer is that you have overall good reliability — they care about their own reliability.” 

“I don’t know how much you can say the SAIFI or SAIDI really get impacted by this, but our customer satisfaction is definitely higher, which is always a benefit,” said Bruns. 

McKinney noted that satisfaction isn’t just measured in reduced call volume, but in increased ratings on customer surveys. Before going to work on areas identified in the 3-10 analysis, City Utilities ensures line crews have been briefed on the numbers so that they are prepared in case customers come out to talk while they are working.