To keep the lights on, electric utility companies need to keep their power-generating turbines spinning. One Arizona-based utility has turned to advanced analytics techniques to maximize turbine uptime.
The Salt River Project (SRP) provides electricity to the Phoenix metropolitan area and water to most of central Arizona, making it one of the nation's largest utility providers. Demands on the system's infrastructure are high, given that the region is one of the hottest and driest in the country.
To improve uptime, engineers turned to smart grid technology that monitors sensor data streaming from the generators. When a turbine sits idle, they can dispatch work crews to perform preventive maintenance. They also analyze this data over time to predict when machines are likely to need maintenance and when they're likely to be in use.
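The article doesn't describe SRP's detection logic, but the basic idea of spotting maintenance windows in a stream of sensor readings can be sketched in a few lines. Everything here is hypothetical: the one-reading-per-minute cadence, the RPM signal and the thresholds are invented for illustration.

```python
# Hypothetical sketch: scan minute-by-minute turbine RPM readings for
# sustained idle stretches that could serve as maintenance windows.
# The signal, threshold and window length are all assumptions.
def idle_windows(rpm_readings, idle_threshold=10.0, min_minutes=30):
    """Yield (start, end) index pairs of runs where RPM stays below
    idle_threshold for at least min_minutes consecutive readings."""
    start = None
    for i, rpm in enumerate(rpm_readings):
        if rpm < idle_threshold:
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_minutes:
                yield (start, i)
            start = None
    if start is not None and len(rpm_readings) - start >= min_minutes:
        yield (start, len(rpm_readings))

# A stream with one long idle stretch: running, 45 idle minutes, running.
readings = [3600.0] * 100 + [0.0] * 45 + [3600.0] * 100
print(list(idle_windows(readings)))  # one window covering minutes 100-145
```

Scanning for runs like this is what turns raw sensor streams into the "days of access" Petruso describes below: windows the crews previously didn't know they had.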
"We've got lots of days that we have access to those turbines that we didn't know we had," said Steve Petruso, a senior software developer at SRP.
This type of initiative is becoming increasingly common in the utility industry. McKinsey & Company has estimated that smart grid technology will generate up to $130 billion in value in the U.S. by 2019, largely from reduced power use by consumers and less economic disruption caused by power outages. Utility companies are putting sensors in generating equipment, on power lines and at the point of consumption in the form of smart meters, all of which generate troves of data ripe for analysis.
The SRP team doesn't stop at monitoring its machine data and doing predictive maintenance. It also applies predictive analytics to forecast power supply and demand, ensuring it has adequate electricity available without overproducing.
To do this, the team built a neural network predictive model that compares historical weather data with past demand levels. The model then analyzes weather forecasts, recent consumer energy usage patterns and data from generators to determine whether the system should produce more or less power at any given moment.
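SRP builds its models in SAS Analytics, and the article doesn't describe the network's architecture or features. As a purely illustrative sketch, a weather-to-demand neural net might look like the following in Python with scikit-learn (our substitution, not SRP's tooling); the synthetic data, feature choices and network size are all assumptions.

```python
# Illustrative only: train a small neural network to map weather features
# to electrical load, then run a forecast through it. Data is synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic "historical" training set: temperature (F) and hour of day as
# features, metered load (GW) as the target. In practice these would come
# from years of weather records and demand history.
temps = rng.uniform(70, 115, size=500)
hours = rng.integers(0, 24, size=500).astype(float)
# Toy relationship: load rises with heat (air conditioning) and peaks midday.
load_gw = (2.0 + 0.035 * (temps - 70)
           + 0.3 * np.sin(np.pi * hours / 24)
           + rng.normal(0, 0.05, size=500))

X = np.column_stack([temps, hours])
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0),
)
model.fit(X, load_gw)

# Feed tomorrow's weather forecast through the trained network.
forecast = np.array([[112.0, 15.0]])  # 112 F at 3 p.m.
print(f"Predicted load: {model.predict(forecast)[0]:.2f} GW")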
"By getting more data points, we think we can better forecast our load needs," said Corby Gardiner, an economist at SRP.
SRP gets all of its data through an infrastructure service from OSIsoft that collects streaming data from machine sensors and makes it available to analytics and visualization applications. For its predictive modeling, SRP uses SAS Analytics.
Combining the two tools lets analysts pull data in small batch processing jobs that can run as frequently as every minute, giving teams up-to-the-minute information on which to base decisions about increasing or decreasing energy production.
SRP also plans to take advantage of the fresh data it collects from its smart grid technology to manage transmission regulations. Under federal law, energy generators cannot push out more power than stipulated in transmission agreements. Federal regulators enforce these rules by monitoring hourly transmission averages. By having minute-to-minute data, engineers can ensure SRP doesn't exceed these averages.
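The mechanics of that check are simple: keep a rolling window of the last 60 one-minute readings and watch the running hourly average against the contractual cap. The sketch below is hypothetical; the 400 MW limit, the class name and the simulated ramp are invented, since the article doesn't publish SRP's actual figures or code.

```python
# Hypothetical sketch: track a rolling hour of one-minute transmission
# readings and flag when the hourly average would breach the cap.
from collections import deque

HOURLY_LIMIT_MW = 400.0  # assumed transmission-agreement cap, not SRP's real one

class TransmissionMonitor:
    """Holds the last 60 one-minute readings and checks the hourly average."""

    def __init__(self, limit_mw):
        self.limit_mw = limit_mw
        self.readings = deque(maxlen=60)  # one reading per minute

    def record(self, mw):
        """Add a minute reading; return True while the hourly average is safe."""
        self.readings.append(mw)
        return self.hourly_average() <= self.limit_mw

    def hourly_average(self):
        return sum(self.readings) / len(self.readings)

monitor = TransmissionMonitor(HOURLY_LIMIT_MW)
for minute in range(60):
    # Simulated output ramping from 380 MW toward 430 MW over the hour.
    output = 380.0 + minute * (50.0 / 59.0)
    if not monitor.record(output):
        print(f"Minute {minute}: hourly average would exceed the cap; curtail")
        break
```

The point of the minute-level feed is exactly this: the breach surfaces mid-hour, while there is still time to curtail, rather than after regulators compute the hourly average.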
"Previous to this, they were kind of at the mercy of looking at hourly data, which wasn't really useful," Gardiner said.