How to Tackle Big Data

With more responsibility comes more headaches, utilities are finding.


by Frank Hoss, Capgemini

With the growth of smart meters, utilities have taken on the added responsibility of capturing and leveraging data to support better decision-making. U.S. utilities are experiencing several headaches, and they are falling behind their peers around the globe.

A recent energy efficiency ranking published by the American Council for an Energy-Efficient Economy (ACEEE) ranks the U.S. ninth among global energy leaders. According to the ranking, U.S. utilities have made “limited or little progress toward greater efficiency at the national level.”

U.S. utilities are at a crossroads. In 2011, the federal government took a special interest in the smart grid.

With $4.5 billion in federal stimulus grants and the implementation of aggressive renewable and energy efficiency standards, smart grid projects increased dramatically across the U.S. The grants focused primarily on infrastructure projects, such as adding equipment to modernize and improve electrical grid operation.

Smart meter programs along with other projects are creating exponentially more data (greater than 100-fold), and utilities are struggling to manage, analyze and use this data efficiently and effectively.

Cue the Big Data Challenge

Barriers stand in the way of utilities’ ability to successfully manage big data.

A recent Oracle study, “Big Data, Bigger Opportunities,” surveyed 151 North American executives at utilities with smart meter programs. It found that on a scale of one to 10, respondents rated themselves a 6.7 on their “readiness to manage big data.”

The same survey described challenges consistent with Capgemini’s big data study, finding that organizational silos and a dearth of data specialists are key obstacles that prevent big data from accomplishing its biggest objective: enabling more effective decision-making.

Data silos are a perennial problem that the business process re-engineering revolution of the 1990s failed to resolve.

The big data study noted that regulation and the emergence of trusted data aggregators might help break down today’s application silos.

A longer-term challenge is the lack of skilled analysts. Technology companies are working with universities to train tomorrow’s data specialists, but it is unlikely that supply will meet demand soon. Expect a talent war for the best data analysts.

Without the ability to use this data and lacking the personnel to do so, utilities are struggling to see the returns on their smart grid investments.

The original $4.5 billion in stimulus grants is nearly exhausted, leaving utilities to invest additional funds of their own.

In a recent Black & Veatch survey, “Strategic Directions in the U.S. Electric Utility Industry Report,” 70 percent of utility executives said smart grid spending will slow when the grants run out. An even bigger challenge has been navigating regulation.

The difficulty of getting filings fully approved or supported substantially limits smart grid progress. Investments of this type must be supported by utility customers.

Some customers view grid modernization to improve energy efficiency as the environmentally sound choice, but laws to curb greenhouse gas production are enacted slowly.

Given how utilities have managed big data so far, it is hard to imagine executives dedicating more funding to it.

Demonstrating big data’s value and obtaining the needed regulatory, legislative and public support for funding require that utilities know what data to use and how to leverage it. That starts with evaluating smart meter programs and the volume of data they deliver.

The Oracle study found that the two largest amounts of data being extracted are power outage (78 percent) and voltage (73 percent). Access to power outage data can yield significant results for utilities and customers.

For example, during a power outage, it is critical that a utility knows when and where the outage occurred and the likely faulted component.

Traditionally, utilities depended on customers to report outages, then employed an outage management system (OMS) to analyze the outage patterns and identify the most likely faulted component or components.
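The OMS inference described above can be sketched in code: in a radial feeder hierarchy, the most likely faulted component is the deepest device that feeds every reporting customer. This is a minimal illustration, not any vendor's OMS; the device names are hypothetical.

```python
# Minimal sketch of OMS-style fault inference: given customer outage
# reports and a radial feeder hierarchy (child -> parent device), the
# most likely faulted component is the deepest device common to every
# reporting customer's supply path. Device names are hypothetical.

def path_to_source(device, parent):
    """Return the chain of devices from `device` up to the substation."""
    chain = []
    while device is not None:
        chain.append(device)
        device = parent.get(device)
    return chain

def likely_faulted_component(reports, parent):
    """Deepest device that appears on every reporting customer's path."""
    paths = [path_to_source(d, parent) for d in reports]
    common = set(paths[0]).intersection(*map(set, paths[1:]))
    # The deepest common device appears earliest in any single path.
    for device in paths[0]:
        if device in common:
            return device

# Hypothetical radial feeder: transformers behind fuses behind a feeder.
parent = {
    "xfmr_1": "fuse_A", "xfmr_2": "fuse_A",
    "xfmr_3": "fuse_B",
    "fuse_A": "feeder_1", "fuse_B": "feeder_1",
    "feeder_1": None,
}

print(likely_faulted_component(["xfmr_1", "xfmr_2"], parent))  # fuse_A
print(likely_faulted_component(["xfmr_1", "xfmr_3"], parent))  # feeder_1
```

Two reports behind the same fuse implicate that fuse; reports on different fuses push the inference upstream to the shared feeder.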

Smart meters give utilities ready access to data that tells them when power is out and restored in the distribution system at customers’ meters.

During major weather events, nested outages—outages within outages—typically occur, often accounting for more than 40 percent of customers without power.

Before smart meters, as crews restored power they had to spend extra time driving the lines to identify remaining nested outages.

With smart meters, utilities can ping selected customers’ meters to determine if their power has been restored. This reduces the time customers are without power and resources utilities must spend to restore power.
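The post-restoration sweep can be approximated in a few lines: after an upstream repair, ping each affected meter and flag non-responders as candidate nested outages. The function and meter IDs below are illustrative, and `ping_meter` stands in for a real AMI head-end call.

```python
# Illustrative sketch: after restoring an upstream device, ping the
# meters it feeds; meters that still do not respond indicate nested
# outages that crews would otherwise find by driving the lines.
# `ping_meter` is a stand-in for a real AMI head-end system call.

def find_nested_outages(restored_meters, ping_meter):
    """Return meters still dark after an upstream repair."""
    return [m for m in restored_meters if not ping_meter(m)]

# Simulated AMI responses: meter 1003 sits behind a second, nested fault.
responses = {1001: True, 1002: True, 1003: False, 1004: True}
nested = find_nested_outages(sorted(responses), responses.get)
print(nested)  # [1003]
```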

This information about the loss and restoration of power can also be used to automatically notify customers who are away from their primary residences during an outage. Beyond outage data, voltage information, available at the meters and throughout the electric distribution system, provides further benefits. Voltage varies across the distribution system.

Customer equipment has manufacturer-specified voltage ranges established for optimal performance and protecting the equipment from damage. Utilities use load-tap changers, voltage regulators and capacitor banks to help control these voltage variations. With voltage data readily available across the entire distribution system, utilities can control these devices better and in some cases make changes in the system to improve the voltage profiles.
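With meter-level voltage data, flagging out-of-band service points becomes a simple screen. The sketch below uses the ANSI C84.1 Range A service band for 120 V nominal service (roughly 114–126 V); the meter IDs and readings are invented.

```python
# Screen meter voltage readings against a service band. ANSI C84.1
# Range A for 120 V nominal service is roughly 114-126 V; readings
# outside it are candidates for load-tap changer, voltage regulator
# or capacitor bank action.

NOMINAL_LOW, NOMINAL_HIGH = 114.0, 126.0

def out_of_band(readings, low=NOMINAL_LOW, high=NOMINAL_HIGH):
    """Return {meter: volts} for readings outside the service band."""
    return {m: v for m, v in readings.items() if not (low <= v <= high)}

readings = {"m1": 119.8, "m2": 127.4, "m3": 112.9, "m4": 121.0}
print(out_of_band(readings))  # {'m2': 127.4, 'm3': 112.9}
```

A high reading such as m2's suggests lowering a regulator tap or switching out a capacitor bank; a low reading such as m3's suggests the opposite.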

Many utilities also are implementing distribution management systems (DMS). Using the DMS and the same voltage data, they can provide better volt/VAR control. Once utilities have the data, they must make sense of it and extract actionable insights.

First, utilities must develop their overall strategies and determine what is in use across their respective companies. Then, utilities must standardize core tools and solutions to optimize access and analysis of data.

Next, the focus turns to the tools themselves: data access, visualization and analysis. Each can yield results and provide long-term benefits to utilities and their customers.

Similar to other industries, utilities often found that the needed data and access to it were not available or consistent at the enterprise level; the data was trapped in organizational silos.

Technology advancements in data visualization and virtualization have overcome this. The work utilities do is visual: from the physical location of infrastructure, to the location of customers, to knowing where power outages and restoration crews are, to load and voltage profiles.

Data visualization is needed to identify the trends and problems across all of them. Data virtualization allows for source ownership of data where it should be, yet transparently brings the data together as needed to support analyses. It reduces the need or tendencies for organizations within utilities to replicate data, which can result in the “loss of the source of data truth” and different results for the same analyses.

Capgemini is working with BC Hydro to implement its Smart Metering and Infrastructure Program. The effort includes deploying some 2 million smart meters, the accompanying two-way telecommunications network and supporting applications including energy theft analytics and systems integration.

Data visualization has allowed the companies to readily identify where deployment is ongoing, see where network performance tuning is needed based on the percentage of smart meters registered, and coordinate work efforts with other business units.

As more meters are deployed, BC Hydro will be able to use the meter data to visualize loading, voltage and other parameters across its distribution system.

Building on this, BC Hydro is pursuing a comprehensive enterprise analytics solution to provide corporatewide access to this big data, make sense of it and extract insights. Multiple vendors provide solutions in this area and typically focus on niche areas.

For example, companies such as Space-Time Insight focus on providing utilities with visual agility through virtual integrated situational awareness programs.

Companies such as SAS focus on utility analytics, and Teradata has developed a comprehensive smart grid data model and analytics business intelligence solution.

Utilities must approach data analysis with a clear understanding of what they are trying to learn. Then they can determine the best vendors for their needs.

For utilities to overcome the big data challenge, they cannot be overwhelmed by the talent shortage or the enormous volume of data coming through their doors.

A simple, critical step that often is overlooked is making personnel aware of available data. Every utility’s long-term objective is the same: Deliver valuable insights to all stakeholders.

The key is knowing how to extract value from available data resources. No actionable insights will result if utilities don’t know what data to look at or how to leverage it.

Author

Frank Hoss is a principal in Capgemini’s smart energy utilities practice focused in the electric and gas utilities industry. Hoss has more than 26 years of experience working with utility clients. For six years, he has focused on the development of advanced metering and distribution smart grid solutions, working with clients to understand and quantify the operational, engineering and customer impacts associated with proposed smart grid solutions.
