Big Data = Big Challenges for Big Substations

Intelligent and Secure Sensing at the Edge

By Brett Sargent

Steadily growing electrical demand combined with aging substations and an aging workforce is placing extreme pressure on our energy infrastructure.

Power consumption continues to grow an average of 3 to 5 percent per year globally. Many substations in North America and Europe are more than 50 years old and running above nameplate ratings. Up to 10 percent of the power we generate today is lost in transmission. And the utility workforce, with its decades of expertise and knowledge, is aging and retiring. The result is a perfect storm.

The energy grid is being steadily improved, but the work is immensely expensive and will take decades to complete. In the meantime, faced with the threat of more frequent power outages, utilities must focus on optimizing existing generation, transmission and distribution systems: keeping them longer and driving them harder.

What utility operators need is a smart approach that leads to substation life extension (SLEx). While smart meters give utilities feedback on customers' consumption and usage patterns at the endpoint, they do little to ensure the reliability of the energy supply chain. A truly smart energy grid requires intelligence across the entire infrastructure, including generation and distribution.

Sensor technology is a critical component of SLEx. The value of continuous online sensor monitoring on transmission and distribution systems, both existing and new builds, is well documented: asset optimization, the ability to conduct condition-based maintenance and detect component failure before it occurs, and safe dynamic loading.

According to a December 2009 Electric Power Research Institute (EPRI) study, several sensors are expected to come online to enable a smart transmission system.

Most utility operators are familiar with online dissolved gas analysis (DGA), bushing monitoring, winding hot spot monitoring, top oil temperature monitoring, partial discharge monitoring, load tap changer (LTC) monitoring and SF6 circuit breaker monitoring.

A wide array of current transformers (CTs) and potential transformers (PTs) are located throughout substations. Older substations typically have fewer sensors installed, relying instead on old-style analog gauges and sight glasses that require someone to physically inspect them and log the readings. Older substations are also burdened with poor communications infrastructure that cannot support sending large amounts of data back to a central location. These older substations are good candidates for sensor upgrades aimed at asset and substation optimization. Fortunately, many sensors now used in industry can be retrofitted onto aged assets without much difficulty. A few embedded sensors, such as fiber optic temperature measurement for winding hot spots, can only be installed on new transformers or on those that have been overhauled and refurbished.

Analysis of gases produced in the insulating oil inside transformers is one of the most important elements of monitoring transformers and reducing outages caused by faulty LTCs. By putting DGA monitors online, utilities can continuously monitor these conditions from a PC or tablet, rather than relying on the sporadic and time-consuming lab analysis protocol many utilities use today.
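
The workflow is simple in principle: the online monitor samples dissolved gas concentrations continuously, and only readings that cross a limit generate an alert. The minimal Python sketch below illustrates the idea; the monitor reading and the alarm thresholds are hypothetical examples, since real limits come from standards guidance (e.g., IEEE C57.104) and each utility's own practice.

```python
from typing import Dict

# Hypothetical alarm thresholds in parts per million (ppm) for key fault gases;
# real limits come from standards guidance and utility practice.
ALARM_PPM: Dict[str, float] = {
    "hydrogen": 100.0,         # H2: partial discharge, general overheating
    "acetylene": 1.0,          # C2H2: arcing
    "ethylene": 50.0,          # C2H4: thermal faults in oil
    "carbon_monoxide": 350.0,  # CO: cellulose (paper) degradation
}

def check_dga_sample(sample_ppm: Dict[str, float]) -> Dict[str, float]:
    """Return only the gases that exceed their alarm threshold."""
    return {gas: ppm for gas, ppm in sample_ppm.items()
            if ppm > ALARM_PPM.get(gas, float("inf"))}

# Example reading from a hypothetical online DGA monitor.
reading = {"hydrogen": 140.0, "acetylene": 0.2,
           "ethylene": 12.0, "carbon_monoxide": 90.0}
exceptions = check_dga_sample(reading)
if exceptions:
    print("DGA alarm, notify operator:", exceptions)
```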

Thermal infrared imaging cameras are another monitoring tool in electricity generation. When used in coal-fired plants, these advanced imaging systems let plant operators identify problems inside the boiler before they lead to outages.

While sensors give operators the insights they need to make better decisions and automate swift, intelligent responses to real-time conditions, they also generate large amounts of data. Imaging and camera-related sensors produce frame rates (sampling frequencies) of 30 to 60 frames per second (fps). One imaging camera can generate up to 500,000 bytes per scan; at 60 fps, that is roughly 30 million bytes per second.

Add up all the sensors that can be located in a substation, including continuous online thermal imaging and security/perimeter monitoring (the most data-intensive of all), and up to 5 gigabits per second of data can stream out of the substation, assuming the configuration allows all of the gathered data to be sent off site. This is where big data introduces new levels of complexity, including the need to move data around safely and securely.
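
To make the arithmetic concrete, here is a quick back-of-the-envelope calculation using the figures above. The per-camera numbers and the 5 Gb/s total are the article's estimates; the camera count at the end is only an illustration of scale.

```python
# Back-of-the-envelope check of the data rates discussed above.

BYTES_PER_SCAN = 500_000        # up to 500,000 bytes per thermal image
FRAMES_PER_SECOND = 60          # upper end of the 30-60 fps range

bytes_per_second = BYTES_PER_SCAN * FRAMES_PER_SECOND   # 30,000,000 B/s per camera
bits_per_second = bytes_per_second * 8                   # 240,000,000 b/s = 240 Mb/s

SUBSTATION_TOTAL_BPS = 5e9      # up to 5 gigabits per second leaving the substation
cameras_to_saturate = SUBSTATION_TOTAL_BPS / bits_per_second

print(f"One camera: {bytes_per_second / 1e6:.0f} MB/s ({bits_per_second / 1e6:.0f} Mb/s)")
print(f"About {cameras_to_saturate:.0f} such cameras would fill a 5 Gb/s link")
```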

The Challenge of Big Data

Some data transmission strategies involve continuous streaming of sensor data back to servers located at a centralized control station, where analytics can be performed to help operators make informed decisions. Other strategies involve cloud-based computing: remote data storage and analytics at an indeterminate location, which presents security risks and bandwidth issues. There are also numerous topology strategies that come into play to safely and reliably transmit data.

In the typical data flow scenario, where analytics and processing take place at utility headquarters or a processing center location, several variables must be considered:

• Cost,

• Bandwidth,

• Security (NERC CIP compliance),

• Stability/reliability,

• Ability to integrate, and

• Size of network.

Processing data in the cloud and processing it on a server at a central location each have advantages and disadvantages. But one thing is clear: the farther from the sensor head the data is processed, the higher the risk of something going wrong.

Intelligent Sensing at the Edge

The latest trend in sensing is to build as much processing capability as possible into the sensor head itself, an approach referred to as sensing at the edge. In this scenario, intelligent sensors perform analytics, store data and provide feedback and action alerts to all assets locally, avoiding excessive streaming of data back to a central location.

Data that does come back is reported by exception, meaning it has been analyzed and determined to meet predetermined criteria before it's sent back to the processing center for notification. This strategy avoids unnecessary congestion and can reduce the overall tax on the bit pipe by 80 percent, allowing faster data processing, minimizing the risk of data interruption and ensuring users are not inundated with data. At the same time, operators must still have access to all of the data for analytics and forensics when needed.
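
The Python sketch below shows the report-by-exception pattern at the sensor head: every reading is analyzed and stored locally for later forensics, and only readings that meet the exception criteria are forwarded upstream. The sensor class, threshold and temperature values are hypothetical illustrations, not a specific product's interface.

```python
from collections import deque
from dataclasses import dataclass
from typing import Optional

@dataclass
class Reading:
    sensor_id: str
    value: float

class EdgeSensor:
    """Analyze and store readings locally; forward only the exceptions."""

    def __init__(self, alarm_threshold: float, history_size: int = 100_000):
        self.alarm_threshold = alarm_threshold
        self.history = deque(maxlen=history_size)   # local store kept for forensics

    def process(self, reading: Reading) -> Optional[Reading]:
        self.history.append(reading)                # every reading stays at the edge
        if reading.value > self.alarm_threshold:
            return reading                          # exception: send upstream
        return None                                 # normal: no bandwidth used

# Example: a hypothetical top-oil temperature sensor with a 95 C alarm point.
sensor = EdgeSensor(alarm_threshold=95.0)
for temp in (82.0, 88.5, 97.2):
    exception = sensor.process(Reading("top_oil_temp", temp))
    if exception:
        print("Send to control center:", exception)
```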

Intelligent sensing at the edge enables SLEx by:

• Making big data manageable,

• Relieving stress on the bit pipe,

• Minimizing failure mechanisms and security risks,

• Preventing operators from being inundated by big data, and

• Speeding feedback and minimizing decision time.

The smart grid is a work in progress, requiring much more than intelligent meters at the endpoint. By deploying sensor technology throughout the power infrastructure, utilities can modernize quickly and cost-effectively and make the smart grid a reality.


About the author: Brett Sargent is CTO - VP/GM of LumaSense Technologies. He has more than 20 years of industry experience including leadership roles at DuPont, Lockheed Martin, Exelon Nuclear and General Electric, where he was global sales leader for the GE Energy T&D Products division. Sargent holds a B.S. in Electrical Engineering from Widener University, an M.S. in Nuclear Engineering from Rensselaer Polytechnic Institute, and an MBA in International Business from Georgia State University.
