The New Power Quality: Changing Conditions Require a New Measurement Approach to Reduce Cost and Downtime

The term power quality has many different meanings, perhaps as many as those who attempt to describe its impact on system operation.


Power Quality: An Introduction

The term power quality has many different meanings, perhaps as many as there are people who attempt to describe its impact on system operation. The electric utility may describe power quality as reliability and quote statistics stating that the system is 99.95 percent reliable. The equipment manufacturer often defines power quality as the characteristics of the power supply, which may vary drastically among vendors. However, the customer is the party ultimately affected by power-quality-related problems, and the best definition should include that perspective. Considering each of these factors, the following definition is often used:

Power Quality Problem:
“Any power problem manifested in voltage, current, or frequency deviations that results in the failure or mis-operation of customer equipment.”
—extract from EPRI power quality study.

In the past, all power quality measurement could be characterized as voltage measurement: power quality events were voltage events, and early on the measurement of power quality became a highly technical specialty.

Utility Power Quality departments specialized in customer service problems, and power utilities kept highly competent and experienced engineers on hand to deal with these problems. Considered no less important were devices used to create a profile of the customer’s load. These devices looked more at the information on the current side of the measurement to determine consumption and efficiency, demand, and performance. This was apart from, although related to, the measurement of power quality.


Today, several developments come together to create a whole new definition of power quality measurement, driven by recognition of the high cost of power quality problems. It is estimated that three percent of every sales dollar in the U.S. is spent on power quality problems, and the worldwide cost is estimated as high as a staggering 400 billion dollars per year. Three factors coincide to create a tremendous need for far more power quality measurement and monitoring than in the past.

First, the customer load changed. As the world became more automated and we demanded more speed, lighter weight, and lower cost, older transformer-based power supplies were replaced with switching power supplies. These non-linear devices create a great deal of harmonic distortion, leading to many problems including unbalanced systems, hot neutrals, and increased sags. At the same time, this same equipment, which incorporates microprocessors, is far more sensitive to power quality problems and fails more often due to power quality events.
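As a rough illustration of why these harmonics matter to measurement, total harmonic distortion (THD) can be estimated from the RMS magnitudes of the individual harmonics relative to the fundamental. This is a minimal sketch; the function name and the sample magnitudes are invented for the example, not taken from the article:

```python
import math

def thd(harmonic_rms, fundamental_rms):
    """Total harmonic distortion as a fraction of the fundamental.

    harmonic_rms: RMS magnitudes of the 2nd, 3rd, ... harmonics.
    """
    return math.sqrt(sum(h * h for h in harmonic_rms)) / fundamental_rms

# Example: a load current with a 100 A fundamental and prominent
# 3rd and 5th harmonics typical of switching supplies (values illustrative).
print(round(thd([2.0, 30.0, 1.0, 15.0], 100.0), 3))
```

A linear load would show THD of a few percent at most; values like the one above are what distinguish the modern non-linear load.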

Second, the power utilities began the process of deregulation, a forced break-up of the historic monopoly of generation, transmission, and distribution of electricity. This process of deregulation has caused turmoil in the industry, resulting in less money spent on maintenance, and fewer technical employees who have less time and far more responsibility than in the past.

The third factor is recognition of the cost of poor power quality. The lost production while PLCs are reset in a manufacturing facility, the lost communication time in a credit card processing facility, and the failed transmission of important data all share a cause: power quality problems. The cost varies with the process affected, but it ultimately reduces profit and productivity. In fact, few things can improve profitability and productivity as quickly and demonstrably as improving power quality. Now that you understand a little about the nature of power quality problems, let’s look at what you can do to measure and improve power quality in your own facility.

Collecting Data

The basis of all power quality measurement is the power quality survey. Each circuit is monitored for a period long enough to record a complete cycle of an operation; this might mean a single shift or a week-long process. Familiarity with the process and the individual loads under test is the key to a successful survey, and conclusions about general power quality conditions for the full year may be drawn from a correctly chosen survey period. A pre-survey tour and conversations with key personnel, such as the plant manager, manufacturing engineers, facilities manager, plant electrician, and LAN manager, will uncover power quality problems that may not be readily apparent: telephone trouble, lost worker productivity, downtime, scrapped material, data communication problems, flickering lights, and premature failure of electrical equipment.

Trend Monitoring

A power quality survey consists of two parts: trend recording and event capture. Some power quality analyzers can perform both tasks, producing charts and graphs for trending and high-resolution graphics for examining captured events and the harmonic content of voltage and current waveforms. Trending means recording measurements over time, much like a strip chart recorder but without the problems of moving paper and liquid ink.

Trend recordings of voltage and current, power and energy, harmonics, flicker and other parameters are valuable indicators of the status of the delivered voltage and current, as well as the condition of the load over time. Knowing what an operation looks like when everything is working well is very important when it comes time to identify the cause of a failure.


What you choose to measure depends on what the monitor offers and what makes sense for the particular location in the facility. When choosing what to measure, remember that average voltage, current, and power factor determine the baseline load and demand, while peak values define transients; isolated instantaneous values have little meaning. In any case, “more is better” when deciding which parameters to trend record.

Whether or not those parameters you choose to record are suspected of being a problem, detailed trend information is useful when tracking down a power quality problem. The idea is to monitor and record how power is being used, and the effect of the voltage, current and frequency on the equipment in a facility. Date and time stamps on the recording help to pinpoint the conditions that were contemporaneous with a documented failure.

Choose a time base, or resolution, that is appropriate. Trying to measure harmonics 60 times an hour will give you a lot of data, but to what end? Generally speaking, harmonic content won’t be changing that quickly. Similarly, recording watt-hour consumption every 10 minutes will not be useful in most cases. Why gather unimportant or meaningless information that will just confuse you?


Don’t push every time base to the extreme. The good news is that detailed trend recordings do not need to use up much memory space, so you can pack a lot of information into a small amount of memory. Don’t confuse a logger with a recorder: you do not want isolated instantaneous readings spread over a long period of time. It is better to choose an instrument with a fast sample rate that provides average values plus maximum and minimum readings representing very short periods, as short as a single cycle in some cases. A logger cannot provide this kind of detail.
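The recorder-style aggregation described above can be sketched in a few lines: fast samples are collapsed into one (average, minimum, maximum) triple per interval, so short excursions survive even though the raw samples are discarded. This is a minimal sketch assuming evenly spaced readings; the function name and values are hypothetical:

```python
def aggregate(samples, per_interval):
    """Collapse a stream of fast readings into recorder-style
    (average, minimum, maximum) triples, one per interval."""
    out = []
    for i in range(0, len(samples), per_interval):
        chunk = samples[i:i + per_interval]
        out.append((sum(chunk) / len(chunk), min(chunk), max(chunk)))
    return out

# Ten fast voltage readings collapsed into two intervals of five.
# The brief dip to 114.2 V is preserved in the first interval's minimum.
readings = [120.1, 119.8, 120.0, 114.2, 120.3,
            120.1, 120.2, 119.9, 120.0, 120.4]
for avg, lo, hi in aggregate(readings, 5):
    print(f"avg={avg:.2f} min={lo:.1f} max={hi:.1f}")
```

A plain logger storing one instantaneous sample per interval would likely have missed the 114.2 V dip entirely.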

Alarms, or Triggered Events

The triggered event, or alarm section of the survey, captures disturbances and deviations from average voltage, current, frequency, or any other measured or calculated value. This includes all sorts of distorted waveforms that can be captured on voltage or current. Setting the proper threshold limits that trigger a disturbance recording is important.

The CBEMA and ITIC curves and the IEEE 519 and 1159 recommendations are a starting point, but nothing beats knowing the dropout point of the equipment under test. If you know that a 20 percent voltage sag lasting more than half a second will cause a particular drive to fail, the alarm set point is simple. General guidelines are great, but common sense tells us we need different set points for a CAT scanner and a spot welder.
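The drive example above amounts to a simple two-part trigger test: a sag only matters if it is both deep enough and long enough. The sketch below encodes that idea; the function name, the 120 V nominal, and the default limits are illustrative assumptions, not a standard's exact curve:

```python
def should_alarm(sag_rms, duration_s, nominal=120.0,
                 depth_limit=0.20, duration_limit=0.5):
    """Flag a sag only if it is both deep enough and long enough to
    matter to the equipment under test. The defaults mirror the example
    drive that trips on a 20 percent sag lasting over half a second."""
    depth = 1.0 - sag_rms / nominal
    return depth >= depth_limit and duration_s > duration_limit

print(should_alarm(90.0, 0.8))   # deep and long: alarm
print(should_alarm(90.0, 0.1))   # deep but too brief: no alarm
print(should_alarm(118.0, 2.0))  # long but shallow: no alarm
```

Real analyzers apply a whole magnitude-versus-duration curve rather than a single corner point, but the principle of matching the trigger to the load's tolerance is the same.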

The power quality questionnaire helps you establish the sensitivities of the load. Some alarms can capture sags, swells, transients, and harmonic-level variations as short as a single cycle. In many cases that is not an advantage, because the equipment under test can withstand far longer events. A power quality survey considers the magnitude, duration, and frequency of any detected events, as those events affect the load.

If you don’t know the dropout point of failing equipment, or you don’t have time to determine it, you can still get close. The key to setting trigger thresholds is to find the point just below where a disturbance will cause problems. Good trend recordings will help you establish that point by comparing documented failures to trended data; from there, the sensitivity of the equipment under test can be easily established.

Until the sensitivity or dropout point is established, setting trigger thresholds runs the risk of recording reams of unimportant information or of missing the events you need to capture. Failing to establish dropout points before setting alarm thresholds is what frustrates many new owners of power quality analyzers.

Why Record When Nothing is Wrong?

The baseline power quality survey is performed when systems are operating correctly. It does not restrict itself to what may be perceived as expected power quality problems. Instead, it measures broad trends of all parameters of electric power supply and consumption.

These statistics are recorded in a simple report and filed away. The magnitude or severity of these events, the type of events, and the frequency of events are all important. These events are compared to operations records of the factory or facility to determine the particular sensitivity of each piece of electrical apparatus or device with a reported failure. As this survey is repeated periodically, it should be compared with previous surveys to detect deteriorating power quality conditions.

This may seem like a lot of effort, but when performed at these regular intervals, each survey does not need to be too time intensive. The key is to create a standard survey setup for each location and a standard report format so comparisons between surveys can be made easily.

The benefits of performing these regular surveys are enormous. The power quality in a facility changes every time a new piece of equipment is installed in your facility or in a neighboring facility. The power utility can install new equipment, or tap-changers could fail, causing low or high voltage conditions.

Power quality failures can occur within a plant when new, more sensitive equipment replaces older, more tolerant equipment. Adjustable speed drives, as an example, will often generate high levels of harmonics that affect nearby equipment, yet the same drives are quite sensitive to voltage sags themselves. If a facility is electrically ‘downstream’ from large consumers of power, the changing load conditions in all the upstream facilities will have a cumulative impact on the delivered voltage, current, and frequency.

Continuous power quality monitoring is becoming a reality in some regions. Rates for power are already being contracted with an eye to the quality of the voltage delivered. Penalties for power factor and peak demand are common, but don’t be surprised to see harmonics limits and flicker penalties in the future. Electricity is a commodity now to be bought and sold, and any damage a facility does to the waveforms downstream affects the price. Be prepared.

Looking for Answers

Sometimes you just need the answer. It seems we all have way too much information, and way too little knowledge. You can just choke on data if the power quality information you gather is not correlated to time or events, and basic information is not cataloged. Baseline recordings are invaluable when failures occur, but you can’t make baseline recordings instantly. If you don’t have historical information, start at the service entrance, and make some measurements when the facility is in full operation. Then continue the survey at the device that is suffering the failures.

If the solution is not clear, continue by measuring the largest loads, running them through a cycle of operation while you record all parameters. Voltage sags are the most common power quality problem and are very often the culprit. Inrush current, unbalanced systems, high harmonic content, rapidly cycling loads, poor power factor correction, bad grounds, shorted neutral-to-ground connections, poor quality bonds, and many other conditions can be associated with power quality problems.

Note normal start-up and shutdown times of operations, and look for correlating data. In power quality there is no such thing as a coincidence; everything has an effect on everything else. Locating the cause of a failure can often be greatly accelerated by examining the operation and determining which synchronous operations, running simultaneously, add up to cause the sag that causes a failure.

Make sure to record minimum and maximum values on the trend recordings. Sometimes short duration conditions occur which are hard to duplicate. Knowing your maximum energy consumption will also help you save money when you look at load shedding to reduce peak demand charges. Spreading out peak demand in a facility will have a very fast and long-term effect on reducing the cost of energy.
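Peak demand in the sense used above is typically the highest interval-average load, which is why spreading out simultaneous loads lowers the bill even when total energy stays the same. A minimal sketch, assuming hypothetical one-minute kW readings that exactly fill 15-minute billing intervals:

```python
def peak_demand(kw_readings, readings_per_interval):
    """Average kW over each demand interval; the billing demand is the
    highest interval average. Assumes the readings exactly fill whole
    intervals (no partial interval at the end)."""
    intervals = [
        sum(kw_readings[i:i + readings_per_interval]) / readings_per_interval
        for i in range(0, len(kw_readings), readings_per_interval)
    ]
    return max(intervals)

# Two 15-minute intervals of one-minute readings: a steady 400 kW load,
# then the same load with a five-minute 700 kW spike in the second interval.
load = [400.0] * 15 + [400.0] * 10 + [700.0] * 5
print(peak_demand(load, 15))
```

Note that the five-minute spike raises the billed demand for the whole interval; shifting that spike so it overlaps less with the base load is exactly the load shedding the article describes.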

Don’t sweat the small stuff. You notice that a monitor triggered a dozen disturbance events and nothing went wrong in the plant? Congratulations, you have no problems. Power quality problems have everything to do with the proper operation of the customer’s equipment.

Power quality events should always be compared with failures and mis-operations. Without proper set points or thresholds, monitors will record events when nothing on the electric system failed, and often too much attention is paid to minor details. Let the point at which failures and mis-operations occur be your guide: what causes a plant shutdown in one factory may be trivial in another facility. The dates and times of nuisance events should be noted and the event records kept as part of the normal operation of the facility. Reset your triggers to eliminate the capture of these events.

Power factor and waveform distortion penalties can be reduced, and the problems mitigated. The best place to put correctly sized filters or capacitors is as near to the load as possible. Very often though, the devices are placed near the service entrance, far from the problem. At least the customer will not be assessed a penalty if the correction is on the customer side of the revenue meter.
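Sizing the capacitor correction mentioned above usually starts from the classic reactive-power formula Qc = P(tan φ1 − tan φ2), where φ1 and φ2 are the phase angles before and after correction. The sketch below applies it; the 500 kW load and the 0.80-to-0.95 targets are illustrative figures, not from the article:

```python
import math

def correction_kvar(kw, pf_actual, pf_target):
    """kVAR of capacitance needed to raise the power factor of a kw
    load from pf_actual to pf_target: Qc = P * (tan(phi1) - tan(phi2))."""
    phi1 = math.acos(pf_actual)
    phi2 = math.acos(pf_target)
    return kw * (math.tan(phi1) - math.tan(phi2))

# Correct a 500 kW load from 0.80 to 0.95 power factor:
print(round(correction_kvar(500.0, 0.80, 0.95), 1))
```

The same formula shows why over-correction is easy: pushing the target past unity simply swings the load capacitive, trading one penalty for another.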

In some cases the correction has to be placed on the utility side, especially if the disturbances are caused by a neighboring facility. Remember that harmonic distortion will continue downstream of the isolation transformer or harmonic-canceling filter, and over-correction causes as many problems as under-correction, so regular measurement and proper mitigation techniques will keep the solution up to date as conditions change.

Power quality monitoring is often performed by outside contractors or by the maintenance department of a factory, office, hospital, or data center. Rented equipment, unfamiliar software, non-technical management, and unrefined raw data often doom a well-meaning power quality survey, and, more important, the money that could have been saved is lost for good.

The solution is to own and use high quality familiar equipment, perform regular baseline measurements before a problem occurs, eliminate nuisance alarms by establishing the sensitivity of your facility, and know the dropout points of key equipment. The biggest bang for your buck today is to correct power quality problems. The money drops right to the bottom line. How else can you take up to three percent off the cost of your process?

Preventing Problems - Mitigation

Isolation transformers, K-rated transformers, filters to keep poor power quality sources from power quality sensitive devices and processes, dynamic voltage restorers, high speed static transfer switches, UPS and stand-by power systems, flywheels, rotary UPS, fuel cells, and so many more devices are sold today to solve power quality problems. Which one is the right answer?

The best advice is to identify the source of the problem, and take the least expensive solution. For instance, if your adjustable speed drives are tripping off line, you might measure and find that a periodic sag is the cause. The problem shuts down your production line, and costs more than $10,000.00 to restart each time.

What do you do? You can throw a lot of money at the problem: isolate the drives, install dedicated power, filter, and measure and test continuously. Or you could find a more tolerant drive for just a few hundred dollars and the problem goes away. Change the device, change the internal environment, or change the external environment: these are the solutions, in order of increasing cost. Knowing the cost of the power quality problem is essential.

You shouldn’t spend a hundred thousand dollars to fix a ten-dollar problem. Sometimes the electric company, now known in deregulated terminology as the ESP, or electrical service provider, needs to be involved. If your measurements indicate the problem is coming from the other side of the revenue meter, that is the time to bring in the utility. The utility can choose to fix the problem, offer higher cost premium power, or even multiple feeders and a high-speed switch.

If you can provide detailed power quality records over time, your first meeting with the electrical service provider will be much smoother. Most times, the customer is responsible for fixing power quality problems caused by the load itself. In any case, it is really helpful to establish the sensitivity of your load in terms of the dropout point and to show a record of the magnitude, duration, and frequency of power quality events in your facility. Understanding and experience will help you interpret the results so you can determine the direction of the fault and eventually pinpoint the actual cause.

The recognition of the cost of power quality has made power quality the hottest topic in the electrical world. Data communications, automation, and higher-speed transmission of every kind of transaction, from financial to medical information, are critical to maintaining the quality of life we enjoy, and the cost of poor power quality is very high. Every electrical contractor and facility manager ought to be aware of the power quality issues that affect their livelihood. A regime of regular measurement of power quality parameters will reduce costs and downtime. Low-cost, high-performance devices provided by companies like Ideal Industries are available to establish, measure, and record power quality in every installation, from residential to heavy industry.

About the Author: Brian Blanchette is a field sales engineer with the IDEAL Test and Measurement Division.
