
Help with economic value of voltage deviation reduction


zeyazzy (Electrical)
Oct 18, 2013
Hi all

I'm undertaking an optimization of a transmission network with capacitor bank installation. Two common objectives are: 1) reduce active power loss, and 2) reduce voltage deviation from 1 pu (i.e., keep bus voltages close to 1 pu).
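For reference, by voltage deviation I mean the usual sum of per-unit deviations across all buses. A minimal sketch (notation is mine; some papers use squared deviation or only the worst bus):

```python
# Voltage-deviation index: sum of absolute per-unit deviations of all bus
# voltages from 1.0 pu (notation is mine; variants use squared deviation
# or the single worst bus instead).

def voltage_deviation(bus_voltages_pu):
    return sum(abs(v - 1.0) for v in bus_voltages_pu)

print(voltage_deviation([1.02, 0.97, 1.00, 0.95]))  # -> 0.10
```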

I want to know the economic benefit of reducing the voltage deviation from 1 pu, as I need to justify the investment cost.

Thanks.
 

How much does it cost to make vars? How many more vars does your system consume as the voltage drops? As the voltage drops, your lines consume more vars to move the same amount of power, and loads such as motors draw more vars as the voltage drops.
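To put a rough number on the line effect: for a fixed power transfer, line current scales as 1/V, so the vars absorbed in the series reactance scale as 1/V^2. A small sketch (illustrative numbers only, not from any particular system):

```python
# Illustrative only: series-var absorption of a line carrying fixed power
# rises roughly as 1/V^2, since I = S / (sqrt(3) * V) and Q_line = 3 * I^2 * X.
import math

def line_var_consumption(s_mva, x_ohm, v_kv):
    """Mvar absorbed by a line's series reactance at a given operating voltage."""
    i_ka = s_mva / (math.sqrt(3) * v_kv)   # line current, kA
    return 3 * i_ka**2 * x_ohm             # Mvar = 3 * I^2 * X

# 100 MVA over a line with 40 ohm series reactance, nominal vs. 5% low voltage
for v_kv in (138.0, 131.1):
    print(f"{v_kv:6.1f} kV -> {line_var_consumption(100, 40, v_kv):5.2f} Mvar")
# 138.0 kV -> 21.01 Mvar
# 131.1 kV -> 23.27 Mvar  (about 11% more vars for a 5% voltage drop)
```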
 
Hi, thanks for the reply.

Yes, I agree with your point. Sorry, I should've mentioned there is a third objective: 1) minimizing active losses, 2) minimizing VAR investment (i.e., capacitor banks), and 3) minimizing voltage deviation.

The cost of making vars can be minimized with obj(2), while also working on obj(1) and obj(3). The main problem I'm facing now is attaching an economic (i.e., quantifiable in money terms) value to objective 3 (minimum voltage deviation), as in the sketch below.
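For concreteness, here is the weighted-sum form this kind of problem often takes (notation and weights are mine, not from any specific paper); the whole question is what dollar value to put on w3:

```python
# Weighted-sum combination of the three objectives (hypothetical names; the
# planner must choose the weights, and w3 is exactly the open question here).

def total_cost(p_loss_mw, cap_investment, bus_voltages_pu, w1, w2, w3):
    vd = sum(abs(v - 1.0) for v in bus_voltages_pu)  # voltage-deviation index
    return w1 * p_loss_mw + w2 * cap_investment + w3 * vd
```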

How can I justify minimization of voltage deviation as an objective?

Thanks.
 
In other words,

I know people are interested in keeping voltage close to 1 pu, as it is the rated level and consumer equipment is designed to work at rated levels. But are there any other benefits?
 
In my models, which are for transmission voltages, almost no one regulates their generators to put out 1.0 pu. I believe that is to help with voltage drops and maintain system health; the system can handle contingencies better if vars are available. The vars supplied by your capacitor banks are proportional to the square of the voltage, so if your system voltage can run 4% high, you'll get about 8% more vars out of your cap banks, and vice versa if your voltage is low. Some plants run with higher-than-nameplate voltage to get more out of their capacitor banks and to avoid problems on long feeders. Equipment like motors is often rated lower than 1.0 pu to deal with voltage drops on long feeders. I would suspect that you'd find more benefit deviating your system slightly above 1.0.
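The square-law relationship above is just Q = V^2 / Xc; a quick check with an illustrative bank rating:

```python
# Capacitor var output scales with the square of applied voltage:
#   Q = Q_rated * (V / V_rated)**2
# (standard capacitor behavior, Q = V^2 / Xc; the 10 Mvar rating is illustrative)

def cap_output_mvar(q_rated_mvar, v_pu):
    return q_rated_mvar * v_pu**2

for v_pu in (0.96, 1.00, 1.04):
    print(f"V = {v_pu:.2f} pu -> {cap_output_mvar(10.0, v_pu):.2f} Mvar")
# V = 0.96 pu -> 9.22 Mvar   (about 8% below rating)
# V = 1.04 pu -> 10.82 Mvar  (about 8% above rating)
```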

For information on how deviating voltage affects equipment, I would recommend referencing the IEEE Red or Gray Book. Both have sections outlining the effect voltage deviation has on the utilization and life of equipment (motors, lighting, capacitors, solid-state equipment, cathode-ray tubes, etc.).
 
Few utilities desire to run at 1.00 pu at all locations. More typically, they will have a desired voltage schedule for major buses. The voltage schedule could range from 0.95 to 1.05 pu depending upon the location and that day's circuit configuration. The voltage schedule may also change each hour of the day.

Unswitched capacitors may not do a whole lot for minimizing hour-to-hour voltage variation, though they will provide an average voltage increase.

The economic value of switched caps depends partly on the availability of other voltage management tools such as transformers with LTCs and generators. Both LTCs and capacitor switches have much higher maintenance costs than fixed devices.

Also note that some caps may be installed as mitigation for line outages. The performance of the caps during contingencies may override the ideal N-1 loss considerations.

Keep in mind that there may be distribution substation transformers with LTCs between the transmission system and the distribution loads. Those LTCs will isolate customer devices from typical voltage variation on the transmission system.
 
Thanks for the replies, guys. They helped increase my understanding.

Please have a look at this paper - minimization of voltage deviation is one of the objectives. Many other researchers publish papers with voltage-deviation minimization as a core objective.

How can I justify investment with the objective of minimizing voltage deviation? I know it helps in contingencies (e.g., as reactive reserve margin), but how do I represent this in an economic sense? For example, one way I thought of is that this extra "help" from minimizing voltage deviation will decrease the likelihood of system collapse or blackouts, and I can calculate the expected savings (probability reduction × cost of blackout), as in the sketch below.
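A toy version of that expected-savings calculation (every number below is made up purely for illustration):

```python
# Toy expected-savings calculation; all inputs are assumed/illustrative.
# Value the reliability benefit as the reduction in annual blackout
# probability times the cost of a blackout event.

p_blackout_before = 0.010   # annual blackout probability without cap banks (assumed)
p_blackout_after  = 0.006   # annual blackout probability with cap banks (assumed)
blackout_cost     = 50e6    # $ per blackout event (assumed, e.g. from VoLL studies)

expected_annual_savings = (p_blackout_before - p_blackout_after) * blackout_cost
print(f"${expected_annual_savings:,.0f} per year")  # -> $200,000 per year
```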
 
Status: Not open for further replies.
