Generation Facility Power Factor/Voltage Control 2


il102 (Electrical)
Nov 1, 2011
I have a question regarding a project I'm involved with. Specifically, this is a wind farm generation facility, although the answer to my questions does not necessarily have to be specific to wind farms. The turbines at this particular facility are DFIG machines that are capable of supplying or consuming varying amounts of reactive power at the wind farm controller's direction.

There is a requirement that the farm must be capable of producing power at a +/-0.95 power factor at the point of interconnection. The capabilities of the wind turbines will obviously be limited by the generators and the characteristics of the wind farm's collector system.

Here is where I get confused - despite that requirement, the farm will always be operating in VOLTAGE control mode per the utility's direction. The wind farm management system will receive a voltage setpoint from the local utility and make the necessary changes to meet this voltage. The way I see it - there are two ways to do this.

A) Change the reactive power output to either boost or lower the voltage. If I understand correctly, with no other changes, producing more vars (more capacitive vars) will make the generation facility more "leading" and boost the voltage at the point of interconnection. Producing fewer vars will sway the farm more toward the "lagging" characteristic, consuming vars and lowering the voltage. (Induction generators, by nature, are lagging, meaning that they consume vars, or alternatively, they produce negative vars - is that correct?)

B) You could also use the on-load tap changer (OLTC) of a transformer to meet the voltage requirement, couldn't you? If your generation facility is only operating in voltage control mode, doesn't the presence of an OLTC directly offset the amount of reactive compensation needed at the generation facility?

^The above is where I get confused: the interaction between an on-load tap changer and the reactive power support capabilities when operating a facility in voltage control mode. Can anyone help make this clearer to me?
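
For scale, here is a rough sketch of how I understand option A (the 1500 MVA short-circuit level is just a number I made up; the real one would come from the utility's fault study). It's the usual weak-grid approximation dV(pu) ~ dQ/Ssc:

    # First-order estimate of how far injected vars move the POI voltage.
    # Ssc is the short-circuit MVA at the POI; 1500 MVA is an assumed value.
    S_SC_MVA = 1500.0

    def delta_v_pu(delta_q_mvar, s_sc_mva=S_SC_MVA):
        # dV(pu) ~ dQ / Ssc; ignores the resistive part of the grid impedance
        return delta_q_mvar / s_sc_mva

    print(delta_v_pu(30.0))  # 30 MVAr into a 1500 MVA bus -> ~0.02 pu (2%)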
 
You have a good understanding of the basics and bring up a problem that has created issues for many of us.

The problem is that an automatic tap changer on the interconnect transformer and a generator voltage regulator will fight each other.

Assume the GSU (generator step-up) transformer has an on-load tap changer monitoring the utility-side voltage, and the generator voltage regulators are controlling the generator voltage. When the utility voltage drops, the tap changer operates to increase the transformation ratio to achieve a higher voltage on the output and increase reactive power flow to the utility. But the utility voltage usually doesn't change much, because the utility is a relatively low impedance source.

The effect of the tap change then is to lower the generator side bus voltage which is immediately counteracted by the generators boosting the excitation to raise voltage and increase VAR output.

If the utility has a strong intertie, the high voltage still doesn't change appreciably. That leads to another tap change. Depending on the effective impedance of the interconnect, the end result is either the generators and GSU are both at max voltage and maximum taps or an equilibrium is reached due to the system impedance and the voltage regulators' and tap changer's reactive compensation settings.
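
To illustrate the fight, here is a toy loop (every step size, limit, and the frozen grid voltage below is invented for illustration, not taken from a real plant):

    # Toy model of the hunting: the OLTC keeps stepping because the stiff
    # utility-side voltage barely moves, and each tap step depresses the
    # generator bus, which the AVR answers by raising excitation.
    GRID_V = 0.97      # stiff HV-side voltage, pu (assumed depressed)
    TAP_STEP = 0.0125  # ratio change per tap, pu (assumed)
    TAP_MAX = 8        # tap range (assumed)
    EXC_STEP = 0.01    # AVR correction per round, pu (assumed)
    EXC_MAX = 1.05     # generator voltage ceiling, pu (assumed)

    tap, gen_v = 0, 1.00
    for rnd in range(12):
        if GRID_V < 1.0 and tap < TAP_MAX:
            tap += 1                       # OLTC tries to raise the HV side
        lv = gen_v - tap * TAP_STEP        # higher ratio -> lower LV bus
        if lv < 1.0 and gen_v < EXC_MAX:
            gen_v = min(gen_v + EXC_STEP, EXC_MAX)  # AVR boosts excitation
        print(rnd, tap, round(gen_v, 3), round(lv, 4))
    # Ends with tap at TAP_MAX and gen_v at EXC_MAX: both pinned at limits.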

Since the GSU can't change the high-side voltage, we tried connecting the tap changer controls to maintain the generator-side voltage instead. That also maintains our in-plant bus voltage when the generators are offline (not much of a worry for a wind farm). During operation, the automatic tap changer and the voltage regulators drive each other to their limits rather quickly: every time a generator tries to raise voltage, the GSU lowers it. That continues until one of them hits a limit.

A solution is to operate the tap changer in auto when the generators are offline, and in manual or DCS control when they are online. The control algorithm monitors generator voltage and initiates a tap change if the voltage is nearing the +/-5% limit.
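
In rough pseudocode, the scheme looks like this sketch (the function names and the 1% action margin are my own illustrative assumptions, not our actual DCS code):

    # OLTC in auto only when the units are offline; online, the DCS owns
    # the tap changer and acts only near the +/-5% generator-voltage band.
    V_NOM, BAND, MARGIN = 1.0, 0.05, 0.01   # margin value is assumed

    def oltc_supervisor(generators_online, gen_v_pu, request_tap):
        if not generators_online:
            return "AUTO"                 # tap changer regulates on its own
        # Online: AVRs own the fast loop; DCS nudges taps near the band edge.
        if gen_v_pu >= V_NOM + BAND - MARGIN:
            request_tap(+1)               # raise ratio -> pull gen bus down
        elif gen_v_pu <= V_NOM - BAND + MARGIN:
            request_tap(-1)               # lower ratio -> lift gen bus up
        return "DCS"

    print(oltc_supervisor(True, 1.045, lambda d: print("tap", d)))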
 
Thanks - so let's say the situation is as follows.

A system study was completed, and the aggregate wind farm, at the POI on the high side of the GSU substation, is capable of operating between a 0.95 lagging and a 0.99 leading power factor. (The farm maxes out at 0.99 leading, likely because of the limitations of the var output of the individual generators, the characteristics of the overall collector system, etc.)

So let's say everything is operating smoothly. Now, either the utility issues a higher voltage setpoint command, or the utility-side voltage drops. Either way, the wind farm controller will attempt to boost the high-side voltage.

Would this be the most logical programming for a wind farm controller?

1) Increase var output from the generators to attempt to boost the voltage to the necessary level.

2) If var output maxes out at 0.99 leading but the target voltage still has not been reached, then operate the OLTC. With a 0.99 leading power factor limit, the wind farm will barely be able to boost voltage from the wind turbines at all, right? It will almost always need to use the OLTC?

Whereas if the farm could get to 0.9 leading, it would have far more flexibility to boost voltage? A rough sketch of what I mean is below.
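
Put concretely (with assumed numbers - 100 MW of output, a 1 MVAr step per pass, and hypothetical names), the logic I have in mind:

    import math

    P_MW = 100.0          # current real-power output (illustrative)
    PF_LEAD_LIMIT = 0.99  # leading PF limit from the system study

    def q_limit_mvar(p_mw, pf_limit):
        # Max var export at a PF limit: Q = P * tan(acos(pf))
        return p_mw * math.tan(math.acos(pf_limit))

    def dispatch(v_error_pu, q_now_mvar, tap_up):
        q_max = q_limit_mvar(P_MW, PF_LEAD_LIMIT)   # ~14 MVAr at 100 MW
        if v_error_pu > 0:                    # POI voltage below setpoint
            if q_now_mvar < q_max:
                return ("RAISE_Q", min(q_max, q_now_mvar + 1.0))
            tap_up()                          # vars exhausted -> step OLTC
            return ("TAP", q_now_mvar)
        return ("HOLD", q_now_mvar)

    print(dispatch(0.01, 0.0, lambda: print("tap up")))  # ('RAISE_Q', 1.0)

The arithmetic behind my question: at 100 MW, 0.99 leading allows only about 100*tan(acos(0.99)) ~ 14 MVAr of export, while 0.9 leading would allow about 48 MVAr - roughly three times the headroom.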
 
It also seems that the power factor requirement will conflict with the voltage setpoint requirement. The only way to increase voltage will be to increase var flow to the utility, which means producing power at a lower power factor.

The problem is the voltage setpoint requirement. As rcwilson noted, the voltage is relatively independent of generator operation unless you are at the end of a long line with a lot of voltage drop (or rise, depending on which end you are looking at).
 
Well, in this (slightly hypothetical) situation, we're going to say the utility doesn't care about power factor. The 0.95 lagging to 0.99 leading range is a characteristic of what the facility is capable of doing, so there wouldn't really be a conflict.

If you need to raise voltage and you were operating normally at unity power factor, you could change to a 0.99 leading power factor, which would increase var flow out of the plant and boost voltage. If that wasn't enough of a voltage boost, you would then change the OLTC. I don't see the conflict in this scenario.
 
il102 - Your logic will work.

Just a comment on nomenclature: when a generator is producing/delivering vars (over-excited), its power factor is lagging. When the generator is under-excited, its voltage is lower, it consumes vars, and the power factor is leading.

Lagging current = lagging power factor = vars to the load, from the generator.
Leading current = leading power factor = vars from the load, to the generator.
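
You can check the convention with complex power S = V x conj(I); the 25 degree angle here is arbitrary:

    import cmath, math

    V = 1.0 + 0j                                # bus voltage, pu
    I = cmath.rect(1.0, -math.radians(25))      # current lagging V by 25 deg
    S = V * I.conjugate()                       # complex power S = P + jQ
    print(round(S.real, 3), round(S.imag, 3))   # 0.906 0.423 -> Q > 0:
    # vars delivered to the load, lagging PF, the over-excited case above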

Regarding the conflict between the PF range and the voltage range: usually the contract stipulates the power factor range at a given voltage or at nominal voltage. If not, it should.

For a good reference, check the IEEE Transactions on Industry Applications, Nov/Dec 1997 article "Estimation of Reactive Power Export and Import Capability for Non-Utility Generators," or ANSI/IEEE Standard C57.116, "Guide for Transformers Directly Connected to Generators." Both sources have good charts depicting the limits of MVAR capability versus generator voltage, system voltage, and GSU tap and impedance.
 
Thanks for the response - funny you pointed out the error in my nomenclature, as I've been trying to decipher it in my head for the last hour myself. I think I just proved it to myself, though, and understand it now. I made a new post about that issue, actually.
 
The utility requirement, although seemingly focused on power factor, ensures that the generator can actually regulate voltage - well, regulate voltage up to the generator's over/under-excitation limits.

Your question about coordinating the generator voltage control and the tap changer operation is easy. You set the tap changer to nominally give the required voltage schedule on the high side, and then you operate the generator var output to get there. I am sure the utility will provide you with a voltage schedule to maintain. As long as your wind plant can get to unity power factor, i.e., self-provide its own var needs, you should be able to make them happy.
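
As a sketch of that split (the tap table and the LV/HV ratio convention below are assumptions, not your transformer's data): pick the fixed tap that centers the schedule, then let the var loop do the trimming.

    # 17 taps spanning +/-10% in 1.25% steps (assumed); ratio defined as
    # LV/HV in pu, so HV voltage ~ generator voltage / ratio.
    TAP_RATIOS = [1.0 + n * 0.0125 for n in range(-8, 9)]

    def nominal_tap(v_schedule_pu, v_gen_pu=1.0):
        # Choose the tap whose ratio best maps the generator bus onto the
        # scheduled high-side voltage (load flow and losses ignored).
        target = v_gen_pu / v_schedule_pu
        return min(range(len(TAP_RATIOS)),
                   key=lambda i: abs(TAP_RATIOS[i] - target))

    print(nominal_tap(1.02))   # index of the tap nearest ratio ~0.980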

The utility's requirement of 0.95 both ways at the point of interconnection is to protect the utility from having to install capacitors or spendy equipment like D-VARs after the fact. This HAS occurred several times in the past due to poorly crafted contracts and poor study practices.

The point about voltage control on a stiff system is relevant. It is very possible that your plant can't really move the voltage around much at all if you are plugged into a stiff system. The utility will know this and provide you with a voltage schedule that reflects this.
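
To put a number on "stiff," reusing the illustrative figures from earlier in the thread:

    # Even the full ~14 MVAr available at 0.99 leading (at 100 MW) barely
    # moves a 1500 MVA (short-circuit) bus: dV ~ dQ/Ssc.
    print(14.25 / 1500.0)   # ~0.0095 pu: under 1% of voltage authority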
 