Distributed generation and network voltage

Status
Not open for further replies.

Cerkit

Electrical
Jan 18, 2016
99
Hi. Can someone explain why generators on distribution networks result in the increase of voltage on the network?

 

Most of the time, when you add capacitance to the grid, you increase the voltage.
Since most generators run in capacitive mode (supplying vars to the grid, like a capacitor does), they cause the voltage to go up.
 
Hi.
OK, just to clarify: if it is operating in capacitive mode, then the current is leading the voltage, and the generator is exporting with a leading power factor, yes?

Why does a leading power factor result in an increase in voltage?

Thanks
 
Even running at unity power factor they displace load that normally creates voltage drop.

I’ll see your silver lining and raise you two black clouds. - Protection Operations
 
Ohm's Law

High-voltage networks in rural areas of the UK are generally radial.

Power flow is from the HV substation along, say, a 5-mile overhead line, with small transformers stepping down to low voltage at each customer (think of a series of small farms). The designer expects the HV voltage to decrease with distance, due to line resistance and current flow.

The last farm then builds a digester and installs a gas-engine generator set to consume the methane given off by the digester. This could be at a kW rating several times the previous combined load of all the farms, so those loads will be fed by the generator and the surplus exported through the substation.

When the generator is connected to the mains at low voltage at the last farm, the only way to push power into the mains is to raise the voltage at the farm end above that at the substation. So the original voltage drop pattern is reversed.

This may cause problems at the farms along the overhead line.

There are solutions to this, but that’s the basics of it. (omitting a lot of things like power factor of the generator, and moving the point where the generator controls its voltage from generator terminals to a point along the line)

Davidbeach said much the same thing, but I ramble on a lot.
 
Just to clarify about terminology and generator power factor - a generator that is lagging is producing vars, not consuming them. This is the normal operating mode. A generator that is consuming vars (while producing power) is operating at a leading power factor.
 
Basically, when power flows down a distribution line there is a voltage drop. When net power flows from the generator to the system, the voltage drop will make the voltage down the line lower than at the generator. The source voltage doesn't change, so the voltage drop from the generator to the system is the same as a voltage rise from the system to the generator.
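As a worked sketch of this symmetry (the line impedance and load figures are my own illustrative assumptions), the usual approximation for the voltage change along a line of impedance R + jX carrying P watts and Q vars at voltage V is dV ≈ (P·R + Q·X)/V. With P and Q measured flowing toward the far end, a positive result is a drop there and a negative result is a rise:

```python
def delta_v(p_w, q_var, r_ohm, x_ohm, v=11_000.0):
    """Approximate voltage change along a line: dV ~ (P*R + Q*X) / V.
    P, Q are measured flowing from the source toward the far end;
    positive result = voltage drop at the far end, negative = rise."""
    return (p_w * r_ohm + q_var * x_ohm) / v

# Far end importing 100 kW at 0.9 pf lagging (~48.4 kvar): voltage drops.
print(delta_v(100e3, 48.4e3, r_ohm=3.0, x_ohm=2.0) > 0)   # True

# Far end exporting 100 kW at unity pf: P changes sign, so the far-end
# voltage rises above the source, as described above.
print(delta_v(-100e3, 0.0, r_ohm=3.0, x_ohm=2.0) < 0)     # True
```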
 
"When the generator is connected to the mains at low voltage at the last farm, the only way to push power into the mains is to raise the voltage at the farm end above that at the substation. So the original voltage drop pattern is reversed."

One might think this . . . but it ain't necessarily so.

Real power is pushed out onto the grid from a radial generator by increasing the power output of the prime mover so that its load angle increases; raising the end-of-line voltage won't necessarily help this. In fact, an induction generator at the end of the line can actually drag the voltage even further down, despite the line drop due to real power flow along various portions of the feeder being reduced or reversed.
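Rough numbers make the induction-generator point concrete (all values here are my own illustrative assumptions, not from the post). With P and Q measured flowing toward the machine, export makes the P·R term negative, but the magnetizing vars an induction machine draws make the Q·X term positive, and on a reactive line the var draw can outweigh the real-power export, leaving a net sag at the end of the line:

```python
def delta_v(p_w, q_var, r_ohm, x_ohm, v=11_000.0):
    """Approximate voltage change along a line: dV ~ (P*R + Q*X) / V.
    Positive result = voltage falls from the substation to the far end."""
    return (p_w * r_ohm + q_var * x_ohm) / v

P_EXPORT = -150e3      # 150 kW exported by the machine (assumed)
Q_MAGNETIZING = 90e3   # 90 kvar magnetizing current drawn from the line (assumed)

# Reactive line (X > R): the var draw dominates and voltage still sags.
print(delta_v(P_EXPORT, Q_MAGNETIZING, r_ohm=1.0, x_ohm=3.0) > 0)   # True

# Same export with no var draw (e.g. a unity-pf synchronous set): voltage rises.
print(delta_v(P_EXPORT, 0.0, r_ohm=1.0, x_ohm=3.0) < 0)             # True
```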

CR

"As iron sharpens iron, so one person sharpens another." [Proverbs 27:17, NIV]
 