Eng-Tips is the largest engineering community on the Internet

Intelligent Work Forums for Engineering Professionals


345kV vs 500kV

Status
Not open for further replies.

Mbrooke

Electrical
Nov 12, 2012
2,546
For areas where 345kV is sufficient for both capacity and distance, is there any advantage to using a 500kV bulk power network? That is, higher-voltage equipment but at a lower current rating (lines, cables) - e.g. 550kV 2000A 40kA breakers vs. 362kV 3000A 63kA breakers.


I know that is a very broad question - like asking what ocean life or bacteria might evolve into billions of years from now - but are there any standing specifics or general facts that come to mind, such as cost difference? I'm all ears.
 

I would be looking at at least two factors.
1. The cost of the 345 kV equipment vs the cost of the lower rated 500 kV equipment.
2. The estimated losses of both systems: both I²R losses and corona losses.
There may be other factors but that may be a good starting place.


Bill
--------------------
"Why not the best?"
Jimmy Carter
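Bill's loss comparison can be sketched quickly. The power flow, line length, and conductor resistance below are placeholder numbers (not vendor data), and both lines are assumed to use the same conductor, so the comparison only shows the voltage effect:

```python
import math

def line_current_a(p_mw, v_kv, pf=1.0):
    """Three-phase line current in amps for a given MW flow."""
    return p_mw * 1e6 / (math.sqrt(3) * v_kv * 1e3 * pf)

def i2r_loss_mw(p_mw, v_kv, r_ohm_per_km, length_km, pf=1.0):
    """Total three-phase I^2*R loss in MW over the line length."""
    i = line_current_a(p_mw, v_kv, pf)
    return 3 * i ** 2 * (r_ohm_per_km * length_km) / 1e6

# Hypothetical 1000 MW transfer over 150 km, same conductor both ways:
loss_345 = i2r_loss_mw(1000, 345, 0.06, 150)
loss_500 = i2r_loss_mw(1000, 500, 0.06, 150)
# At equal resistance, loss scales as 1/V^2: the 500 kV line dissipates
# (345/500)^2, about 48%, of what the 345 kV line does for the same flow.
```

Corona loss is weather- and conductor-geometry-dependent and doesn't reduce to a one-liner like this, so it would have to be estimated separately.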
 
Good points.

Any known values between 345kv and 500kv equipment for those who have experience in pricing? I know little about the price of 500kv equipment.
 
Don't forget to add in consideration of the voltages already available. If the buses on both ends are already at 345kV, then any savings in downsizing the equipment in amperage may be swallowed up by the transformation costs to get to 500kV on both ends.
 
For what you're describing I can't imagine any scenario where the 500kV gear would be cheaper. I think it would be significantly more expensive. There are a ton of variables involved here, and not all of them are technical. Also, you are assuming fault currents go down with higher voltages, but that's not necessarily true, so I doubt you'll be using lower-rated breakers.
 
Marks1080 said:
For what you're describing I can't imagine any scenario where the 500kV gear would be cheaper. I think it would be significantly more expensive. There are a ton of variables involved here, and not all of them are technical.


Any idea by how much?


Also, you are assuming fault currents go down with higher voltages, but that's not necessarily true so I doubt you'll be using lower rated breakers.

Talk to me more about this. Perhaps we are thinking different scenarios.
 
No idea how much extra. There are probably as many variables within the RFP process as anything technical to influence price. But generally speaking, the cost of insulating for higher voltages doesn't go up proportionally - I think it goes up geometrically. Also, rights of way have to be widened, so how much for that real estate? How much for the extra steel in the towers? Are there labour issues working on larger towers that you were unaware of? You probably want to use larger transmission cable, because why the hell would you limit a 500kV system by undersizing the cable?

Also, the costs that go into a power system have to be recovered by the loads. If a 345kV system is the appropriate one to use for an area, I don't think your customers will be thrilled to find out they're paying off a 500kV system instead.


Regarding the Fault Currents:
Fault currents depend on the available power to supply the fault. A typical bottleneck would be a transformer, for example (but any piece of equipment could be your bottleneck). A 345kV unit will likely have a lower MVA rating than a 500kV unit, therefore the 500kV transformer will be able to deliver more power to a fault. This is all very general. I think for a larger system it would be fair to say (as a rule of thumb) that fault currents probably get higher at higher voltages - but this is not always true. It really is system dependent.

On a small system I believe you would be correct - in general the 500kV faults would be of lower magnitude, but that's just due to the small system itself being the bottleneck in terms of power available to deliver to a fault. Usually the reason we have an HV system (500kV+) is to marshal huge amounts of generation around an area, so USUALLY there will be a ton of available energy to feed the fault.

If your system is small enough a fault could take it out completely, vs. having a large system which could continue to feed a fault without major consequences. As far as I know this isn't possible or realistic for a 500kV system. I'm in North America - part of NERC and NPCC. From what I know about the entire northeastern power grid in North America, a sustained 500kV fault has a high potential to take out the grid.
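The system-strength point above can be made concrete with the standard short-circuit relation I = S_sc / (√3 · V). The short-circuit MVA figures here are invented purely for illustration:

```python
import math

def fault_ka(v_kv, short_circuit_mva):
    """Bolted three-phase fault current in kA for a given bus
    short-circuit MVA (system strength)."""
    return short_circuit_mva / (math.sqrt(3) * v_kv)

# With EQUAL system strength, the higher voltage sees less current:
i_345 = fault_ka(345, 20000)         # ~33.5 kA
i_500 = fault_ka(500, 20000)         # ~23.1 kA

# But a 500 kV backbone usually marshals far more generation, so its
# short-circuit MVA is higher - and the current can climb right back:
i_500_strong = fault_ka(500, 40000)  # ~46.2 kA
```

So whether the 500kV breakers can be lower rated depends entirely on how much source strength sits behind each bus, not on the voltage alone.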


 
marks1080 said:
because why the hell would you limit a 500kV system by undersizing the cable.

But why oversize the cable if load of that level is never anticipated?



Regarding the Fault Currents:
Fault currents depend on the available power to supply the fault.


Correct - but imagine a system where the entire bulk power system is either 345kV or 500kV, interconnecting many 1,200MW generating stations and load clusters miles apart. Won't the 345kV fault current always be higher near generation (and usually most everywhere) as opposed to the 500kV? Same generation - just in one case those same-MVA GSUs pump out more current under any condition.



If your system is small enough a fault could take it out completely, vs. having a large system which could continue to feed a fault without major consequences. As far as I know this isn't possible or realistic for a 500kV system. I'm in North America - part of NERC and NPCC. From what I know about the entire north eastern power grid in north america and sustained 500kV fault has a high potential to take out the grid.

I don't think that's entirely a fair analysis, IMHO - couldn't a sustained fault on a 345kV line do the same if the bulk of generation is connected at that voltage level? My understanding is that in the upper eastern portion of North America, such as NY state and New England, the entire system is 345kV as opposed to 500kV. Those areas have very limited support from 500kV.
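The GSU point above is simple ratio arithmetic: for the same nameplate MVA, the HV-side current scales inversely with voltage. The 1,200 MVA figure just echoes the plant size in the example:

```python
import math

def hv_current_a(s_mva, v_kv):
    """Rated HV-side line current of a transformer at nameplate MVA."""
    return s_mva * 1e6 / (math.sqrt(3) * v_kv * 1e3)

# Same 1,200 MVA of GSU capacity behind each network voltage:
i_345 = hv_current_a(1200, 345)  # ~2008 A
i_500 = hv_current_a(1200, 500)  # ~1386 A
# Identical MVA, so the 345 kV side carries 500/345 ~ 1.45x the current.
```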

 
I agree with your counterpoints. It's very much system dependent, which makes general conversation difficult.

I do believe that most of the time (95% +) what you're suggesting would not be cost effective.
 
But Mark, I think you gave a perfect real world example we can work with. Picture if most- or rather all of the 345kv in NY and New England was 500kv.
 
My assumption, without any real data, would be that if the entire NY/NE 345kV system was magically converted to 500kV, the grid would collapse because of way too many variables to list... You'd be affecting system impedance in such a way that I doubt the existing 'normal' load flows would stay constant, or within an acceptable range. Start messing with the major load flows in the northeast and your system will go down - see the 2003 blackout.

Honestly, the most cost-effective thing we could do today to lower the cost of power to the end user would be to eliminate the market structure that's been created. This entire layer of the industry really doesn't do anything except dramatically increase the cost of providing power. That's a tough talk to have with the MBA decision makers who run today's system, all of whom depend on this unnecessary layer of 'Corporate Vars*' to get a paycheck.



* - I wish I could claim credit for the term 'corporate vars.' I stole it from a colleague :)


 
If my recollection is correct, NY has their 345 kV and 765 kV as well, so it is not all 345 kV transmission.
 
@Magoo2: correct, but 765kv is limited, mostly as an interconnection to Canada from my understanding. 345kv is most of the bulk backbone, even for the NYC area.


@Marks1080: Perhaps, without taking or adjusting for any other variables beforehand. Maybe I am wrong; however, I would theorize that 500kV is more likely to survive a 2003-type disturbance vs. a 345kV backbone. In fact, I would argue that is why PJM fared better with their 500kV system - and why 765kV took off in the Ohio area later.

Second, load flows on the 115kV, 138kV and 230kV systems would decrease due to the lower impedance of the bulk backbone. Fault current may go up at those levels, but for the sake of this discussion I think we can ignore that.

But from your real-world experience, can you say that 500kV equipment in and of itself (circuit breakers, isolators, CTs, VTs) will always cost more than 345kV?


Marks1080 said:
Honestly, the most cost-effective thing we could do today to lower the cost of power to the end user would be to eliminate the market structure that's been created. This entire layer of the industry really doesn't do anything except dramatically increase the cost of providing power. That's a tough talk to have with the MBA decision makers who run today's system, all of whom depend on this unnecessary layer of 'Corporate Vars*' to get a paycheck.



* - I wish I could claim credit for the term 'corporate vars.' I stole it from a colleague :)


I think we can both agree here. And maybe get rid of NERC requirements / government oversight. However, I will leave my opinions at that for now.

"corporate vars"- I will have to use that someday :)
 
"But from your real world experience, you can say that 500kv equipment in of itself (circuit breakers, isolaters, CTs, VTs) will always cost more than 345kv? "


Yes.
 
Seems like the voltage class of the lower voltages also matters, since adjacent voltage classes are typically separated by a ratio of roughly √3 (1.732) to 3. For example, 115 kV, 230 kV, and 500 kV, or 69 kV, 138 kV, 345 kV, and 765 kV are common system designs. In a system with only 138 kV and 500 kV options, there would be a very large jump in both capacity and cost when deciding which voltage to utilize.
 
@Bacon4Life: Excellent point and worth mentioning. Typically 500kV is stepped down to no less than 161kV, often 230kV. Similarly, in Europe 400kV is usually stepped down to no less than 132kV. The reason is that autotransformers become more costly (they approach a standard 'isolation' unit in size/cost) when the step up/down is more than 3x.
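The 3x rule of thumb above can be sketched with the autotransformer "co-ratio" - the fraction of throughput MVA the windings must physically be built for. This is the textbook relation, not something from the thread:

```python
def co_ratio(v_high_kv, v_low_kv):
    """Equivalent two-winding size of an autotransformer as a fraction
    of its throughput MVA: (V_H - V_L) / V_H."""
    return (v_high_kv - v_low_kv) / v_high_kv

r_500_230 = co_ratio(500, 230)  # 0.54: windings built for 54% of the MVA
r_500_161 = co_ratio(500, 161)  # ~0.68
r_500_115 = co_ratio(500, 115)  # 0.77: approaching a full isolation unit
```

The closer the co-ratio gets to 1.0, the less the auto saves over a conventional two-winding transformer, which matches the "no more than ~3x step" practice.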
 
There's a wealth of transmission unit costs to be found in Link

Table 2-1 shows line budget costs per mile for each voltage.
345kV = $1.34M/mile
500kV = $1.92M/mile

Tables 3.1, 3.3, 4.4 show similar higher costs for 500kV substations vs. 345kV substations
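The raw $/mile figures above favor 345kV, but the comparison changes per deliverable MW. The surge impedance loading (SIL) values below are rough textbook numbers I'm assuming, NOT figures from the linked report:

```python
# Line budget costs per mile (from Table 2-1 of the linked report):
cost_per_mile = {345: 1.34e6, 500: 1.92e6}
# Assumed surge impedance loading in MW (rough textbook values):
sil_mw = {345: 400, 500: 900}

cost_per_mw_mile = {kv: cost_per_mile[kv] / sil_mw[kv] for kv in (345, 500)}
# 345 kV: $3,350 per MW-mile; 500 kV: ~$2,133 per MW-mile.
```

On that basis the 500kV line can come out cheaper per MW moved - which is exactly why the decision hinges on whether the area actually needs the capacity, as discussed upthread.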


 
^^^ That paper is gold, much thanks :)


The cost for a 115/500kv vs 230/500kv auto has me surprised. Can someone explain?
 
Mbrooke: I'm assuming you're surprised because the costs are so similar?

It makes sense for an auto, as there is only one winding per phase - a 500/230 and a 500/115 unit still have the same 500kV main winding. The biggest cost in the unit is the core, and I believe either of these autos would have the same core. The minor price differences probably come down to secondary equipment and insulation ratings.

 
