rockman7892
Electrical
- Apr 7, 2008
- 1,156
We are working on developing a standard distribution design from a customer-owned HV substation to the facility distribution system. Currently we are evaluating use of 34.5kV vs 13.8kV for this distribution. I wanted to see if there are any rules of thumb or thresholds, based on system capacity and the incoming HV utility feed, that are typically used for this evaluation.
This secondary distribution is fed from secondary of a transformer with a 72MVA base rating and HV incoming of either 138kV or 230kV.
Is there a particular limiting factor with the HV incoming voltage that dictates the choice of secondary voltage? For example, is a transformer stepping from HV directly down to 15kV class much less common and harder to source than one stepping down to 34.5kV?
In terms of capacity, the customer is evaluating going from (5) smaller 30MVA transformers at 15kV to (3) 80MVA transformers at 34.5kV. There is an obvious cost savings with fewer transformers (I believe), as well as smaller equipment sizing, but I wanted to see if there is anything obvious I'm overlooking that would make 34.5kV an unattractive option. This is a new build.
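One quick sanity check worth running is the full-load bus current at each candidate voltage, using the standard three-phase relation I = S / (√3 · V_LL). A rough sketch (the 72 MVA figure is from your post; the comparison itself is generic, not tied to any particular switchgear line):

```python
import math

def full_load_amps(mva: float, kv: float) -> float:
    """Three-phase full-load current: I = S / (sqrt(3) * V_LL)."""
    return mva * 1e6 / (math.sqrt(3) * kv * 1e3)

# 72 MVA main transformer secondary at each candidate distribution voltage
for kv in (13.8, 34.5):
    print(f"{kv} kV bus: {full_load_amps(72, kv):,.0f} A")
# 13.8 kV bus: 3,012 A
# 34.5 kV bus: 1,205 A
```

At 13.8 kV the main bus is already around 3,000 A, near the upper end of common continuous ratings for metal-clad switchgear, whereas at 34.5 kV it is a comfortable ~1,200 A. That current headroom (and the corresponding reduction in cable/bus copper and I²R losses) is often one of the main drivers toward the higher distribution voltage at this capacity.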