nealger
Electrical
- Oct 11, 2006
I am hoping that someone out there can explain something that has been bugging me.
In my power system analysis class I learned that pre-fault loading of a synchronous generator results in an increase in the generator's "voltage behind the reactance", which in turn results in an increase in the generator's fault current contribution. Every text on power system analysis that I have seen backs this up. However, as far as I can tell, the ANSI/IEEE method for calculating short circuit currents completely ignores this effect. Can anybody explain why? I am assuming that this is not a fundamental flaw in the ANSI/IEEE method because it is an accepted standardized method that has been around for a while. I just want to understand the rationale for ignoring pre-fault loading effects.
Perhaps this is explained in ANSI C37.010, but unfortunately I don't have access to a copy and I am not inclined to spend $86 to get a copy unless I know for certain that it will answer my question. I do have access to both the IEEE Red Book and Buff Book, but neither has been any help on this point.
The only good explanation I have been able to come up with is that perhaps the test current for an ANSI-rated circuit breaker is equal to the breaker's rated load current plus its rated short-circuit current. If that is the case, then the effect of pre-fault load current would be built into the breaker's fault current rating, and there would be no need to include it in the calculations.
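To put a rough number on the size of the effect I'm asking about, here is a quick back-of-the-envelope comparison. All of the values are made up (a generic 0.15 pu subtransient reactance and a full-load, 0.85 pf lagging pre-fault current), purely for illustration of the flat-start versus loaded-machine difference:

```python
# Rough per-unit illustration of the pre-fault loading effect (assumed, made-up numbers).
import cmath

Xd2 = 0.15                                     # assumed subtransient reactance, pu
Vt = 1.0                                       # pre-fault terminal voltage, pu
IL = 1.0 * cmath.exp(-1j * cmath.acos(0.85))   # assumed full-load current, 0.85 pf lagging

# "Flat start": ignore pre-fault load and drive the fault from 1.0 pu.
I_fault_flat = Vt / (1j * Xd2)

# Textbook approach: compute the internal voltage behind X''d first.
E2 = Vt + 1j * Xd2 * IL
I_fault_loaded = E2 / (1j * Xd2)

print(f"|E''| with pre-fault load : {abs(E2):.3f} pu")
print(f"fault current, flat start : {abs(I_fault_flat):.2f} pu")
print(f"fault current, loaded     : {abs(I_fault_loaded):.2f} pu")
```

With these assumed numbers the loaded-machine contribution comes out a several-percent increase over the flat-start value, which is the gap I'm trying to understand the ANSI/IEEE rationale for ignoring.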
Thanks in advance for any help.