McDaniel8402
Electrical
- Oct 14, 2015
Hello Folks. New to the forum.
I'd like to get some community input on neutral grounding resistor (NGR) protection when choosing ground fault settings on a low-resistance grounded system. When I was first introduced to system relaying and protection, the engineers training me drilled into my head to always set ground fault settings as low as practical. In real numbers, that meant protecting an NGR rated for 400A @ 10 sec with a 51G pickup between 50 and 100 amps, set as low and fast (in time) as possible while still coordinating with downstream devices. I always questioned why we couldn't just stay beneath the resistor damage curve.
As time has gone on, I've adopted the philosophy of setting 51G elements as low and fast as practical. However, I've never found any literature that really supports the idea that "anything" below the resistor damage curve is acceptable, or that it isn't.
Per IEEE, 10% of the resistor rating is typically considered the allowable continuous current rating of an NGR. I'm assuming the low-and-fast mentality comes from aiming at that 10% mark, but that's only a guess.
More recently, I've had folks tell me that setting a 51G at 200 amps with a 2 to 3 second delay on a 400A @ 10 sec NGR is perfectly fine, and logically, they should be correct.
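For what it's worth, here's the back-of-the-envelope check I ran on that claim. It's only a sketch under my own assumptions, not from any manufacturer curve: I'm treating the resistor damage curve as a constant I^2*t characteristic anchored at the 400A / 10 sec point, assuming a definite-time 51G, and adding roughly 5 cycles of breaker clearing time.

NGR_RATED_A = 400.0   # rated let-through current, amps
NGR_RATED_S = 10.0    # short-time withstand at rated current, seconds

def ngr_withstand_time(fault_current_a):
    # Approximate withstand time at a given current, assuming constant I^2*t.
    return (NGR_RATED_A ** 2 * NGR_RATED_S) / fault_current_a ** 2

def clears_in_time(fault_current_a, pickup_a, definite_time_s, breaker_s=0.083):
    # True if the 51G delay plus breaker time beats the assumed withstand time.
    if fault_current_a < pickup_a:
        return False  # relay never picks up; the resistor is left on its continuous rating
    return definite_time_s + breaker_s < ngr_withstand_time(fault_current_a)

# Bolted ground fault, full 400A let-through:
print(clears_in_time(400.0, pickup_a=200.0, definite_time_s=3.0))  # True: ~3.1 s vs 10 s withstand
print(clears_in_time(400.0, pickup_a=75.0, definite_time_s=0.5))   # True: ~0.6 s vs 10 s withstand

By that rough math, both settings clear well inside the damage curve; the real difference seems to show up on low-level faults below the 200 amp pickup, where the higher setting never operates at all, which is where I suspect the low-and-fast argument actually comes from given the 10% continuous point above.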
Any of you folks have much to say on this? Any good case studies that would suggest a best practice? Did I accept gibberish from my old trainers?