111R
Electrical
- May 4, 2012
I'm having some trouble wrapping my head around the voltage distribution for a ground fault. If a delta / grounded-wye transformer has an A-phase line-to-ground fault a mile down the secondary line with no fault resistance, the phase-to-ground voltage is considered to be zero at the fault point. Neglecting source impedance upstream of the transformer and the transformer impedance itself, the entire line-to-ground voltage is dropped across the circuit from the transformer to the fault point and back.
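To make the setup concrete, here is a rough sketch of how I'm picturing the fault loop as a simple series circuit. All of the numbers (7.2 kV line-to-ground, 1 ohm of conductor impedance out to the fault, 5 ohms of earth/neutral return resistance) are just assumptions I made up for illustration:

# Fault loop treated as a simple series circuit (bolted fault, no
# fault resistance). All values below are assumed for illustration.
V_LG = 7200.0        # line-to-ground source voltage (V), assumed
Z_conductor = 1.0    # phase-conductor impedance to the fault (ohm), assumed
R_return = 5.0       # earth/neutral return-path resistance (ohm), assumed

I_fault = V_LG / (Z_conductor + R_return)  # fault current (A)
V_drop_conductor = I_fault * Z_conductor   # dropped along the phase conductor
GPR = I_fault * R_return                   # ground potential rise at the fault

print(f"I_fault = {I_fault:.0f} A")
print(f"Conductor drop = {V_drop_conductor:.0f} V, GPR = {GPR:.0f} V")

With those made-up values the source voltage divides 1200 V across the conductor and 6000 V across the return path, which is what makes me question my wording above about the "entire" voltage being dropped in the conductor.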
Since all of this current must return to the neutral of the transformer, there must be a potential difference equal to the resistance of the return path multiplied by the fault current, correct? So the ground potential rise is equal to this value? If so, what does the voltage gradient look like across the soil path back to the transformer? I've always heard that 100 ohm-meters is a rough estimate of soil resistivity. So, assuming soil conditions at a particular location happen to match that value, how can I calculate the voltage rise at the fault point and the step potential?
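For the soil-gradient part, here is my rough attempt using the textbook idealization of a hemispherical electrode in uniform soil (my assumption; I don't know if that's the right model for a fault a mile down the line). The fault current matches the 1200 A from the sketch above, and the 1 m step length is also assumed:

import math

# Surface potential around a hemispherical electrode in uniform soil,
# measured relative to remote earth. All values assumed for illustration.
rho = 100.0    # soil resistivity (ohm-m), the rough estimate above
I = 1200.0     # fault current injected into the earth (A), assumed
step = 1.0     # step length for step potential (m), assumed

def V(r):
    """Potential (V) at radius r (m) from the electrode center."""
    return rho * I / (2 * math.pi * r)

for r in (1, 2, 5, 10, 20):
    print(f"r = {r:>2} m: V = {V(r):8.1f} V, "
          f"step potential = {V(r) - V(r + step):7.1f} V")

If I've set this up right, the gradient (and therefore the step potential) is steepest immediately next to the electrode and falls off roughly as 1/r with distance, but please correct me if I'm modeling the return path wrong.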