Thanks a lot for your answers; for a second I thought my approach to modeling a distribution system was wrong.
Regards,
Reynaldo Salcedo
That is exactly what I was thinking. I believe only a load flow might be done under those conditions, and only by taking the voltages at the feeder heads and ignoring the substation entirely.
Regards,
Reynaldo Salcedo
Why do some utilities' simulation tools ignore the transmission (source) impedance in their network models? I understand that including it increases the calculated voltage drop, which could complicate proper sizing of conductors. But it is necessary for sizing breakers at substations and other devices, also...
Thanks a lot for the help. I also found information stating that over-voltages from DLG faults can be larger than over-voltages from SLG faults in the Electric Power Distribution Handbook by Tom Short, pp. 654-658 (in case someone needs to cite it sometime in the future). Also, IEEE Std C62.92.1...
Thanks for the reply. The curves will be a great help. I simulated faults on the high-voltage side of the transformer, and the healthy-phase voltage for a double line-to-ground fault was lower than the healthy-phase voltages for a single line-to-ground fault, which was expected...
I was wondering whether it is possible for the voltage on the healthy phase during a double line-to-ground fault to be higher than the voltages on the healthy phases during a single line-to-ground fault. The fault is assumed to occur on the low-voltage side of a transformer feeding a grid.
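For what it's worth, the comparison can be checked directly with symmetrical components. The sketch below assumes bolted faults at the fault point, a 1.0 pu prefault voltage, and purely illustrative sequence impedances (none of these values come from the thread); it computes the healthy-phase voltages for an SLG fault on phase a and a DLG fault on phases b and c.

```python
import cmath

a = cmath.exp(2j * cmath.pi / 3)  # 120-degree rotation operator
E = 1.0 + 0j                      # prefault phase-a voltage, pu

def slg_healthy_voltages(Z1, Z2, Z0):
    """Healthy-phase (b, c) voltage magnitudes for a bolted SLG fault on phase a."""
    I = E / (Z1 + Z2 + Z0)        # I1 = I2 = I0 (sequence networks in series)
    V1 = E - I * Z1
    V2 = -I * Z2
    V0 = -I * Z0
    Vb = V0 + a**2 * V1 + a * V2
    Vc = V0 + a * V1 + a**2 * V2
    return abs(Vb), abs(Vc)

def dlg_healthy_voltage(Z1, Z2, Z0):
    """Healthy-phase (a) voltage magnitude for a bolted DLG fault on phases b, c."""
    Zp = Z2 * Z0 / (Z2 + Z0)      # Z2 and Z0 networks in parallel
    V1 = E * Zp / (Z1 + Zp)       # V1 = V2 = V0 for a bolted DLG fault
    return abs(3 * V1)            # Va = V1 + V2 + V0 = 3*V1

# Illustrative case: Z1 = Z2 = j1.0 pu, Z0 = j3.0 pu
vb, vc = slg_healthy_voltages(1j, 1j, 3j)
va = dlg_healthy_voltage(1j, 1j, 3j)
print(vb, vc, va)
```

With these example impedances the DLG healthy-phase voltage comes out around 1.29 pu versus about 1.25 pu for SLG, i.e. the DLG case is slightly higher, consistent with the observation above; the ordering flips for other Z0/Z1 ratios, so the ratio at the low-voltage fault point is what matters.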