AdmiralSnackbar
Electrical
- Jan 23, 2015
I'm hoping for some clarification on the following:
In classical transformer theory, an ideal transformer behaves according to v1 = v2 * (n1/n2). So if the voltage on one side of the transformer goes up due to an increase in the number of turns, I would expect the voltage on the other side to increase as well. However, when running power flow simulations, I notice that an increase in voltage on one side of the transformer decreases the voltage on the other side, specifically on a 115/69 kV transformer with an LTC on the low side. Can someone please explain this phenomenon to me? I've been told that an LTC can't change a strong high-side utility voltage, so the low-side voltage will decrease instead, but I'd like a little more background on the theory behind that.
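To make the puzzle concrete, here's a quick Python sketch of what I'm seeing. The numbers are purely illustrative: it assumes an ideal transformer and a high-side bus pinned at exactly 115 kV by the strong utility source, which is roughly what the power flow enforces. Sweeping the tap (expressed as a multiplier on the nominal turns ratio) shows the low-side voltage moving inversely with the effective ratio while the high side stays fixed:

# Minimal sketch, assumed values only: ideal 115/69 kV transformer,
# high-side voltage held constant by a strong utility source.
NOMINAL_RATIO = 115.0 / 69.0  # n1/n2 at the neutral tap
V1_KV = 115.0                 # high-side voltage, pinned by the utility

# Tap positions as a multiplier on the nominal turns ratio; >1.0 means
# relatively more high-side turns per low-side turn.
for tap in (0.95, 1.00, 1.05):
    ratio = NOMINAL_RATIO * tap
    v2_kv = V1_KV / ratio  # rearranged ideal relation: v2 = v1 / (n1/n2)
    print(f"tap {tap:.2f}: effective ratio {ratio:.3f}, low side = {v2_kv:.2f} kV")

With v1 fixed, raising the effective ratio lowers v2 (about 65.7 kV at tap 1.05) and lowering it raises v2 (about 72.6 kV at tap 0.95), which matches what the simulation shows but not my turns-ratio intuition.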
Thanks!
AdmiralSnackbar