prc (Electrical)
Let me refer to the following thread, where rcwilson gave a good explanation of this aspect of transformers: thread238-301672 (2011)
When a 100 MVA transformer with 10 % impedance is feeding power from a generator to the grid at full load, roughly 10 MVAR is consumed (dropped) in the transformer's leakage reactance. This results in an improved power factor at the transformer output compared to the power factor at the generator terminals, because of the MVAR "consumed" by the transformer.
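A minimal numerical sketch of that figure (my own illustration, not from the thread): the MVAR absorbed in the series reactance is I² X in per unit, so the 10 MVAR applies at rated current and falls off with the square of the loading. Winding resistance and magnetizing current are neglected here.

```python
# Reactive power "consumed" in a transformer's leakage reactance,
# Q = I_pu^2 * X_pu * S_base. Assumes rated voltage; treats the 10 %
# impedance as pure reactance and neglects magnetizing current.

S_BASE_MVA = 100.0   # transformer rating
X_PU = 0.10          # 10 % impedance

def series_mvar(loading_pu: float) -> float:
    """MVAR absorbed by the leakage reactance at a given per-unit loading."""
    return loading_pu ** 2 * X_PU * S_BASE_MVA

for load in (1.0, 0.75, 0.5):
    print(f"loading {load:4.2f} pu -> Q = {series_mvar(load):5.2f} MVAR")
# loading 1.00 pu -> Q = 10.00 MVAR  (the figure quoted above)
# loading 0.75 pu -> Q =  5.62 MVAR
# loading 0.50 pu -> Q =  2.50 MVAR
```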
There can be three situations:
(a) the transformer HV voltage (say 132 kV) is the same as the grid voltage of 132 kV;
(b) the transformer voltage at the time of synchronizing is at +10 % (about 145 kV);
(c) the transformer tap is at -10 % voltage.
Now I have a question: how will the MVAR consumed in the transformer vary under (b) and (c), and by how much? As I understand it, under (b) more MVAR will be absorbed in the transformer, and under (c) more MVAR will be drawn from the grid. But how much? And under what grid conditions would these options be required? (A rough numerical sketch of the three cases follows.)
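To put rough numbers on the three cases, here is a two-bus sketch of my own (not from the thread): the tap-adjusted sending-end voltage feeds the grid through the 0.10 pu leakage reactance, assuming a lossless reactance, 1.0 pu (100 MW) real power transfer, and the grid held at nominal voltage. The `var_flows` helper and the loading assumptions are mine, for illustration only.

```python
import math

# Two-bus sketch: tap-adjusted sending voltage -> 0.10 pu reactance -> grid.
# Positive Q means reactive power flowing toward the grid.
# Assumes lossless X and 1.0 pu real power transfer; numbers are illustrative.

S_BASE_MVA = 100.0
X_PU = 0.10
P_PU = 1.0          # real power pushed to the grid
V_GRID = 1.0        # grid held at nominal 132 kV

def var_flows(v_send: float) -> tuple[float, float, float]:
    """Return (Q sent, Q delivered to grid, Q absorbed in X), all in MVAR."""
    delta = math.asin(P_PU * X_PU / (v_send * V_GRID))  # load angle
    q_send = (v_send**2 - v_send * V_GRID * math.cos(delta)) / X_PU
    q_recv = (v_send * V_GRID * math.cos(delta) - V_GRID**2) / X_PU
    return (q_send * S_BASE_MVA, q_recv * S_BASE_MVA,
            (q_send - q_recv) * S_BASE_MVA)

for label, v in (("(a) tap at nominal", 1.0),
                 ("(b) tap at +10 %", 1.1),
                 ("(c) tap at -10 %", 0.9)):
    qs, qr, qx = var_flows(v)
    print(f"{label}: Q sent {qs:7.1f}, Q to grid {qr:7.1f}, "
          f"absorbed in X {qx:5.1f} MVAR")
# (a): Q sent     5.0, Q to grid    -5.0, absorbed in X  10.0 MVAR
# (b): Q sent   114.5, Q to grid    95.4, absorbed in X  19.1 MVAR
# (c): Q sent   -84.4, Q to grid  -105.6, absorbed in X  21.1 MVAR
```

Under these assumptions the sketch matches the intuition in the question: at +10 % the transformer pushes MVAR toward the grid and absorbs more in its reactance, while at -10 % the flow reverses and MVAR is drawn from the grid.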