davva
Marine/Ocean
- Sep 27, 2004
- 99
I have been asked to review some voltage drop calculations for an autotransformer-start induction motor scheme. The system is an island network with 2 GTA sets (~100 MW combined capacity), and the motor is an 11 kV, 15 MW induction motor.
The calculations have been performed by the machine vendor, who was given the minimum system fault level by a third party. The fault level calculations come from a power system model to which I do not have access.
The voltage drop calculations are also generated by the vendor's software and only give the headline results of the study. The fault level of the network is low, and the voltage drop calculations show that the machine can be started via the autotransformer while keeping within the voltage tolerance of the network, which is +/-10% — but only just: the calculated dip is 9.9%. The +/-10% tolerance is a steady-state tolerance.
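For context, the kind of headline dip figure the vendor's software reports can be sanity-checked with a simple impedance-divider estimate. The sketch below is mine, not the vendor's method, and every number in it (fault level, starting MVA, tap) is an illustrative assumption rather than the actual project data:

```python
# Rough voltage-dip estimate for autotransformer-assisted motor starting.
# All numeric values below are illustrative assumptions only.

def voltage_dip(fault_mva, start_mva, tap=1.0):
    """Per-unit voltage dip at the bus when a motor draws `start_mva`
    of locked-rotor apparent power from a source with short-circuit
    capacity `fault_mva`.  An autotransformer on tap `tap` (e.g. 0.65)
    reduces the line-side starting MVA by tap**2."""
    s_line = start_mva * tap ** 2          # MVA drawn from the network
    return s_line / (s_line + fault_mva)   # simple impedance-divider estimate

# Hypothetical numbers: a 15 MW motor with locked-rotor demand of
# ~90 MVA, started on a 65% autotransformer tap, against an assumed
# minimum fault level of 350 MVA.
dip = voltage_dip(fault_mva=350.0, start_mva=90.0, tap=0.65)
print(f"Estimated dip: {dip:.1%}")
```

With those assumed figures the estimate lands just under 10%, which illustrates how little margin a 9.9% result leaves against uncertainty in the fault-level input.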
I have a few general questions relating to the above.
1. If the machine vendor has used the minimum fault level data provided to them to calculate the voltage dip, then they are probably using the equivalent synchronous reactance (Xd) data for the network generators — is this correct? If so, given the slim margin, should they be modelling the generators more accurately with transient/sub-transient data?
2. A schematic diagram seems to suggest the scheme offered is an open-transition autotransformer starter. I am not aware of the transient tolerances on the network, but I wonder whether a closed-transition option should have been considered, or whether it is considered economically infeasible at 11 kV due to the additional switchgear and components.
3. I've read that the effects of open-transition switching include light flicker, possible data loss, and circuit breaker tripping. Would the generator AVRs be affected by the very large inrush currents of short duration, or is the open-transition time so short that they wouldn't have time to react anyway?
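On question 1, the reason the choice of generator reactance matters can be shown numerically. The sketch below uses typical per-unit reactance figures for a gas-turbine generator (assumed values, not the actual machine data) to show how strongly the modelled fault infeed — and hence the calculated dip — depends on which reactance the network model uses:

```python
# Sketch: how the generator reactance assumed in the network model
# changes the fault level.  GEN_MVA and the per-unit reactances are
# typical illustrative figures, not the actual project data.

GEN_MVA = 50.0          # assumed rating of each of the two GTA sets
REACTANCES = {
    "Xd  (synchronous)":    2.0,
    "X'd (transient)":      0.30,
    "X''d (subtransient)":  0.20,
}

def fault_mva(gen_mva, x_pu, n_gens=2):
    """Three-phase fault infeed from n identical generators, each
    modelled behind reactance x_pu on its own MVA base."""
    return n_gens * gen_mva / x_pu

for name, x in REACTANCES.items():
    print(f"{name:22s} -> fault infeed {fault_mva(GEN_MVA, x):6.0f} MVA")
```

With these illustrative values, a model based on Xd gives a fault infeed an order of magnitude lower than one based on X'd, which is why a "minimum fault level" derived from synchronous reactance can be very pessimistic for a dip lasting only a few seconds.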
Many thanks for any responses in advance.