jjustice
Electrical
- Aug 29, 2009
If I increase the output frequency of a given drive from 60 Hz to 120 Hz while holding the output voltage constant, what change should I expect in the losses of the output-side transformer? I am fairly sure winding losses will go up, but based on practical experience, by what factor? I suspect that halving the volts/hertz ratio will drop the flux density enough to offset any increase in core eddy and hysteresis losses, giving good no-load performance. I am concerned about cooling of the unit under load, however. Any help would be greatly appreciated!
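For what it's worth, here is the back-of-the-envelope scaling I'm basing that suspicion on, written out as a quick Python check. The classical loss-separation model and the numbers in it (hysteresis ~ f·B², core eddy ~ f²·B², a guessed 5% winding-eddy fraction) are my assumptions, not data for this transformer; a real answer would use the core manufacturer's loss curves.

```python
# Back-of-the-envelope loss scaling, 60 Hz -> 120 Hz at constant voltage.
# Assumptions (mine, not from a datasheet): classical loss separation,
# hysteresis ~ f * B^2, core eddy ~ f^2 * B^2.

def core_loss_scaling(f_ratio, b_ratio):
    """Scaling factors for the two core-loss components."""
    hysteresis = f_ratio * b_ratio**2        # ~ f * B^2
    core_eddy = f_ratio**2 * b_ratio**2      # ~ f^2 * B^2
    return hysteresis, core_eddy

# Constant voltage at double the frequency halves V/Hz, so B drops by half.
h, e = core_loss_scaling(f_ratio=2.0, b_ratio=0.5)
print(f"hysteresis scales by {h:.2f}x, core eddy by {e:.2f}x")
# -> hysteresis 0.50x, core eddy 1.00x: no-load loss holds or drops.

# Winding side: DC I^2*R is frequency-independent at the same load current,
# but winding eddy (stray) losses scale roughly with f^2. If they were,
# say, 5% of load loss at 60 Hz (a guess), doubling f quadruples them.
def load_loss_scaling(f_ratio, eddy_fraction=0.05):
    """Total load-loss scaling with a given winding-eddy fraction."""
    return (1 - eddy_fraction) + eddy_fraction * f_ratio**2

print(f"load loss scales by {load_loss_scaling(2.0):.2f}x")  # ~1.15x
```

Under those assumptions the no-load loss roughly holds or drops, which matches my intuition, but the f² growth of the stray winding losses is exactly the part that worries me for cooling under load, especially with the harmonic content a drive adds on top of the fundamental.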