37pw56gy
Electrical
- Jul 17, 2002
- 14
I have a five dollar bet with a co-worker on the correct answer to this question, so please give it your best!
Here’s the scenario: A small (about 1 HP) 120 VDC permanent magnet motor drives a reversible mechanism that imposes a varying mechanical load. The 120 VDC source is a simple full-wave rectifier without any sort of filtering. The rectifier is fed from 60 Hz, and the motor is the only load served by this source. Cable between the motor and its controller may be quite long (up to 2500 feet), but wire-to-wire capacitance is nil and AWG is selected to keep the resistive (IR) drop to an acceptable level. The mechanism has a torque limiter that prevents motor current in excess of about 15-20 amps.
The problem: Under normal load conditions, the motor RPM is somewhat less than its nameplate speed and terminal voltage is reduced in proportion to the IR drop. No problem so far, but this seems to be an important clue.
Under light load conditions, however, something strange (?) happens. The terminal voltage increases to a level substantially over the 120 VDC no-load source voltage. The voltage can be as high as 160 VDC at the motor terminals while the motor is freewheeling. The motor, its controller, wiring, etc., are generously over-designed and in no jeopardy of failure. What causes this phenomenon? How do I justify doing nothing about it to those who have nothing better to do than question my work and delight in watching me squirm for the one answer that so far has eluded me?
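For reference, assuming the rectifier is fed from nominal 120 VAC mains (the post does not say so explicitly), the crest of an unfiltered full-wave rectified sine is about 170 V and its mean about 108 V, so a 160 VDC reading at the terminals sits just under the crest. A quick arithmetic check:

```python
import math

V_rms = 120.0                    # assumed 120 VAC feed to the rectifier
V_pk = V_rms * math.sqrt(2)      # crest of the rectified waveform
V_avg = 2 * V_pk / math.pi       # mean of a full-wave rectified sine

print(f"peak:    {V_pk:.0f} V")  # ~170 V
print(f"average: {V_avg:.0f} V") # ~108 V
```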
I do have a theory: Consider the classical synchronous AC motor (or sync condenser) having an over-excited rotating DC field. In this condition, leading VARs are returned to the line to counteract lagging VARs thrown off by predominantly inductive loads. The motor is, in essence, a big capacitor, and a readily adjustable one at that. Terminal voltage at the motor would rise substantially if corresponding inductive loads were not present or if the motor were grossly over-excited relative to the VAR situation.
How does this relate to a small permanent magnet DC motor? Here’s the idea: Under light torque load, the permanent magnets represent the over-excited field of the sync motor. As with any DC motor, a back EMF is generated in the spinning armature. The back EMF follows the full-wave cyclic waveform. Because the source is full-wave rather than pure DC, this back EMF raises the terminal voltage during the declining portion of each half cycle. With no other load to absorb this energy, the terminal voltage rises substantially.
Interestingly, the voltage does not rise under heavy load conditions. Back EMF is reduced as RPM drops.
To test this theory, two experiments are proposed. First, substitute a filtered 120 VDC supply with little or no ripple. Does the overvoltage condition disappear? Second, feed the motor through a series diode and place a large capacitor across the motor terminals to absorb the back EMF. In both cases, the voltage/current values should return to the usual IR-drop relationship.
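The experiments can also be previewed numerically. Below is a minimal lumped-parameter sketch (every parameter value is an illustrative guess, not a measurement from this installation): a PM DC motor on an unfiltered full-wave rectified supply, where the rectifier diodes block reverse current, so whenever the back EMF exceeds the instantaneous supply voltage the terminals float at the back EMF alone. Armature inductance is neglected. If the theory holds, the cycle-averaged terminal voltage should climb toward the waveform crest at light load and fall back at heavy load.

```python
import math

def avg_terminal_voltage(T_load, V_pk=170.0, R=2.0, Ke=0.65, Kt=0.65,
                         J=0.01, f=60.0, I_max=20.0, dt=1e-4, t_end=4.0):
    """Cycle-averaged terminal voltage of a PM DC motor fed from an
    unfiltered full-wave rectifier.  R lumps armature plus cable
    resistance; I_max models the 15-20 A torque limiter.  All values
    are illustrative guesses, not data from the actual system."""
    w = 0.0                                             # rotor speed, rad/s
    vt_sum = n = 0
    for k in range(int(t_end / dt)):
        t = k * dt
        vs = V_pk * abs(math.sin(2 * math.pi * f * t))  # rectified supply
        e = Ke * w                                      # back EMF
        i = min(max((vs - e) / R, 0.0), I_max)          # diodes block i < 0
        vt = vs if i > 0 else e                         # terminals see EMF when diodes are off
        w = max(w + dt * (Kt * i - T_load) / J, 0.0)    # J*dw/dt = Kt*i - T_load
        if t > t_end - 0.1:                             # average the final cycles
            vt_sum += vt
            n += 1
    return vt_sum / n

light = avg_terminal_voltage(T_load=0.2)  # near-freewheeling
heavy = avg_terminal_voltage(T_load=4.0)  # roughly rated torque for ~1 HP
print(f"light load: {light:.0f} V average at the terminals")
print(f"heavy load: {heavy:.0f} V average at the terminals")
```

Replacing `vs` with a constant 120 V (the first proposed experiment) is the corresponding check in the same model: with a ripple-free source the back EMF can never be pumped above the supply, so the overvoltage should vanish.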
Anyone with suggestions that might provide a more rigorous explanation? Many of those in my particular electrical engineering specialty have no experience with sync motors.