lmc82
Military
- May 31, 2012
- 5
Hi,
I've searched this thread and can't find an answer to the following question, so here goes.
We have a camera pan and tilt gimbal with direct-drive 3-phase brushless DC motors mounted on each axis. We have the control system set up so that the camera points at one spot regardless of the motion of the base (stabilisation). We have added a slight drift to the azimuth stabilisation control (this makes the following problem repeatable); the movement is approximately 2 electrical degrees every 5 minutes. When we start the system up and turn on stabilisation, everything works fine.
Now the problem is this. We excite the base with a vibration profile for approximately 10 minutes while the gimbal is stabilising. The PWM duty cycle applied to the azimuth motor varies between 10% and (close to) 100%. Once the vibration profile is turned off, we leave the gimbal stabilising. Very slowly the control system increases the PWM duty cycle to the azimuth motor, which increases the current until the overcurrent protection is tripped. The output torque of the motor drops as the PWM duty cycle is increased, so it looks like the control system is increasing the duty cycle because the motor is somehow losing output torque.
The commutation is trapezoidal, with complementary PWM applied to the low-side gate (e.g., the high-side gate of Phase A is switched on/off, the low-side gate of Phase A is switched on/off opposite to the high side to avoid shoot-through, and the low side of Phase C is always on). We have also tried trapezoidal commutation without complementary PWMing (i.e. letting the freewheeling diode handle the recirculating current instead of the gate). The switching devices are IGBTs.
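In case it makes the scheme clearer, here is a rough sketch of the gate pattern I'm describing, written out as a six-step table in C. The names and layout are illustrative only (it assumes complementary PWM on the driven leg and dead time handled in hardware); it is not lifted from our actual firmware.

#include <stdio.h>

/* Drive state for one switch. */
typedef enum { GATE_OFF, GATE_ON, GATE_PWM, GATE_PWM_COMPL } gate_t;

/* Gate states for all six switches during one commutation step. */
typedef struct {
    gate_t high[3];   /* high-side gates, phases A, B, C */
    gate_t low[3];    /* low-side gates,  phases A, B, C */
} step_t;

/*
 * Six-step trapezoidal pattern: in each step the sourcing phase's high side
 * is chopped, its own low side carries the complementary PWM (dead time
 * assumed in hardware), the sinking phase's low side is held on, and the
 * third phase floats. Without complementary PWM the GATE_PWM_COMPL entries
 * become GATE_OFF and the freewheeling diode carries the recirculating
 * current instead.
 */
static const step_t six_step[6] = {
    /*        high A, B, C                      low A, B, C                      */
    { { GATE_PWM, GATE_OFF, GATE_OFF }, { GATE_PWM_COMPL, GATE_ON,  GATE_OFF } }, /* A -> B */
    { { GATE_PWM, GATE_OFF, GATE_OFF }, { GATE_PWM_COMPL, GATE_OFF, GATE_ON  } }, /* A -> C */
    { { GATE_OFF, GATE_PWM, GATE_OFF }, { GATE_OFF, GATE_PWM_COMPL, GATE_ON  } }, /* B -> C */
    { { GATE_OFF, GATE_PWM, GATE_OFF }, { GATE_ON,  GATE_PWM_COMPL, GATE_OFF } }, /* B -> A */
    { { GATE_OFF, GATE_OFF, GATE_PWM }, { GATE_ON,  GATE_OFF, GATE_PWM_COMPL } }, /* C -> A */
    { { GATE_OFF, GATE_OFF, GATE_PWM }, { GATE_OFF, GATE_ON,  GATE_PWM_COMPL } }, /* C -> B */
};

int main(void)
{
    static const char *names[] = { "off", "on", "pwm", "pwm-compl" };
    /* Print the step from the example above: Phase A sourcing, Phase C sinking. */
    const step_t *s = &six_step[1];
    for (int p = 0; p < 3; p++)
        printf("phase %c: high=%s low=%s\n", 'A' + p, names[s->high[p]], names[s->low[p]]);
    return 0;
}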
The reasons I can think of for losing torque are:
1. The control system is applying power to the wrong phase. This doesn't seem to be the problem since it doesn't occur when the system is cold.
2. There is a mechanical problem with friction that only occurs after vibration. I'm about to confirm that this isn't the problem; however, I really don't think this is the case.
3. The motor core is saturating (or something?), since we are essentially not moving and PWMing one phase, which is increasing the back EMF and reducing torque (the relations I'm reasoning from are sketched just below this list). However, I would expect a drop in current in that case (or am I thinking about this incorrectly?), but the current increases to the same level it would reach at maximum torque.
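For reference, the relations I'm reasoning from in point 3 are just the standard per-phase BLDC ones, not anything measured on our rig:

v = R\,i + L\,\frac{di}{dt} + k_e\,\omega, \qquad T = k_t\,i

where v is the applied phase voltage (set by the duty cycle), i is the phase current, R and L are the winding resistance and inductance, k_e and k_t are the back-EMF and torque constants, and \omega is the rotor speed. These assume an unsaturated core, which is exactly the assumption I'm questioning above.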
Any ideas or suggestions would be greatly appreciated.
Cheers,
Leon