Grimo
Electrical
- Aug 13, 2008
We have two geared motors driving into a common load. The actual application is a CCD rake drive, but it's easier to describe if you imagine it as a cement kiln with two geared motors driving the girth gear instead of one. Both motors are the same make and type (Weg), same power (5.5 kW), same speed (1440 rpm) and same current rating (10.8 A), started DOL (no VSD).

The drive train starts with vee belts from the motor to a flat helical box, which is shaft mounted onto the input of a worm box; the worm box output goes into a set of spur gears, and the output shaft from the spur gears carries the pinion that meshes with our equivalent of the girth gear. Overall reduction from motor to pinion is around 800:1.

Each drive has a strain gauge fitted to the rear of its worm shaft to measure torque via the movement of that shaft as the load increases. This signal protects the system from overload by monitoring total torque, but the two individual signals are also compared, and the system trips if the torque imbalance between drives exceeds 30%.

My question is: how can an increase in load cause one motor to see the increase before the other when they're connected by a girth gear that has no "give" in it the way a conveyor might? Surely they would see it at the same time, yet we are plagued with constant imbalance trips.

I suspect these trips are more likely related to problems in the mechanical drive train, or to the strain gauges needing calibration (calibration takes about 10 hours and it's difficult to get a production release for that long). We presently have no means of accurately measuring motor work current as an indication of torque (which could be compared with the strain gauges to see if the trends match), but I intend to fit units to do this in the next few weeks.

Am I on the right track here? Any comments would be gratefully received.
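For anyone who wants to put rough numbers on this, here is a quick sanity check in Python using the nameplate figures above. The imbalance formula (difference relative to the mean of the two gauge signals) is only my guess at how the comparison is done, not something confirmed against our trip logic, and gearbox losses are ignored; the example readings are made up.

```python
import math

# Nameplate figures from the post
P_KW = 5.5              # motor rated power, kW
N_RPM = 1440.0          # motor rated speed, rpm
RATIO = 800.0           # overall reduction, motor shaft to pinion (approximate)
TRIP_IMBALANCE = 0.30   # trip threshold quoted above

def motor_torque_nm(p_kw: float, n_rpm: float) -> float:
    """Full-load torque at the motor shaft: T = P / omega."""
    omega = 2.0 * math.pi * n_rpm / 60.0   # rad/s
    return p_kw * 1000.0 / omega

def imbalance(t1: float, t2: float) -> float:
    """Assumed imbalance measure: difference relative to the mean of the two signals."""
    mean = (t1 + t2) / 2.0
    return abs(t1 - t2) / mean if mean else 0.0

if __name__ == "__main__":
    t_motor = motor_torque_nm(P_KW, N_RPM)   # about 36.5 N.m per motor
    t_pinion = t_motor * RATIO               # reflected torque, losses ignored
    print(f"Full-load torque per motor : {t_motor:6.1f} N.m")
    print(f"Reflected at each pinion   : {t_pinion:8.0f} N.m (efficiency ignored)")

    # Hypothetical gauge readings: one shaft at 10 kN.m equivalent, the other at 13.6 kN.m
    t1, t2 = 10_000.0, 13_600.0
    r = imbalance(t1, t2)
    print(f"Imbalance = {r:.1%} -> {'TRIP' if r > TRIP_IMBALANCE else 'OK'}")
```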
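And a rough sketch of the comparison I have in mind once the current-measuring units are fitted: normalise the logged strain-gauge trend and the work-current trend for the same drive and check how closely they track each other. The sample values and function names here are purely illustrative.

```python
from statistics import mean, pstdev

def normalise(xs):
    """Scale a signal to zero mean, unit std dev so shapes can be compared, not magnitudes."""
    m, s = mean(xs), pstdev(xs)
    return [(x - m) / s for x in xs] if s else [0.0] * len(xs)

def correlation(xs, ys):
    """Pearson correlation between two equal-length logged trends."""
    nx, ny = normalise(xs), normalise(ys)
    return sum(a * b for a, b in zip(nx, ny)) / len(nx)

# Hypothetical samples for one drive: strain-gauge torque (kN.m) and work current (A)
gauge_1   = [9.8, 10.1, 10.4, 11.0, 12.5, 13.4]
current_1 = [7.9,  8.0,  8.1,  8.4,  9.3,  9.9]

# Close to 1.0 would suggest the gauge is tracking real load; a poor match would
# point at the gauge/calibration rather than a genuine mechanical imbalance.
print(f"Gauge vs current correlation: {correlation(gauge_1, current_1):.2f}")
```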