Motor Controller Fundamentals


DBCox

Automotive
Apr 9, 2003
58
Hi everyone, I have what may be a really dumb question. A DC motor controller controls the speed of the motor by adjusting the output voltage. This voltage is usually linearly related to the speed of the motor. The needed current is also supplied to the motor through the controller (assuming the maximum is not exceeded). So here's where I get confused. If I am using a 24 VDC system with a motor that runs at 2400 rpm @ 24 V, that's 100 rpm/V. The 24 V power source is made up of two 12 V batteries in series. So, if I want to run the motor at 1200 rpm, 12 V is sent to it. Now, let's say that motor requires 50 amps at the given load. That equals 600 watts of power. If I assume 100% efficiency of the system, what is being drawn from the batteries? 12 V and 25 amps each (600 W), 6 V and 50 amps each (600 W), or 12 V and 50 amps each (1200 W)?

That leads me to another question. If I am drawing less than the maximum voltage from the battery, how does that affect its power/reserve rating? Can I "back-calculate" a watt-hour rating from the amp-hour rating and adjust for different voltages? This may be an unnecessary question because I *think* the voltage remains constant and the current adjusts to supply the proper power, but I would like to verify that.

Another issue is that I assumed 100% efficiency in my example. I am sure that is not the case, and I am pretty sure the controller and battery efficiencies fluctuate with output voltage and current. This is probably component-specific, but is it typical to see very large fluctuations?

I hope that makes sense. Thanks for the help!
 

Your motor controller probably uses some type of switching power converter (PWM buck, etc.) to alter the output voltage; otherwise the system would be terribly inefficient. As a result, the current at 12 V is not the same as the current that flows through the batteries.

Assuming perfect conversion (i.e., no losses in the controller; in reality maybe 75-95% efficiency), the current through your batteries at 12 V output would be about 25 A rather than 50 A. In the DC case P = VI, and if there is no power loss in the converter then P(24 V side) = P(12 V side). Since the output voltage is half the pack voltage, the output current must be double the input current, which puts the battery current at 50/2 = 25 A. Since the batteries are in series, the same current flows through both, and assuming they are identical batteries (again P = VI), the power supplied by each would be 12 × 25 = 300 W per battery. Of course this is the ideal case. In reality there are losses due to battery internal resistance, conversion from 24 V to 12 V, line resistance, etc.
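A minimal numeric sketch of that ideal power balance, using the numbers from the question:

```python
# Minimal sketch of the ideal (lossless) power balance described above.
# Numbers come from the original question: 24 V pack, motor at 12 V / 50 A.

V_pack = 24.0    # two 12 V batteries in series
V_motor = 12.0   # controller output voltage
I_motor = 50.0   # motor current at this load

P_motor = V_motor * I_motor            # 600 W delivered to the motor
I_pack = P_motor / V_pack              # lossless converter: P_in = P_out
P_per_battery = (V_pack / 2) * I_pack  # identical series batteries share the current

print(f"Pack current: {I_pack:.0f} A")              # 25 A
print(f"Power per battery: {P_per_battery:.0f} W")  # 300 W each, 600 W total
```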

The battery voltage should remain relatively constant (decreasing as it discharges); the current supplied by the batteries is the variable quantity. How long the battery can sustain a given current is a property of the battery chemistry, size, etc. Typically, though, an amp-hour (Ah) rating is given for a certain discharge rate, and the battery will exhibit different effective Ah ratings at different currents. Typically the effective Ah rating goes down significantly as current is increased beyond the current used to compute the rating.
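For the back-calculation asked about earlier, here's a rough sketch (assuming the nominal voltage holds across the discharge, and using a hypothetical 100 Ah / 12 V battery):

```python
# Rough back-calculation of energy from an amp-hour rating, as asked above.
# Assumes the nominal voltage holds across the discharge, which is only
# approximately true; the Ah figure itself is valid only near the discharge
# rate it was specified at.

def watt_hours(amp_hours: float, nominal_voltage: float) -> float:
    return amp_hours * nominal_voltage

# Hypothetical 100 Ah battery at 12 V nominal:
print(watt_hours(100, 12))   # 1200 W·h near the rated discharge current
```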

Typically, controllers and batteries both become less efficient as more current is demanded of them. With the controller you will probably notice <10% change in efficiency going from the specified minimum to the specified maximum current, but the battery efficiency drop can be quite dramatic depending on the battery chemistry.
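To see what those efficiency figures mean for the battery draw, here's a rough sketch using the 75-95% range quoted above (the specific values are illustrative, not from any particular controller):

```python
# Sketch of how controller efficiency changes the battery draw.
# The 75-95% range is the ballpark quoted earlier in the thread;
# real figures come from the controller's datasheet.

P_motor = 600.0   # W at the motor, from the example above
V_pack = 24.0

for eta in (0.75, 0.85, 0.95):
    P_batt = P_motor / eta    # converter losses show up as extra input power
    I_batt = P_batt / V_pack
    print(f"eta={eta:.0%}: battery supplies {P_batt:.0f} W ({I_batt:.1f} A)")
```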

Hope that helps you out.
 
Hi blakrapter.

I agree with MrBananas.
The current in both batteries is the same since they are connected in series.
The battery voltage is almost constant within the load capacity range, i.e. over the battery's ampere-hour life.
The controller or drive changes the voltage output to the motor, not the input voltage.
See sketch below.
[sketch attachment: dpkf39.jpg]
 
A couple of thoughts to add:
The motor speed vs. voltage relationship is linear only over a limited range.
Motor windings are optimized for a certain voltage range;
if you pump too much current, you'll saturate the winding.
This happens when you try to run a high-voltage motor from a low-voltage supply. You need to look at the speed-torque (current) curves.
 
I say it's not just a simple answer of power in vs power out.

Consider this. A PWM converter running from 24 V will be supplying either 24 V or 0 V; it's a switch that is either on (24 V) or off (0 V). Running at 12 V output, it's supplying 24 V for 50% of the time. To get 50 A average current requires 100 A at a 50% duty cycle. So you have 24 V × 100 A × 50% = 1200 W.

But, depending on the converter and motor, you may also have back-EMF current flowing through a flyback diode during the PWM off-time. In that case you really are getting 50 A flowing continuously, so during the PWM on-time the current is 50 A. So, 24 V × 50 A × 50% = 600 W.

So I guess I'm saying you would really have to measure the switch duty cycle and current flow to know. Or you need to measure the current on the battery side of the converter. You really can't rely on the motor-side current and voltage.
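To make the two cases concrete, here's a minimal numeric sketch of the two battery-power calculations described above, using the thread's example numbers:

```python
# Sketch of the two cases described above for a 50% duty-cycle PWM output.
# "Continuous" assumes a freewheel (flyback) diode keeps the 50 A motor
# current flowing during the off-time, so the battery only sources current
# during the on-time. "No freewheeling" is the case where current flows
# only while the switch is on, so a higher peak is needed for the same average.

V_pack = 24.0
duty = 0.5          # 12 V average output from a 24 V pack
I_motor_avg = 50.0

# Continuous conduction: battery sees the motor current, gated by the duty cycle.
P_cont = V_pack * I_motor_avg * duty   # 600 W

# No freewheeling: peak current must be I_avg / duty to keep the same average.
I_peak = I_motor_avg / duty            # 100 A
P_disc = V_pack * I_peak * duty        # 1200 W

print(P_cont, P_disc)
```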



 
Thanks for the answers, guys. The reason I asked is that I will probably be running a 24 V system but would like more battery capacity. I have room for one more battery, which means I can run 12 V or 36 V, and 12 V isn't enough. Most of the controllers I have looked at can handle 36 V without a problem, but allow you to limit the output to whatever you want. The idea is to use 36 V in but only allow 24 V out, thus protecting the motors and adding 50% more battery capacity. The trick now is finding a controller that can do this efficiently. I know all speed controllers can, because that is their purpose (at least the ones I have been looking at, which have all been PWM), but I have to check the efficiencies. If the efficiency drops too low, it may not be worth it. Thanks for the help!
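As a quick sanity check on that arithmetic (assuming three identical batteries, and ignoring the rate effects and converter losses discussed elsewhere in the thread):

```python
# Quick check of the "36 V in, 24 V out" idea: a third series battery adds
# 50% more stored energy, and a PWM controller limited to 24 V output keeps
# the motor within its rating. Assumes identical batteries; ignores rate
# effects and converter losses.

amp_hours = 100.0   # hypothetical per-battery rating
V_batt = 12.0

E_two = 2 * V_batt * amp_hours    # 2400 W·h with the 24 V pack
E_three = 3 * V_batt * amp_hours  # 3600 W·h with the 36 V pack

print(f"Extra energy: {E_three / E_two - 1:.0%}")  # 50%
```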
 
The amount you can get out of a battery decreases with increasing current draw. You will get less total energy at the one-hour rate than you will at the four-hour rate. For example, a 100 amp-hour battery:
1 hour rate = 100 amps
4 hour rate = 25 amps
20 hour rate = 5 amps
Total energy @ 20 hr rate > energy @ 4 hr rate > energy @ 1 hr rate.
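Peukert's equation is a common empirical model for this rate effect; here's a rough sketch using the 100 amp-hour example above (the exponent k = 1.2 is an assumed value; it is chemistry-dependent, roughly 1.1-1.3 for lead-acid):

```python
# Sketch of the rate effect using Peukert's equation, a common empirical
# model for lead-acid batteries: t = H * (C / (I*H))**k, where C is the
# rated capacity at the H-hour rate, I is the actual discharge current,
# and k is a chemistry-dependent exponent (assumed 1.2 here for illustration).

def runtime_hours(I, C=100.0, H=20.0, k=1.2):
    return H * (C / (I * H)) ** k

for I in (5.0, 25.0, 100.0):   # roughly the 20-, 4-, and 1-hour rates above
    t = runtime_hours(I)
    print(f"{I:5.0f} A -> {t:5.2f} h, {I * t:6.1f} Ah delivered")
```

With these assumed numbers, the delivered capacity drops from 100 Ah at the 20-hour rate to roughly 72 Ah at 25 A and 55 Ah at 100 A, which is the trend described above.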

 