Eng-Tips is the largest engineering community on the Internet

Intelligent Work Forums for Engineering Professionals

Voltage reduction w/low heat...possible?

Status
Not open for further replies.

MacGyverS2000

Electrical
Dec 22, 2003
8,504
I'm looking for ways to reduce (and regulate) a voltage source from +12.5-14.5 V down to an even +12.0 V, with a current range around 3-4 A. This is a small, inexpensive item, so regulator size and cost is a major concern. It will end up in a car and be handled from time to time, so heat is also a major concern.

Shunt regulators are out of the question due to guaranteed high heat from wasted energy in the resistors (I'm not too concerned about the wasted energy itself, though). Linear regulators are the current candidate, but again, the high heat involved in dropping 10+ W doesn't make me a huge fan of those, either. The only other option I can think of is a separate switching regulator IC, but the cost of that would more than likely be prohibitive.

Am I overlooking any options, or am I stuck choosing between these?
 
A "Simple Switcher" needn't cost an arm, and certainly not a leg. Have a look at National Semiconductor's site. Look for "Simple Switcher". Not expensive at all. Good luck.
 
Yeah, the more I look at things, though, it seems as if switching supplies really begin to shine when the input voltage is significantly different from the output voltage. For example, a linear regulator will burn through 7.5W at Vi=14.5V and Io=3A. A switching regulator with 85% efficiency will drop that only a tad, to 6.4W. At Vi=12.5V, the linear regulator actually wins hands down by burning a "mere" 1.5W.

8W is completely unacceptable. The plastic case would not be just warm, it would be extremely hot! I need a stable voltage for a long string of LEDs so their currents can be calculated and remain constant over varying input voltages. This is the last major hurdle in my project, and I'm now regretting putting it off until the end :(
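For reference, the loss figures quoted in the comparison above are easy to reproduce. A quick sketch, assuming an idealized linear regulator (full drop times load current) and a constant switcher efficiency:

```python
def linear_loss(vin, vout, iout):
    """Heat dissipated by a linear regulator: the full voltage drop times the load current."""
    return (vin - vout) * iout

def switcher_loss(vout, iout, efficiency):
    """Heat dissipated by a switching regulator at a given constant efficiency."""
    p_out = vout * iout
    return p_out / efficiency - p_out

# Worst case: 14.5 V in, 12 V out, 3 A
print(round(linear_loss(14.5, 12.0, 3.0), 2))    # 7.5
print(round(switcher_loss(12.0, 3.0, 0.85), 2))  # 6.35
# Best case for the linear regulator: 12.5 V in
print(round(linear_loss(12.5, 12.0, 3.0), 2))    # 1.5
```

Real converter efficiency varies with load and input voltage, so the 85% figure is only a round-number assumption.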
 
I see your problem. And I guess that you have tried the other options, like going for more efficient LEDs. One thing that I actually did once was to PWM the current through the LEDs by using a comparator that sensed the voltage drop across a resistor between the last LED in the chain and GND. The comparator switched the LED string ON/OFF. The switching frequency can be kept fairly low (controlled by filtering the sense voltage with a 2-5 millisecond time constant) to minimise switching losses (100 Hz is OK to the eye), and you do not lose more than a few hundred millivolts if you drive the transistor correctly. Using a low-resistance FET and a low switching frequency will probably be your best choice. And you don't need any inductor!
 
I wish I could throw my entire design in front of a room full of engineers, but unfortunately for me I'm a lone engineer ;)

My design entails a large number of LEDs in parallel (well, each "LED" is actually 2 LEDs in series for an increased voltage drop). The combined voltage drop of 2 LEDs can be in excess of 10V. I have set the max current in each LED through individual resistors. If I had a stable supply voltage everything would be golden, but a car's voltage swings from about 12.5V when off to 14.5V when the alternator is running. With such a small voltage differential between the LED voltage drop and the supply, minor changes in the supply equal quite large changes in the LED current (the current set at 12.5V can double or more at 14.5V, a very bad thing).

So, I considered putting in some sort of stable supply (but currents could change slightly as LED forward voltages varied from lot to lot). I also considered actually increasing the supply voltage with a boost converter to increase the voltage differential between supply and LED (and thereby reducing current swings as the LED forward voltage changed from lot to lot).

I think the sensible thing to do now might be to reduce the LED series combinations to individual packages to increase the voltage differential between supply and LED, leave the supply voltage alone and let it vary a couple of volts, and count my blessings that the current won't change more than about 5 mA from one supply-voltage extreme to the other. It's the chicken's way out and does NOT make me a happy camper, but it's inexpensive and simple.

Of course, I'm always open to ideas from you guys.
 
This idea has good and bad sides to it.

What if I use a boost converter to up the voltage to a stable 16V (no specific reason for 16V, other than 3 LEDs in series could max out to about 15V)? I use a transistor in diode configuration (base connected to collector) with a resistor between the collector and supply, sized to allow a specific current through the transistor. The base of the transistor is connected to other transistors in a current mirror configuration (all bases tied together). This way, only one resistor is needed, the supply is stable, and each set of LEDs will have equal current through them.

Of course, the down side is power dissipation doesn't change much (assuming I'm looking at this correctly). Skipping all of the calculations, the converter will still be around 85-90% efficient. I trade off voltage for current and vice versa, but I still have to dissipate 6-8W of power somewhere.
 
Is this a onesie application? MPJA.com has surplus HP units (in plastic case) 9-24V input and 12V 4A out
 
This will be a several-hundred to several-thousand units per year project; that's why it has to be perfect the first time out.
 
What about the charge pumps for LEDs that are announced for laptops and PDAs? The efficiency is good. Maybe there's a way to translate the application into a higher current one like yours.
 
LED drive needs current limiting rather than voltage regulation. A simple switcher with an output inductor sized so it never saturates and limiting the current to the appropriate level will give you exactly that. A linear regulator will give you a lot of heat if it is designed for the voltage ranges you give.
 
Or just put in a series diode, or two, on the input to the linear regulator. The diodes will dissipate some of the heat, so the regulator won't have to...
 
felixc, the problem with that method is expense. I would need quite a number of those ICs (and associated components) to regulate all of the LEDs. It would be a perfect solution if I was only using a handful, but I'm talking hundreds scattered over a wide area.


Brian, the voltage regulation was not directly for the LEDs, but a method to allow me to control the current (within certain bounds). I know the Vf for LEDs will vary from piece to piece, but the minor voltage changes there would vary my current by a few percent, quite acceptable. I've come to the conclusion that a linear regulator will not work for this situation, due not only to the heat generated, but also to the low voltage drop required in the regulator at certain supply voltages.


melone, Diodes or not, I'm still left with 8W of dissipated power to dump into the atmosphere. They would also lower my available supply headroom in the regulator, so even with a LDO regulator I wouldn't have enough leeway to stay in regulation.


I'll be running some tests in the next few days to determine if I can handle the brightness drop from one supply voltage to the other. No other manufacturer I've found so far regulates the voltage, but they also have some advantages I do not, which allows them to get away with it.
 
I just realised that you don’t have to use a regulator that handles the range between 12.0V and 14.4V. The battery is either being charged or it isn't. In other words, the engine is either running or it isn’t. You could therefore use a MOSFET to switch a resistor out of circuit when the low level voltage is encountered.
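A quick sketch of how the two resistor values in that suggestion might work out. The 10 V string drop and 20 mA target here are assumed figures borrowed from elsewhere in the thread, not measured ones:

```python
# Two-state resistor idea: size one resistor for the engine-off voltage,
# and have a MOSFET switch an extra series resistor in while charging
# (i.e. switch it out of circuit at the lower voltage).
V_STRING = 10.0   # assumed combined LED forward drop
I_TARGET = 0.020  # assumed 20 mA target current

r_engine_off = (12.5 - V_STRING) / I_TARGET  # resistance needed with the engine off
r_engine_on  = (14.5 - V_STRING) / I_TARGET  # total resistance needed while charging
r_extra      = r_engine_on - r_engine_off    # the part the MOSFET switches in/out
print(round(r_engine_off), round(r_engine_on), round(r_extra))  # 125 225 100
```

So the FET only has to handle switching a ~100 ohm element, with the current held near 20 mA in both states.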
 
LM2642 or LM5642 might work for you. Almost everybody has some kind of buck chip. You may want to contact an applications engineer at one of the many chip manufacturers, i.e. TI, National, Fairchild, International Rectifier, or Analog Devices, etc. For the range you want, you should be able to get 95% efficiency or more. I just don't know if, at 12.5V in, you will have enough overhead for 12V out. You may try splitting the power between two Simple Switchers. This should get the efficiency up and spread out the heat.

Good luck!
 
You might try using the FET that is pulsing the LEDs and changing the pulse width depending on the voltage to maintain the brightness level.
 
The solution to this may be simpler than we are all thinking. Some more information may be helpful here in diagnosing a correct fix for your problem, but I will throw out some ideas.

First off, if your goal is to be as efficient as possible, you may want to look the other way from where you are going. That is to say, you don't have to reach the voltage drop of the LED to power it; you just have to be close. For instance, if you are using 5V LEDs, as you have sort of hinted, you can try using 3 in series without a resistor at all. The circuit should stay safe until you get over about 14.7 volts. Throw some stuff on a breadboard and see what kind of current you are pulling. If you stay below the max current, you should be safe without any resistors/power supply. You will get lower brightness per LED at the lower voltages, but you will be making up for this with more LEDs.

Secondly, you may want to look into other LEDs. You may be able to get an LED that uses a different technology and has a lower minimum voltage requirement. If you can get the LEDs down to, say, 4V each, your resistor will be bleeding off 4-6 volts. You can tune the system to give 22 mA at max (assuming 20 mA LEDs), which will decrease the life only slightly (10%). At this level you will still be able to pull approximately 15 mA at the lowest voltages you are likely to encounter. This will not produce a profound difference in brightness (20% or so, depending on the LED).

Both of these suggestions assume you are not using the system for headlights, as you may run into certification problems with LEDs there: they will not run as low as the car will (7V or so before you no longer get ignition) and could pose safety problems.
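The second suggestion's numbers can be checked under its own assumptions (two 4 V LEDs per string, matching the original two-LEDs-plus-resistor arrangement, with the resistor tuned for 22 mA at the 14.5 V charging voltage):

```python
# Lower-Vf LED suggestion: 4 V LEDs, two in series = 8 V string,
# resistor tuned for 22 mA at the 14.5 V (alternator-on) extreme.
V_STRING = 2 * 4.0
r = (14.5 - V_STRING) / 0.022   # resistor value, ~295 ohms
i_low = (12.5 - V_STRING) / r   # current with the engine off
print(round(r, 1), round(i_low * 1000, 1))  # 295.5 15.2
```

That reproduces the "approximately 15 mA at the lowest voltages" figure from the post.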
 
pezas,

Very astute observation skills you have there. I am indeed limiting myself to two LEDs and a resistor due to the large Vf of the LEDs. With such a large drop across the LEDs, I have very little voltage across the resistor. For argument's sake, let's assume the supply will range from 12.5-14.5V and the LEDs grab 10V for themselves. This leaves a resistor dropping anywhere from 2.5-4.5V. If I specify a resistor for 20mA at 14.5V (225 ohms), current drops to 11mA at 12.5V, a drop of nearly 50%...not insignificant, for sure. I could mitigate the percentage variation by moving to a single LED, but the current requirements would double (again, not insignificant). I'm almost damned if I do, damned if I don't.
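The resistor arithmetic in that paragraph works out as follows (using the post's assumed 10 V string drop and 20 mA target):

```python
# Worked numbers from the post above: a 10 V LED string drop and a
# resistor sized for 20 mA at the 14.5 V (alternator-on) extreme.
V_LED = 10.0
r = (14.5 - V_LED) / 0.020            # 225 ohms
i_low = (12.5 - V_LED) / r            # current at the 12.5 V extreme
drop_pct = (1 - i_low / 0.020) * 100  # percentage drop in LED current
print(round(r), round(i_low * 1000, 1), round(drop_pct))  # 225 11.1 44
```

The exact figure is a bit over 44%, which the post rounds up to "nearly 50%".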

While driving LEDs strictly with voltage control may have merit in some applications, I think it could go one of two ways: either 1) dangerous for the LED, as a higher voltage allows beyond-spec currents to flow, or 2) dismal performance, as too low a voltage makes the LEDs glow too dimly. Finding a middle ground would be quite difficult for the reasons mentioned in my last paragraph. Even if I could guarantee a slightly tighter supply range, finding LEDs with just the right Vf to fit within that range would be an accomplishment unto itself.

I'm getting the feeling I may have to bite the bullet and accept a fairly wide range of brightnesses based upon whether or not the car is running. I do have a few minor items to my advantage. First, since LEDs are square-law devices, dropping the current by 50% doesn't mean I lose half of the brightness (that's what, uh, a 25% drop in brightness, give or take?). Second, as the current drops, the Vf of the LEDs drops slightly, giving a bit more headroom to the resistor... so, as the supply voltage drops the 2V or so, it may only manifest itself as a 30% drop in actual current compared to the first-order calculation of 50% (I'm pulling rough figures out of the air). Both combined, I may see a brightness loss of 15-20% <crossing fingers>. As long as this brightness level remains the same for a long period of time (no pulsing or fluctuation), this may be acceptable.

Does that sound like a fair assessment?
 
I had to test my theory for peace of mind, so I hooked up several LEDs (to also check lot-to-lot differences). While there was a noticeable difference in brightness between 100% and 50% current, I don't believe the difference is significant enough to warrant further worry... you would notice a drop in brightness as the car is turned off, but the brightness should still be substantial. Of course, I'll still think on it from time to time, but I don't consider it enough of a problem to squander my precious resources on. See my further replies below for why.

The following is also for future reference and newbies...

According to the charts, the LED is rated at Vf=4.6V at 35mA. I measured 4.20V @ 30mA and 3.53V @ 15mA. Now, I changed from a supply voltage of 5.91V to 4.25V (from another viewpoint, the voltage across the resistor went from 1.71V to 0.72V). The supply voltage changed 1.66V, but the voltage across the current-controlling resistor only changed by 0.99V due to the LED's Vf dropping with decreased current. A first-order approximation then says a change in supply voltage only causes a 60% change in current (I'm neglecting the non-linear nature of changes in Vf versus If).

That being the case, and going back to actual component values (nominal specs), the current through my production version will probably only drop about 20% between when the car is running and when it's off (which begins to line up with the first-order approximation/guesstimate I made in my previous post). If a 50% current drop shows a mostly insignificant change in brightness (as I've proven to myself here on the lab bench), I think the change in brightness for a 20% current drop will be all but unnoticeable.
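As a sanity check, the bench arithmetic above can be reproduced in a few lines:

```python
# Bench measurement from above: the supply moved from 5.91 V to 4.25 V,
# while the voltage across the current-setting resistor moved from
# 1.71 V to 0.72 V, because the LED's Vf sags as its current falls.
dv_supply   = 5.91 - 4.25   # 1.66 V change in supply
dv_resistor = 1.71 - 0.72   # 0.99 V change across the resistor
coupling = dv_resistor / dv_supply
print(round(coupling * 100))  # 60 -> ~60% of a supply change reaches the resistor
```

This is only a first-order approximation, as the post notes; the Vf-versus-If curve is non-linear, so the ratio will vary somewhat across the range.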
 
Just for grins, if you are using 4.6V LEDs, try hooking 3 of them up without a resistor and measure the through current when the supply voltage is between 12-15V. Your brightness should be based on this current. You may find that you don't need a resistor at all (or possibly 10 ohms or something like that if you are really worried). Don't worry about LED life if you are staying below the rated current of the LED.
 