MacGyverS2000
- Dec 22, 2003
- 8,504
First off, many thanks go to steveowens for taking time out of his busy schedule to discuss my circuit ideas, both through email and on the phone. Kudos! He has certainly helped me move from more complicated ideas to something simpler, but I cannot expect him to spend all of his time helping little ol' me.
What exists: I would like to hear the opinions of others as I try to lock down this piece of the circuit. I am controlling a variable number of parallel multi-color LEDs with PWM, each color independently controlled with a separate PWM stream. Each LED has its own current-limiting resistor. A PWM period is 32ms long with 1ms resolution, with duty cycles of 0-100%. To be honest, I could probably decrease the period by several factors of 2, but let's start with what I'm currently working with. The voltage supply will most likely be a regulated +12V to simplify determining max LED current (but I'm all ears on other methods to solve that issue).
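Just so we're all picturing the same thing, the per-color PWM I'm describing boils down to something like the C sketch below (set_led_gate() and the 1ms timer hookup are placeholders for whatever drives each color's FET, not actual code from my board, and the three-color count is just an example):

/* One independent 32-step PWM stream per color, ticked at 1 ms. */
#define PWM_STEPS  32        /* 32 ticks x 1 ms = 32 ms period        */
#define NUM_COLORS 3         /* example: three colors per LED          */

/* Placeholder: drives the FET gate for one color (1 = on, 0 = off). */
extern void set_led_gate(unsigned char color, unsigned char on);

static unsigned char duty[NUM_COLORS];  /* 0..PWM_STEPS = 0..100% duty */
static unsigned char tick;              /* position within the period  */

/* Called from a 1 ms timer interrupt. */
void pwm_tick(void)
{
    unsigned char c;

    for (c = 0; c < NUM_COLORS; c++)
        set_led_gate(c, tick < duty[c]); /* hard on/off, no smoothing   */

    if (++tick >= PWM_STEPS)
        tick = 0;                        /* start the next 32 ms period */
}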
Problem: I would like to rid myself of the blinking visible at less than 100% duty cycle, especially at 0-50% duty cycles (I know my low PWM frequency is partly to blame for the visible flickering). Also, I would like better control of the brightness, especially at low brightness settings...since I'm pulsing at full on/off, even at a 3% duty cycle the LED is quite bright (I don't have the benefit of an incandescent bulb's filament integrating effect).
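Putting some numbers on the flicker: a 32 ms period is only about 31 Hz of refresh, and with the LED switching hard on/off that's well below the rate where the eye stops seeing flicker (somewhere north of 50-60 Hz). Halving the period helps quickly...16 ms is ~62 Hz, 8 ms is 125 Hz, 4 ms is 250 Hz...the catch being that if I keep my 1 ms resolution, the number of brightness steps drops from 32 to 16, 8, and 4 along the way, so at some point I'd need a finer timer tick rather than just a shorter period.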
What's been tried: It has been suggested that an LC network with a flyback diode (similar to a switching power supply design - boost/buck regulator, if you will) is the path I should be chasing. So, I've spent considerable time playing with values in an attempt to make a reasonable circuit. This should be as close to surface-mount material as possible (or very small through-hole), so I set a self-imposed limit of <500uH and <20uF (since I have had trouble finding an inexpensive SM cap over 20uF).
Interesting simulation, but the voltage overshoot would blow things sky high. So, I added in a zener after the inductor to clip the voltage...worked reasonably well. Ringing could be removed with a sizeable cap (well over 20uF, though).
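If I run the numbers on that filter (someone check my math), the best corner frequency I can get within my self-imposed limits is f = 1/(2*pi*sqrt(L*C)) = 1/(2*pi*sqrt(500uH * 20uF)) ≈ 1.6 kHz, which sits a couple orders of magnitude above my ~31 Hz PWM fundamental...no wonder the square wave sails through nearly untouched. To actually average out a 31 Hz PWM, the corner would have to be well below 31 Hz, and with a 20uF cap that puts the inductor on the order of 12 H for a 10 Hz corner. Not exactly surface-mount territory.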
Where I'm at now: In doing all of this sim work, I think I lost sight of my goal...controlling the LEDs. After all was said and done, the voltage across the LEDs was still a close approximation of the square wave. If the PWM pulses controlled a FET for each LED color to begin with, slightly smoothing out the square wave train is hardly going to have any effect on the flickering. I'm sure the original circuit's intent was to smooth out the incoming power (and reduce the flicker that way) rather than the PWM pulse itself, but I've got so many conflicting ideas running through my head I can't sort them out any longer.
Someone push me down a path that makes sense again...