
Replacing a 250V bulb with a 120V bulb

Status
Not open for further replies.

Arvo (Computer)
Jan 1, 2010
Hi,
This is my first post here, so please go easy on any mistakes I might make. :) My question is simple... I have been trying to come up with a way to wire a 120V bulb to a 250V circuit. I'm very new to EE, and while doing some research on the issue I came up with this...

I bought an LED at my local Radio Shack; the only one I could find that would fit and had the highest voltage rating was a 120V 1.5mA type. I'm replacing a bulb and wanted an LED instead.

R = (V1 - V2) / I
R = (250V - 120V) / 1.5mA
R = 130V / 0.0015A
R = 86,666.67 ohms (wow, big resistor!)

Now, I'm more than likely not going about this the right way, but from what I have read this seemed right... So is it possible to wire my 120V 1.5mA LED to a 250V 1/3W circuit?

**Notes
The 250V 1/3W rating is all I have to go by, taken off the bulb harness I'm replacing...
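
For anyone following the arithmetic, here is the same calculation as a small Python sketch; the 250V, 120V and 1.5mA figures are simply the numbers quoted above, not measured values.

```python
# Rough dropping-resistor estimate for running a 120V 1.5mA LED module from a 250V circuit.
# All figures are the ones quoted in the post above.
supply_v = 250.0      # circuit voltage (V)
lamp_v = 120.0        # rated voltage of the LED module (V)
lamp_i = 1.5e-3       # rated current of the LED module (A)

drop_v = supply_v - lamp_v       # voltage the series resistor has to drop
r_series = drop_v / lamp_i       # Ohm's law: R = V / I
p_resistor = drop_v * lamp_i     # power dissipated in that resistor

print(f"Series resistance: {r_series:.0f} ohms (about {r_series/1000:.0f}k)")
print(f"Resistor dissipation: {p_resistor*1000:.0f} mW")
```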

 

87k ain't a big resistor. A 0.25W type is large enough but will run hot; a 0.5W type will not be as highly stressed. Or just buy the right product for the job - what's special about this lamp? As an observation, 1.5mA is quite a low current for an LED.
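
A quick way to see why a 0.25W part runs hot: the 130V drop at 1.5mA is roughly 0.2W. A small sketch, using the same assumed figures as the calculation above:

```python
# Stress check on the dropping resistor: 130V drop at 1.5mA (figures from the thread).
drop_v, led_i = 130.0, 1.5e-3
p_dissipated = drop_v * led_i          # about 0.195 W

for rating in (0.25, 0.5):             # quarter-watt and half-watt resistors
    print(f"{rating} W resistor: running at {p_dissipated / rating:.0%} of its rating")
```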


----------------------------------

If we learn from our mistakes I'm getting a great education!
 
Thanks for the speedy reply! Well, there are really two reasons why I'm going the long way around to rewire this...

1) I can't find anyone around here that sells a 250V 1/3W bulb or LED small enough for this application.
2) I love doing things the hard way, because I learn a great deal more being hands-on with the project!

That said, I was being silly when talking about the resistor size... I was looking at a 270-ohm resistor at the time and didn't even think about the k in 87k, hehe.

OK, so here is what I have to work with unless I go back out to the store... :)

Could I use the 120VAC 1.5mA LED with a 100k-ohm 1/2-watt resistor?

Arvo Bowen III
 
Yes, if the lamp uses a resistor internally. It might use a capacitor to drop the voltage without the heat losses.
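
If the module really is just an internal resistor plus an LED, the combination with an added 100k works out roughly as below; the ~80k internal resistance is only inferred from the 120V 1.5mA rating, and the ~2V forward drop is an assumed typical red-LED figure, not data on the actual part.

```python
# Rough estimate: 120V 1.5mA LED module plus an external 100k 1/2W resistor on 250V.
# The ~80k internal resistance and ~2V LED drop are assumptions, not measured values.
supply_v = 250.0
r_internal = 120.0 / 1.5e-3   # ~80k, implied by the 120V 1.5mA rating
r_external = 100e3            # the proposed 100k-ohm resistor
led_vf = 2.0                  # typical red LED forward drop (assumed)

i = (supply_v - led_vf) / (r_internal + r_external)
p_ext = i**2 * r_external     # dissipation in the added 100k resistor

print(f"LED current: {i*1000:.2f} mA")
print(f"External resistor dissipation: {p_ext*1000:.0f} mW (against a 500 mW rating)")
```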


----------------------------------

If we learn from our mistakes I'm getting a great education!
 
The LED itself looks to have just a single resistor in it. The LED is transparent red, so I can see through it, but I can't make out the color code on the resistor.

But my plan was to ADD a 100k-ohm 1/2-watt resistor to the circuit... just to check, do I need to add a cap to it? I thought caps were just used to store power or regulate it...

I definitely do NOT want a hot LED if I can avoid it.

Arvo Bowen III
 
Your task is fraught with subtle issues that can "fail" your attempt. There are many different ways the manufacturer could have used to get to 120V operation, and many of them are not compatible with 240V operation. Whatever we tell you could still result in a dead LED or an exploded LED.

My suggestion, too, is to just buy the right one.


Keith Cress
kcress -
 
A 0.1 to 0.47uF 250V capacitor would also work in place of the external resistor. Perhaps you have some little RC snubbers of the type that normally mount on coils as arc suppressors. A resistor as low as 47K 1/2W would likely work.
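
For reference, the current a series capacitor passes is set mainly by its reactance, 1/(2*pi*f*C). A quick sketch with the capacitor values mentioned above; 50Hz mains is assumed here because the circuit is 250V, so adjust to 60Hz if that applies.

```python
import math

# Reactance and rough current for a series "dropper" capacitor on a 250V AC circuit.
# 50 Hz mains is an assumption; the LED drop is ignored, so the current is slightly high.
supply_v, freq = 250.0, 50.0

for c in (0.1e-6, 0.47e-6):                   # the capacitor values suggested above
    xc = 1.0 / (2 * math.pi * freq * c)       # capacitive reactance in ohms
    i = supply_v / xc
    print(f"{c*1e6:.2f} uF -> Xc about {xc/1000:.1f} kohm, current about {i*1000:.1f} mA")
```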
 
Since you are trying to replace a 230V 1/3 watt bulb, you could try a neon bulb. An NE-2 neon lamp with two 470K resistors in series should be about right.
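
As a rough check on that, assuming a typical NE-2 maintaining voltage of around 90V (a generic figure, not a datasheet value for any particular lamp):

```python
# Rough current estimate: NE-2 neon with two 470K resistors in series on a 250V circuit.
# The ~90V maintaining voltage is an assumed typical figure for small neon lamps.
supply_v = 250.0
v_maintain = 90.0
r_total = 2 * 470e3                       # two 470K resistors in series

i = (supply_v - v_maintain) / r_total
p_each = i**2 * 470e3                     # dissipation in each 470K resistor

print(f"Neon current about {i*1e3:.2f} mA, roughly {p_each*1000:.0f} mW in each resistor")
```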
 
For safety reasons, do NOT even think of it!
 
"For safety reasons, do NOT even think of it! "

Don't even think of what?.....There have been a half dozen suggestions.

Wisdom comes from knowledge
Knowledge comes from experience
Experience comes from bad decisions
 
Put two 120V lamps in series?



"Theory is when you know all and nothing works. Practice is when all works and nobody knows why. In this case we have put together theory and practice: nothing works... and nobody knows why! (Albert Einstein)
 
I got to thinking about my own suggestion of using a capacitor in series. It may not work! As described, this is just an LED in series with a resistor. We have all read those spec sheets that say 5V maximum reverse voltage on an LED. That has always looked to me like one of those legal warnings, like the one on the recirculating cotton towel dispenser that says not to lift heavy machinery with it.

I first saw the diode-and-resistor-only arrangement used on HV AC more than 20 years ago and thought of it as one of those cost-saving Chinese engineering tricks. I tested it then, and again about a year ago, with line voltage. I didn't get any measurable reverse leakage at 150V.

I knew there just had to be a limit, so I ran the test again, this time using my Sencore LC101. I have come to love this instrument for the leakage tests it can do at up to 1000V. The LED was tested for leakage at 50V, 100V, and 200V. Not a uA was seen. When switched to 300V it failed and never recovered.

If a resistor is added, a diode will also be required in series. Stupid engineer tricks only get you so far!
 
Opera,

You had to have realized the breakdown/gap voltage of the cap would be reached eventually... the concept is still sound; you just need a different style of cap (i.e., more expensive) for the higher voltage.

Dan - Owner
 
Not the cap; it was the LED that failed with voltage. If the LED remained a perfect diode, the cap would charge up to full voltage eventually and the current would drop to zero. That is the reason not to use a capacitor.

If just a resistor is added in series, the LED will break down from the higher reverse voltage and be destroyed. It is obvious that some manufacturers understand that an LED can withstand a normal 120V line voltage. Adding a diode in series with the 68K resistor will prevent reverse-voltage breakdown when the voltage goes from 120V to 250V.
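
The reverse-voltage concern is easier to see in peak terms: a 250V RMS supply peaks at about 354V, far beyond the few volts of reverse rating on a typical LED datasheet. A small sketch; the ~5V reverse rating is the generic spec-sheet figure mentioned earlier in the thread, not data for this particular LED.

```python
import math

# Peak value of the AC supply compared with a typical LED reverse-voltage rating.
# The ~5V reverse rating is the generic spec-sheet figure cited earlier, assumed here.
for v_rms in (120.0, 250.0):
    v_peak = v_rms * math.sqrt(2)
    print(f"{v_rms:.0f} V RMS -> peak about {v_peak:.0f} V (vs a ~5 V LED reverse rating)")
```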
 
I forgot to update the post... :) Thanks for everyone's help, but I simply ended up taking the 120V bulb (with some type of resistor built in) and adding a 100K ohm resistor to the mix. There was already some type of resistor in the circuit that I could not make out, due to discoloration from heat over time I guess, and I decided to leave that one in there too! I let it run for a while to make sure it could hold up without getting hot and/or popping. It all seems like it's going to work out great! Thanks again for all your ideas and help; it just turned out to be a lot simpler than expected. ;)

Arvo Bowen III
 
It is most unusual of us to complicate things... [wink]


----------------------------------

If we learn from our mistakes I'm getting a great education!
 
A normal off-the-shelf LED works fine at about 20mA and produces roughly a 2V voltage drop.
However, since it is a type of diode that does not cope well with a reverse voltage of 250V, a good configuration would be:
an LED with a series resistor of 248V / 20mA, about 12 kohm,
which then has to dissipate about 20mA x 248V, roughly 5 W;
in parallel across the LED you connect a diode (anode of the LED to cathode of the diode and vice versa), so that the reverse voltage across the LED will not exceed about 0.7V.
0.7V x 250V / 12 kohm gives the power rating needed for the diode.
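
Working those numbers through as a sketch; the 2V forward drop and 20mA are the figures from the post above, and the result shows why the series resistor ends up being a multi-watt part.

```python
# Series resistor and dissipation for a 20 mA LED with an anti-parallel protection diode
# on a 250V AC circuit, using the figures from the post above (2V LED drop assumed).
supply_v, led_i, led_vf = 250.0, 20e-3, 2.0

r_series = (supply_v - led_vf) / led_i      # about 12.4 kohm
p_resistor = (supply_v - led_vf) * led_i    # about 5 W: current flows on both half-cycles
p_diode = 0.7 * led_i                       # about 14 mW in the anti-parallel diode

print(f"R about {r_series/1000:.1f} kohm, resistor about {p_resistor:.1f} W, "
      f"protection diode about {p_diode*1000:.0f} mW")
```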
 
Isn't anyone impressed that an LED makes a pretty good diode in spite of what the spec sheet says?
 