

LED Series Resistor (Answered)

I just finished soldering a board that flashes an IR LED using a 555 timer. When I attached it to a 9 V PP3 battery, the LED burnt out.

The LED was rated at 1.7 V, 100 mA, and I was using a 100 Ω series resistor. By my reckoning I should only have needed a 73 Ω resistor: (9 V - 1.7 V) / 0.1 A = 73 Ω. So 100 Ω should have been more than enough.
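For reference, here is the calculation I used, as a quick Python sketch (the helper name is just mine, for illustration):

    def series_resistor(v_supply, v_led, i_led):
        """Minimum series resistance in ohms, from Ohm's law."""
        return (v_supply - v_led) / i_led

    # My numbers: 9 V supply, LED supposedly 1.7 V @ 100 mA
    print(series_resistor(9.0, 1.7, 0.100))  # -> 73.0 ohms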

Any ideas?

Discussions

Patrik

10 years ago

Yeah - you want to look at the line that says 1.2 V @ 20 mA, not 1.7 V @ 100 mA. That would give you a resistor of around 470 Ω.
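The arithmetic behind that, as a quick sketch (470 Ω is just the next common value up from the exact result):

    # 9 V supply, 1.2 V forward drop, 20 mA target current
    r = (9.0 - 1.2) / 0.020   # = 390.0 ohms, the exact minimum
    # A 470 ohm part adds margin: (9.0 - 1.2) / 470 is about
    # 16.6 mA, comfortably under the 20 mA rating.
    print(r)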

This is correct; just the other day I used a 470 Ω resistor with a 9 V battery and a red LED.

Great, I'll have a better look at the catalogue next time.

evilad

10 years ago

So if I take the supply voltage as 9.6 V and the LED as rated for more like 50 mA, a resistor of 158 Ω or more should be fine. I think I'll put in a voltage regulator as well, just to make sure. Thanks for all your help.
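As a sanity check, the numbers (a sketch assuming a fresh battery at 9.6 V):

    # 9.6 V fresh-battery voltage, 1.7 V forward drop, 50 mA rating
    r_min = (9.6 - 1.7) / 0.050   # = 158.0 ohms minimum
    # Anything at or above this keeps the current at or below 50 mA;
    # e.g. a standard 180 ohm part gives (9.6 - 1.7) / 180, about 44 mA.
    print(r_min)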

Patrik

10 years ago

Are you absolutely sure the LED was rated for 100mA?

I did a little browsing around the various online LED suppliers, and I saw a whole bunch of 50 mA and 10 mA IR LEDs, but no 100 mA ones...

As Gmoon says, running at the max rating is not good, for all the reasons he stated, and also because a fresh battery's voltage can be a bit higher than what's stated on the case: in this instance, possibly 9.6 V for some.

The link indicated that, yes. Looking at the PDF, it also lists 100 mA as the absolute max.

However, the PDF also lists the characteristics (typical operating conditions), and 4 out of 5 are cited at 20 mA, which would be typical for a single LED. So the device was extensively tested, but at 20 mA. Only "LUMINOUS INTENSITY" is specified at 50 mA, half the absolute max.

I would take the max ratings with a bit of skepticism. This is perhaps an example of a manufacturer overstating the part's capabilities.
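Putting that skepticism into numbers, a sketch assuming you design for the 20 mA typical operating point rather than the absolute max (battery and forward-voltage figures taken from the thread above):

    # Design to the 20 mA typical point, not the 100 mA absolute max.
    # Assume up to 9.6 V from a fresh battery and a 1.7 V forward drop.
    r = (9.6 - 1.7) / 0.020   # = 395.0 ohms
    # Round up to a standard 470 ohm part for margin (about 16.8 mA).
    print(r)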