
Why does an LED have to be given a resistor before it? (Answered)

If a device draws only its required amount of current from a supply, then why does an LED have to be given a resistor before it? Won't it just take 20 mA from a 3 volt, 5 amp power supply?

I checked many Instructables for it but couldn't find a proper explanation, so I thought of asking it..

And people say that an LED should not be directly connected to any supply as it may blow, get damaged, or burn..

But I have done that many times, and I also made a torch for my uncle using two white LEDs of 3.5 V (not sure, but they are big) and a 9 V battery without any resistor... it lasted till the battery drained but never blew up... and yes, they're in series..

And thanks in advance for answering me.. and sorry for the bad image quality.. I used my phone.


The LED is a semiconductor device - an active component. The current rating is not what it will draw but the maximum it can stand before overheating and melting.

The LED is almost a short circuit when connected so that it conducts - forward biased - so if it can, it will draw as much current as the battery or power supply can provide - a bad thing.

Often LEDs - especially high-powered ones - are operated via a constant-current device to remove all these issues.

To operate the LED you need to limit the current to the recommended level to avoid destroying the device. This way you can vary the supplied voltage a little and still keep the current within bounds.

Take the LED voltage away from the supply voltage; this leaves the voltage you need to drop across the resistor.

You now need to use Ohm's law (Volts = Current x Resistance) to calculate the value of the resistor for a given current - the LED operating current. If an LED needs 3 volts and operates at 10 mA, and the supply is 9 volts, then: 9 - 3 = 6 volts across the resistor. V = I x R can be rearranged to R = V / I, so R = 6 / 0.010 amps (10 milliamps) = 600 ohms.

So a 600 ohm resistor with 6 volts across it will pass 10 milliamps. The current that passes through the resistor also passes through the LED, so all is OK.
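As a quick sketch, the calculation above can be written as a small function (the values are from the example in this answer; the function itself is just my illustration):

```python
# Sketch: series-resistor value for an LED, using R = (Vsupply - Vled) / I.
def led_resistor(v_supply, v_led, i_led):
    """Return the series resistance in ohms for a target LED current."""
    v_drop = v_supply - v_led   # voltage the resistor must drop
    return v_drop / i_led       # Ohm's law: R = V / I

# The 9 V supply / 3 V LED / 10 mA example from above:
print(led_resistor(9.0, 3.0, 0.010))   # -> 600.0
```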

Thank you very much for the detailed reply... I didn't know that LEDs can keep drawing current without limit until they burn... and you also explained why that happens... and good thing you gave the calculation example too.. I had also uploaded an image of a 5 minute torch I made using 2 LEDs, but it didn't show up with the question.

Another question based on the question above: does the resistor have to go BEFORE the diode? Or is the diagram good as long as there is a resistor anywhere in the circuit?

I have a very weird question here: if you intentionally burn out a resistor, can it fail short and damage electronics? Thanks.

Just wondering: if I put three 3.4 V LEDs in series, do this twice, and connect both series groups to a single 3.4 V LED, it seems to me that... (3 x 3.4 V = 10.2 V) the groups connected to a single LED bring the final voltage to 11.9 V = 10.2 V + (3.4 V / 2).

Let's see if I can make an example:

----3.4 ---- 3.4 ---- 3.4 ----|
                              |---- 3.4 -----
----3.4 ---- 3.4 ---- 3.4 ----|

I don't want my power dissipated in a resistor.. seems a waste.

The LED wants 20 mA to be used most effectively. With more current it will start to overheat and may be damaged; with less current it will not light as brightly, or at all.

20 mA is the recommended optimal value; 5 amps is 250 times that much and will incinerate it quickly.

You must also take into account the forward voltage - the recommended voltage of the LED. It probably wants between 2 and 2.5 volts to run most effectively. Any more and it will burn; any less and it won't turn on.

Also, 5 amps is the maximum current the power supply will give. That means if you short it you would get 5 amps, but with the right resistor you could draw any amount of current below 5 amps out of it.

To find a resistor that will run the LED at 20 mA (with the LED dropping 2.5 volts) from the 3 volt, 5 amp supply, we use Ohm's law.
V = I x R (voltage equals the current in amps times the resistance in ohms)
We can rewrite that as R = V / I.
In this case R = (3 - 2.5) / 0.02 (the voltage across the resistor is the total voltage minus the voltage drop of the LED, NOT the total voltage),
so R = 0.5 / 0.02 = 25 ohms. Unfortunately you probably can't get a 25 ohm resistor, so you must use a close standard value like 27 or 33 ohms.
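Picking a nearby standard value can be automated. This sketch snaps a computed resistance up to the next value in the E12 preferred-number series (the E12 list is standard; the choice to round *up*, keeping the current at or below the target, is my own):

```python
# E12 standard resistor series (one decade of base values).
E12 = [10, 12, 15, 18, 22, 27, 33, 39, 47, 56, 68, 82]

def next_standard(r_ohms):
    """Return the smallest E12 resistance >= r_ohms."""
    candidates = [base * 10**decade for decade in range(-1, 6) for base in E12]
    return min(c for c in candidates if c >= r_ohms)

print(next_standard(25))   # -> 27
```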

The post below mostly covered this. If you want further information about Ohm's law or LED resistor calculation, check this site out


We have to add a resistor because a cell never gives out exactly the voltage mentioned on it - you can check this by using a multimeter. The resistor keeps the current through the LED stable and also prevents the LED from dying.

This is an excellently posed question, and Rick has given a great answer! Even though it's already answered, I have Featured it so that, perhaps, other I'bles readers can learn from it. Thank you!

Thank you.... well, here's the 5 minute torch I made... trying to upload the image in the comment instead of the question, as it didn't work then..

Image013 enhanced.jpg

Those higher-powered LEDs probably are designed to be used like that, and they have built-in power-control circuitry inside the clear plastic.

An important point with LEDs is that the relationship between current drawn and voltage drop isn't linear as with a resistor - it behaves almost like a fixed voltage drop which will sink whatever current you try to put through it until it reaches its limit and burns out.

This means that it's much more sensitive to the applied voltage than a resistor, so you either need to drive it from a true current source (which you can make using transistors), or by using a resistor to regulate the current to the level you want.

The way to work out the resistance is as follows:

- Before you start you need to know the voltage you're driving it from, the maximum current the LED can take, and the LED's characteristic voltage drop (between 1.5 V and 4 V depending on the colour). You can look this up in the data sheet if there is one, or test it with a multimeter.

- What you're aiming at is for the circuit to stabilise at a point where the LED is dropping its characteristic voltage, and the remaining voltage drop from the supply is taken up by the resistor.

- So, you subtract the LED voltage from the supply voltage to get the voltage drop across the resistor.

- Now you need to choose a resistance which will drop that voltage at the current you want.

- The right value is given by Ohm's law: V = I * R, which you can rearrange to make R = V / I

- In other words, divide the voltage drop across the resistor by the current you want and you'll get the right answer!

(Incidentally this only applies for normal LEDs which are sold without a built in resistor - LEDs that are sold as 5v, 9v, 12v etc already have a resistor built in and don't need another one.)
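The steps above can be sketched as a quick calculation. I've also added a power check (P = V x I) that isn't in the answer itself, since it matters when choosing the resistor's wattage rating:

```python
# Sketch of the steps above: resistor value plus the power it must dissipate.
def led_resistor_and_power(v_supply, v_led, i_led):
    v_drop = v_supply - v_led   # voltage the resistor takes up
    r = v_drop / i_led          # Ohm's law: R = V / I
    p = v_drop * i_led          # watts burned in the resistor
    return r, p

# Illustrative values: 12 V supply, 2 V LED, 20 mA target.
r, p = led_resistor_and_power(12.0, 2.0, 0.020)
print(r, p)   # -> 500.0 0.2  (a common 1/4 W resistor is near its limit here)
```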

Rick's answer is, indeed, very good. Here are a few other thoughts on how to think of what is happening.

As Rick says, an LED is a semiconductor device. However, when he said it was an "active" device, I think he was thinking of the term "non-linear" device. A non-linear device is an electronic component where the voltage is not proportional to the current. (A resistor IS a linear device; if you double the voltage across it, for example, the current through it doubles.) So, again, the LED is non-linear. As you increase the voltage across it, nothing happens at first. No current flows. You increase the voltage more and more, and still nothing happens. Then suddenly, when you increase the voltage just a little bit more, a big current flows. The voltage at which this happens varies from LED to LED. It is often somewhere in the 3 volt to 4 volt range.

The purpose of the resistor in series with the LED is to limit the current flowing through both of them. There are lots of posts that tell you how to do the mathematics to decide on what resistor to use.

As many people have found, they can sometimes hook up an LED (or a chain of LEDs) to a battery without a resistor in series, and it works fine. This is because the battery voltage is not much more than the "threshold" voltage and because the internal resistance of the battery is enough to limit the current. In other words, there IS a resistor in the circuit, even though you can't see it.

A typical current for small LEDs is about 10 to 20 mA. However, I played with some very bright ones that are rated for 350 mA. I pushed the current to 400 mA before I chickened out. They were too bright to look at.
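That "nothing, nothing, then suddenly a big current" behaviour can be sketched with the ideal-diode (Shockley) equation, I = Is * (exp(V / (n*Vt)) - 1). The constants below are illustrative guesses, not from any real LED's data sheet:

```python
import math

I_S = 1.75e-28  # saturation current in amps - assumed, chosen so I ~ 20 mA at 3.0 V
N   = 2.0       # ideality factor - assumed
V_T = 0.025     # thermal voltage at room temperature, in volts

def diode_current(v):
    """Ideal-diode current at forward voltage v (volts)."""
    return I_S * (math.exp(v / (N * V_T)) - 1)

# Current is negligible below the threshold, then climbs exponentially:
for v in (2.2, 2.6, 3.0):
    print(f"{v} V -> {diode_current(v):.2e} A")
```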

"If a device draws only its required amount of current from a supply."

If you struck an LED with lightning, would it only draw 20 mA, or would it vapourise?
Unless you know of current-regulated LEDs (?), what goes through them is related to how hard you "push" in volts. You need to control the current with LEDs, as they've got little resistance, unlike filament bulbs for example.


There are, indeed, current-regulated LEDs. They have a minimum voltage and a maximum voltage, and will draw their rated current from any supply between those limits. They are really handy when driving a series string of LEDs; just put one constant-current unit in the string, and make sure that the supply voltage is high enough (CC LED minimum + (number of ordinary LEDs * forward drop of an ordinary LED)). This is especially good for automotive applications, where the supply voltage can vary. Sometimes you can get away with operating an ordinary LED without a series resistor. The current will be limited by the internal resistance of the source, and the LED will get hot, increasing its forward drop. Not something to depend on unless you have detailed information about the LED and the battery.
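The minimum-supply rule described here can be sketched as follows (the part values are hypothetical, just for illustration):

```python
# Sketch: minimum supply voltage for a string with one constant-current LED
# plus some ordinary LEDs, per the rule above:
#   Vmin = CC-LED minimum voltage + (count of ordinary LEDs * their forward drop)
def min_supply(cc_led_min_v, n_ordinary, v_forward):
    return cc_led_min_v + n_ordinary * v_forward

# Hypothetical parts: a CC unit needing at least 4 V, plus three 2.1 V LEDs.
print(min_supply(4.0, 3, 2.1))
```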

It's because you can very rarely run them at exactly the right voltage. If you look at an LED calculator and plug in a 3 volt power supply, a 3 volt LED, and 25 mA, it will show a resistor of 1 ohm. That is nothing. It doesn't even show up. A long piece of wire might have more resistance than that. It shows a resistance because the calculator has been programmed to always give one, even if it's not needed.