# Why does an LED have to be given a resistor in front of it?

I checked many Instructables but couldn't find a proper explanation, so I thought of asking here.

People say an LED should not be connected directly to any supply, as it may blow, get damaged, or burn.

But I have done that many times. I even made a torch for my uncle using two white LEDs of 3.5 V (not sure, but they are big) and a 9 V battery without any resistor. It lasted until the battery drained but never blew, and yes, the LEDs were in series.

Thanks in advance for answering, and sorry for the bad image quality; I used my phone.

The LED is almost a short circuit when it is conducting (forward biased), so if it can, it will draw as much current as the battery or power supply can provide, which is a bad thing.

LEDs, especially high-powered ones, are often driven from a constant-current device to remove all these issues.

To operate the LED you need to limit the current to the recommended level to avoid destroying the device. This way you can vary the supplied voltage a little and still keep the current within bounds.

Subtract the LED voltage from the supply voltage; what is left is the voltage you need to drop across the resistor.

You now use Ohm's law, V = I x R, to calculate the value of the resistor for a given current (the LED's operating current). If an LED needs 3 volts and operates at 10 mA, and the supply is 9 volts, then 9 - 3 = 6 volts across the resistor. V = I x R can be rearranged to R = V / I, so R = 6 / 0.010 A (10 milliamps) = 600 ohms.

So a 600 ohm resistor with 6 volts across it passes 10 milliamps. The current that passes through the resistor also passes through the LED, so all is OK.
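For anyone who wants to check the arithmetic, the worked example above can be sketched in a few lines of Python:

```python
# Worked example from above: 9 V supply, 3 V LED, 10 mA target current.
supply_v = 9.0
led_v = 3.0
i_a = 0.010                   # 10 mA expressed in amps

drop_v = supply_v - led_v     # voltage the resistor must absorb: 6 V
r_ohms = drop_v / i_a         # Ohm's law rearranged: R = V / I

print(r_ohms)  # 600.0
```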

Just wondering: if I put three 3.4 V LEDs in series, do this twice, and connect both series groups to a single 3.4 V LED, it seems to me that (3 x 3.4 V = 10.2 V) the groups connected to a single LED bring the final voltage to 11.9 V = 10.2 V + (3.4 V / 2).

Let's see if I can make an example:

----3.4 ---- 3.4 ---- 3.4 ----|
                              |---- 3.4 -----
----3.4 ---- 3.4 ---- 3.4 ----|

I don't want my power dissipated in a resistor; it seems a waste.
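A quick sketch of the series arithmetic in the question above (assuming 3.4 V per LED):

```python
# Forward voltages along one series path simply add. Two identical
# parallel strings need the same voltage as one string, and the shared
# LED adds its full 3.4 V drop (it carries the sum of both branch
# currents; its voltage is not halved).
led_v = 3.4
branch_v = 3 * led_v          # one string of three LEDs
total_v = branch_v + led_v    # plus the single shared LED

print(f"{branch_v:.1f} V per branch, {total_v:.1f} V total")
# 10.2 V per branch, 13.6 V total
```

So the stack needs about 13.6 V, not 11.9 V, and it still needs current limiting on each branch.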

20 mA is the recommended operating value; 5 amps is 250 times that much and will incinerate the LED quickly.

You must also take into account the forward voltage, the recommended voltage of the LED. It probably wants between 2 and 2.5 volts to run most effectively. Any more and it will burn; any less and it won't turn on.

Also, 5 amps is the maximum current the power supply will give. That means if you short it you would get 5 amps, but with the right resistor you can draw any amount below that.

To find a resistor that will turn a 3 volt, 5 amp supply into 2.5 volts at 20 mA for the LED, we use Ohm's law.

V = I x R (voltage equals the current in amps times the resistance in ohms)

We can rewrite that as R = V / I.

In this case R = (3 - 2.5) / 0.02. (The voltage across the resistor is the supply voltage minus the voltage drop of the LED, NOT the total voltage.)

So R = 0.5 / 0.02 = 25 ohms. You probably can't buy a 25 ohm resistor, so you must use the nearest standard value, such as 27 or 33 ohms.
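The calculation, plus the rounding-up step, can be sketched in Python (the E12 list of commonly stocked resistor values is an assumption about what your supplier carries; rounding up keeps the current at or below the target):

```python
# Resistor for a 3 V supply and a 2.5 V LED at 20 mA, rounded up to
# the nearest E12-series standard value.
supply_v = 3.0
led_v = 2.5
i_a = 0.020

r_exact = (supply_v - led_v) / i_a   # 25 ohms

# E12 preferred values in the 10-100 ohm decade (assumed availability).
e12 = [10, 12, 15, 18, 22, 27, 33, 39, 47, 56, 68, 82, 100]
r_standard = min(v for v in e12 if v >= r_exact)

print(r_exact, r_standard)  # 25.0 27
```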

The post below mostly covered this. If you want further information about Ohm's law or LED resistor calculation, check this site out:

http://cuttingedgescienceclub.blogspot.com/2012/01/ohms-law-practice-problems.html

Apoorva

An LED's current rises very steeply once its forward voltage is reached. This means it's much more sensitive to the applied voltage than a resistor, so you either need to drive it from a true current source (which you can make using transistors) or use a resistor to regulate the current to the level you want.

The way to work out the resistance is as follows:

- Before you start you need to know the voltage you're driving it from, the maximum current the LED can take, and the LED's characteristic voltage drop (between 1.5 and 4 V depending on the colour). You can look this up in the data sheet if there is one, or test it with a multimeter.

- What you're aiming for is for the circuit to stabilise at a point where the LED is dropping its characteristic voltage and the remaining voltage from the supply is taken up by the resistor.

- So, you subtract the LED voltage from the supply voltage to get the voltage drop across the resistor.

- Now you need to choose a resistance which will drop that voltage at the current you want.

- The right value is given by Ohm's law: V = I * R, which you can rearrange to make R=V / I

- In other words, divide the voltage drop across the resistor by the current you want and you'll get the right answer!

(Incidentally, this only applies to normal LEDs which are sold without a built-in resistor; LEDs sold as 5 V, 9 V, 12 V etc. already have a resistor built in and don't need another one.)
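The steps above can be collected into a small sketch (the red-LED numbers in the usage line are illustrative, not from the posts above):

```python
def led_resistor(supply_v, led_v, i_a):
    """Series resistor (ohms) per the steps above: subtract the LED's
    characteristic drop from the supply, then apply R = V / I."""
    drop_v = supply_v - led_v
    if drop_v <= 0:
        raise ValueError("supply voltage must exceed the LED's forward voltage")
    return drop_v / i_a

# Illustrative: a red LED with a ~2 V drop at 20 mA from a 5 V supply.
print(led_resistor(5.0, 2.0, 0.020))  # 150.0
```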

A device does not draw only its "required" amount of current from a supply. If you struck an LED with lightning, would it only draw 20 mA, or would it vapourise?

Unless you know of current-regulated LEDs (?), what goes through them is related to how hard you "push" in volts. You need to control the current with LEDs because they have very little resistance, unlike filament bulbs for example.

L