
Does this look right? (Answered)

I want to link a few LEDs together and use the mains as the power supply (not batteries). Does this look OK? Is there any risk in linking them up this way? I want to use a 5 V power supply. Thanks!


There's nothing technically wrong with your circuit, but it's going to run a battery down rather quickly. The purpose of the resistor is to limit the current through the LED; without it you pass more current than the LED really needs, which shortens battery life, makes the LED run hot, and, needless to say, shortens the LED's life as well. Attached, I've revised your drawing as examples.

In the top drawing, we're driving one LED at 20 mA. This is quite simple: we just place a 250 Ohm resistor in series with the supply, which gives us 5/250 Amps, or 20 mA. (We can disregard the voltage drop of the LED here since it's minimal.)

The center drawing is yours with values attached. Since we want 20 mA through each LED, we need a 250 Ohm resistor for each. Kirchhoff's current law tells us that the total current going into a circuit equals the total current leaving it, so the supply ends up delivering 200 mA. While that will work, it pulls more power than we need.

The bottom drawing uses a single 120 Ohm resistor supplying the LEDs with about 42 mA, split to about 21 mA between the two rows. Taking some voltage drop through the LEDs into account, it should be supplying about 20 mA through each string and 40 mA total for the circuit, saving power and resistor cost. I hope this helps.
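The arithmetic in the three drawings can be sketched in a few lines. This is a sketch only: the 250 Ohm and 120 Ohm values come from the reply above, and the ten-branch count for the center drawing is an assumption inferred from the 200 mA figure quoted.

```python
# Ohm's law estimates for the three drawings (LED voltage drop ignored,
# as in the reply above).
SUPPLY_V = 5.0

# Top drawing: one LED with one 250-ohm series resistor.
i_top = SUPPLY_V / 250.0                 # 0.02 A = 20 mA

# Center drawing: one 250-ohm resistor per LED branch, branches in parallel.
branches = 10                            # assumption: 10 branches -> the 200 mA quoted
i_center_total = branches * (SUPPLY_V / 250.0)

# Bottom drawing: one shared 120-ohm resistor feeding two parallel rows.
i_bottom_total = SUPPLY_V / 120.0        # ~42 mA
i_per_row = i_bottom_total / 2           # ~21 mA per row

print(i_top * 1000, i_center_total * 1000, round(i_per_row * 1000, 1))
```

Note this ignores the LED forward voltage entirely; the later comments in this thread show why that drop actually matters.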

Picture 2.png

Good comment. The voltage drop across an LED can actually be estimated from its wavelength: from the wavelength (~660 nm for a typical red LED), compute the frequency (f = c/lambda), use Planck's formula to compute the energy of a photon (E = hf), and divide this by the charge of one electron to get the voltage drop. For a red LED this works out to roughly 1.9 V, which matches the 1.8-2.2 V you'll find on a red LED's datasheet. For shorter-wavelength LEDs (green, blue, white), it is higher.
So having 5 LEDs in series on 5 volts will not make light. With red LEDs you can have two at most.
You should work out the resistor as follows: take the number of LEDs in series (say 2) and multiply by the voltage drop: 1.9 V * 2 = 3.8 V. Calculate the difference to the voltage source: 5 V - 3.8 V = 1.2 V. Divide it by the desired current (20 mA): 1.2 V / 20 mA = 60 Ohms.
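The two steps above can be sketched as a pair of small functions. The ~660 nm wavelength is an assumption for a typical red LED; real forward voltages vary, so the datasheet value always wins.

```python
# Estimate an LED's forward voltage from its wavelength, then size the
# series resistor for a string of LEDs. Estimates only; check the datasheet.
H = 6.62607015e-34          # Planck constant, J*s
C = 2.99792458e8            # speed of light, m/s
E_CHARGE = 1.602176634e-19  # elementary charge, C

def photon_voltage(wavelength_m):
    """Voltage corresponding to one photon's energy: V = h*c / (lambda * e)."""
    return H * C / (wavelength_m * E_CHARGE)

def series_resistor(supply_v, n_leds, vf, current_a):
    """R = (Vsupply - n * Vf) / I for n LEDs in one series string."""
    headroom = supply_v - n_leds * vf
    if headroom <= 0:
        raise ValueError("supply voltage too low for this many LEDs in series")
    return headroom / current_a

vf_red = photon_voltage(660e-9)          # ~1.9 V for a ~660 nm red LED
r = series_resistor(5.0, 2, vf_red, 0.020)
print(round(vf_red, 2), round(r, 1))
```

Trying three such LEDs on 5 V makes `series_resistor` raise, which mirrors the point above: the string's combined drop must stay below the supply voltage.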

Now there is one difficult thing: LEDs are highly non-linear, meaning an LED might give no light at all slightly below its nominal forward voltage, yet already overheat and die slightly above it. That is why you should have fewer LEDs in series and a larger resistor. But as already pointed out, this is inefficient, since a lot of energy is simply wasted as heat in the resistor. So this is the empirical part: if you find that your LEDs are short-lived, pick a different circuit.

If you want fine control of the brightness, you are best off using a constant-current circuit. Because of the non-linearity, it is easier to control the brightness by the current than by the voltage. Ideally, the current is directly proportional to the number of photons emitted per unit time.
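One common way to build such a constant-current circuit (my assumption about the part choice, not something from this thread) is an LM317 regulator wired as a current source: the regulator holds about 1.25 V between its OUT and ADJ pins, so a single resistor sets the LED current.

```python
# Sense-resistor value for an LM317 wired as a constant-current source.
# The LM317 maintains ~1.25 V across the resistor, so I = 1.25 / R.
LM317_VREF = 1.25  # volts, typical reference value

def lm317_current_resistor(target_current_a):
    """Resistor between OUT and ADJ that sets the regulated current."""
    return LM317_VREF / target_current_a

print(round(lm317_current_resistor(0.020), 1))  # 62.5 ohms for 20 mA
```

One caveat: the LM317 itself needs a few volts of headroom on top of the 1.25 V reference, so on a 5 V supply this limits how many LEDs you can put in the series string.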

Hope that helps.

Wow! That's useful info. I find it a bit easier to look at the spec sheet of the LED itself, though. I didn't know there was that much voltage drop. About the only time I use LEDs is as indicators.

That helps loads, thanks! But like trialex says, I don't intend to use batteries. Would you recommend I use the bottom diagram? In the bottom diagram, does the -ve leg of one LED attach to the +ve leg of the next? I tried this with 2 LEDs but it failed to work.

Yes, each row of LEDs is connected in series: minus to plus, minus to plus, and so on, while the two rows are connected in parallel: the minus of both strings to ground and the plus of both strings to the resistor. If any LED is connected in reverse, the circuit will not work properly.

To be fair, he/she does say they are using a 5 V supply rather than batteries, but it's always a good idea to use the least power necessary.