
Why waste the resistors?

So I see a lot of projects with like 60 LEDs of the same type or something like that, and they use a separate resistor for each LED. Why do they do this? Why not just get one resistor and hook it up to ground or VCC? Please explain, thanks!


LEDs have what is called a negative temperature coefficient of forward voltage. As an LED heats up, its forward voltage drops, so at a fixed supply voltage it passes more current. In a parallel chain, one will get hotter and steal current from the others, which get cooler and give up even more current to the hotter LED, which gets hotter still...
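A quick sketch of why that runaway happens, using the Shockley diode equation. The saturation current and ideality factor below are illustrative assumptions (picked so about 20 mA flows at 3.0 V), not measurements from any real LED; the point is only that a small shift in effective forward voltage produces a big change in current.

```python
import math

I_S = 1.25e-27  # saturation current in amps (assumed, for ~20 mA at 3.0 V)
N = 2.0         # ideality factor (assumed)
V_T = 0.02585   # thermal voltage at ~300 K, in volts

def diode_current(v_forward):
    """Shockley equation: current (amps) at a given effective forward voltage."""
    return I_S * (math.exp(v_forward / (N * V_T)) - 1)

# Heating lowers the voltage an LED needs for a given current, which at a
# fixed applied voltage is equivalent to a small rise in effective forward
# voltage. Even a ~50 mV shift more than doubles the current:
i_cool = diode_current(3.00)
i_hot = diode_current(3.05)
print(i_hot / i_cool)  # well over twice the current through the hot LED
```

That extra current makes the hot LED hotter, which shifts it further, which is the runaway the posts above describe.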

Ah, steve, that makes sense. Thanks bwrusell and skinnerz!

I bought a bunch of cheap multi-LED camping lights once.

I actually observed the process happening, because they hadn't used resistors. First one LED went brighter and whiter than the rest... then it went orange as it burned out! Then another little LED started doing the same, and in less than two minutes every LED had burned out.

Wow, that's terrible!
So what about a series circuit? How should you arrange the LEDs?

One resistor per LED is the only way if your battery voltage only covers the forward voltage of a single LED.

For LEDs in series, a single resistor will work, as the current flowing through each LED is the same. However for parallel LEDs, a single resistor would only limit the total current through the entire set of LEDs, so individual LEDs can still draw too much current and burn out.
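The series-resistor calculation described above is just Ohm's law on whatever voltage is left over after the LEDs. A minimal sketch, where the 12 V supply, 3.2 V forward voltage, and 20 mA target are example values rather than numbers from the thread:

```python
def series_resistor(v_supply, v_forward, n_leds, i_target):
    """Resistor (ohms) that sets i_target through n_leds LEDs in series."""
    v_drop = v_supply - n_leds * v_forward
    if v_drop <= 0:
        raise ValueError("supply too low for that many LEDs in series")
    return v_drop / i_target

# 12 V supply, three 3.2 V white LEDs at 20 mA:
r = series_resistor(12.0, 3.2, 3, 0.020)
print(round(r))  # 120 ohms
```

Because series LEDs all carry the same current, this one resistor protects every LED in the string; in a parallel arrangement it could not.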

In some cases, where there are large numbers of LEDs or high-power LEDs are used, a single resistor would have to dissipate a lot of heat, so it is often more practical to spread the dissipation across several smaller resistors.
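To put a rough number on that heat argument, resistor dissipation is P = V_drop x I. A sketch with assumed example values (2.4 V dropped, 20 mA per LED, 20 LEDs):

```python
def resistor_power(v_drop, i):
    """Power (watts) dissipated by a resistor dropping v_drop at current i."""
    return v_drop * i

# 20 parallel LEDs at 20 mA each fed through ONE resistor dropping 2.4 V:
p_single = resistor_power(2.4, 20 * 0.020)  # carries the full 400 mA
# The same load split into 20 strings with one resistor each:
p_each = resistor_power(2.4, 0.020)
print(p_single, p_each)  # ~0.96 W total vs ~0.05 W per resistor
```

The single resistor would need to be a power part (or get hot), while each of the twenty small resistors stays comfortably inside an ordinary 1/4 W rating.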

The best balance between few components and circuit stability is probably a series/parallel array: take a string of LEDs in series (the number determined by your input voltage) with one resistor, then put multiple copies of that string in parallel, if that makes sense. THIS wizard is a good page to bookmark, as it makes determining the number of LEDs and the size of the resistor in each series string pretty easy.
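A rough sketch of the arithmetic such a wizard does: how many LEDs fit in each series string, how many strings you need, and the resistor per string. The function name, the 2 V resistor headroom, and the example numbers are all assumptions for illustration, not taken from the wizard itself.

```python
def led_array(v_supply, v_forward, i_led, total_leds, headroom=2.0):
    """Return (leds_per_string, n_strings, ohms_per_string).

    headroom is the voltage reserved for the resistor (assumed 2 V).
    """
    per_string = max(1, int((v_supply - headroom) // v_forward))
    per_string = min(per_string, total_leds)
    n_strings = -(-total_leds // per_string)  # ceiling division
    v_drop = v_supply - per_string * v_forward
    return per_string, n_strings, v_drop / i_led

# 60 LEDs (3.2 V forward voltage, 20 mA) from a 12 V supply:
print(led_array(12.0, 3.2, 0.020, 60))  # 3 per string, 20 strings, ~120 ohms
```

Twenty resistors instead of sixty, and each string's current is still individually limited, which is the balance the post above is describing.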