
do resistors use up power?

i.e. is a circuit with resistors much less efficient than having the correct voltage/current from the source? i'm thinking here mainly about dimmer switches and also parallel LEDs where each would need its own resistor. Cheers.

AndyGadget 7 years ago
A resistor will generate heat whenever a current flows through it, so the more voltage you're dropping across it, the more power it will waste as heat. The formula is P = I² × R (the current squared times the resistance) or V × I (the voltage across it times the current through it).
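To put numbers on that, here's a quick Python sketch with made-up values (the 5 V supply, 2 V LED and 20 mA below are assumptions for illustration, not from the thread) showing the two formulas give the same answer:

# Power dissipated in a series resistor, two equivalent ways.
# Assumed example: a 5 V supply driving a 2 V LED at 20 mA.
I = 0.020           # current through the resistor, in amps
V_drop = 5.0 - 2.0  # voltage dropped across the resistor, in volts
R = V_drop / I      # Ohm's law: resistor value needed (150 ohms)

p_from_current = I**2 * R    # P = I^2 * R
p_from_voltage = V_drop * I  # P = V * I

print(R, p_from_current, p_from_voltage)  # 150.0 0.06 0.06 -> 60 mW wasted as heat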
jamesjamesjames (author) 7 years ago
cheers guys :) Andy, love the tictactunes!
thinkdunson 7 years ago
also remember that, as well as parallel, you could put some LEDs in series. depending on your source, put enough LEDs in series to drop almost that much voltage, then put groups of those series LEDs in parallel. you'd only need one resistor (for current management, as frollard says, for the safety of the LEDs) in series with each group. less resistance, less waste. here's an illustration i whipped up in case i suck at explaining.
[Attached image: LED series-parallel.jpg]
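A rough Python sketch of the sums behind that series-parallel layout (the 12 V supply, 3.2 V forward drop and 20 mA per LED are assumed example numbers, not from the thread):

# Size one series string of LEDs plus its shared resistor.
V_supply = 12.0   # supply voltage (assumed)
V_led = 3.2       # forward voltage of each LED (assumed)
I_led = 0.020     # desired current per string, 20 mA (assumed)

leds_per_string = int(V_supply // V_led)          # 3 LEDs fit in series
V_resistor = V_supply - leds_per_string * V_led   # ~2.4 V left across the resistor
R_per_string = V_resistor / I_led                 # ~120 ohms
P_per_string = V_resistor * I_led                 # ~0.048 W wasted per string

print(leds_per_string, round(R_per_string), round(P_per_string, 3))

Compare that ~48 mW per three-LED string with giving each LED its own resistor from 12 V, where each resistor would drop 8.8 V and waste about 176 mW: the series grouping is where the "less resistance, less waste" saving comes from.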
frollard 7 years ago
ideally you want the supply voltage very close to the LED voltage - so you only need a very small resistor. Even when they're matched, you still need a resistor (say 1 ohm) so that small fluctuations in the manufacture of the LEDs don't stop each one drawing the same current. as Andy says - the heat dissipated goes with the square of the current - so you want the right current, or a small fluctuation will cook them.
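A small Python illustration of why the current gets so sensitive when the supply and LED voltages are closely matched; the 3.3 V supply, 3.0-3.2 V forward-voltage spread and 10 ohm resistor are assumed example numbers, not from the thread:

# Current in one parallel branch: whatever voltage the LED doesn't take,
# the series resistor has to turn into current.
def led_current(v_supply, v_forward, r_series):
    return (v_supply - v_forward) / r_series

for vf in (3.0, 3.1, 3.2):               # manufacturing spread in forward voltage
    i = led_current(3.3, vf, 10.0)        # 10 ohm series resistor per branch
    print(f"Vf={vf} V -> {i*1000:.0f} mA")
# Vf=3.0 V -> 30 mA, Vf=3.1 V -> 20 mA, Vf=3.2 V -> 10 mA

With only a tenth of a volt or two of variation the branch current changes by 50% or more, which is exactly why you want the supply, resistor and target current chosen together rather than trusting a near-perfect voltage match.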
Jayefuu 7 years ago
Andy is right. With respect to dimming with LEDs, you should look into PWM (pulse width modulation). Googling it will give you lots of guides on it.
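For a feel of what PWM dimming does, here's a minimal Python sketch (the 20 mA on-current is an assumed example): the LED is switched hard on and off fast enough that your eye averages it, so brightness tracks the duty cycle while the series resistor still sets a safe on-state current.

# Average LED current under PWM at different duty cycles.
I_on = 0.020  # 20 mA whenever the LED is actually on (assumed)
for duty in (1.0, 0.5, 0.25, 0.1):
    avg_current = I_on * duty
    print(f"duty {duty*100:5.1f}% -> average current {avg_current*1000:4.1f} mA")

Because the LED is either fully on or fully off, almost no extra power is burned in a resistor to get the dimming, which is why PWM is usually preferred over dropping more voltage across a bigger resistor.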