
# Do I need resistors for LEDs? (Answered)

I'm building a circuit with 12 V LEDs and wondered if I need resistors in the circuit.


## Discussions

You should at least have a 1 Ω resistor. It's the current that matters, not the voltage: an LED drops a roughly fixed forward voltage, but increase or decrease the current and it gets brighter or dimmer.
http://led.linear1.org/led.wiz
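The wizard linked above just applies Ohm's law to the voltage left over after the LED's forward drop. A minimal sketch of the same calculation (the example Vf and current values are assumptions; check your LED's datasheet):

```python
# Series resistor for a single LED, per Ohm's law:
#   R = (Vsupply - Vforward) / Icurrent

def led_resistor(supply_v, forward_v, current_a):
    """Return the series resistance in ohms for one LED."""
    if forward_v >= supply_v:
        raise ValueError("supply voltage must exceed the LED forward voltage")
    return (supply_v - forward_v) / current_a

# A typical red LED (Vf ~ 2.0 V, 20 mA) on a 9 V battery:
r = led_resistor(9.0, 2.0, 0.020)
print(round(r))  # 350
```

In practice you round up to the next standard resistor value (e.g. 390 Ω), which just runs the LED slightly dimmer.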

It depends on the power supply. If it's 3 V and the LED is rated for that, then no; but if the power supply's voltage is higher, then yes.

It honestly depends on the application and, from there, possibly your preference. If it's something that needs to be reliable and moderately long-lived, then yes. If it's something you're doing just for fun and you don't care how long the LEDs last, then no, as long as the current is small enough, like from a drive pin on a uC or a USB port (though a USB port supplies up to 500 mA, and most LEDs operate on 20 mA or so).

A USB port can blast an LED to pieces (learned that the hard way).

Yeah, but if you happen to be using a higher-power LED (such as a Luxeon), then it won't. 1 W Luxeons draw about 350 mA; 3 W Luxeons draw about 1 A.

If you really want to blow up an LED, though, the best way is to hook it up to some AC power... 24 V @ 60 Hz plus 15 LEDs in parallel = an interesting explosion. (And note: it was deliberate.)

All kinds of LEDs need current limiting; maybe in the Luxeon it's built in. As a rule, for high-power LEDs PWM (e.g. from a 555) is better than resistors.

It's not built in (and yes, PWM is better). However, the USB port cannot provide more than 500 mA in most computers. With a 1 W Lux (note: green/blue/yellow = ~465 mA; red = ~350 mA), that amount of current is not enough to cause serious harm instantaneously. It will shorten the life of the LED, but it won't cause it to blow up.

The 3 W Luxes will optimally want to draw twice as much current as the USB port can provide. Since the USB port can only supply 500 mA, the LED will simply run dimmer.

LEDs are current-dependent devices. So as a general rule, it is best to use some sort of current limiting device. However, if the source can only provide up to a certain amount of current...the source is the current limiting factor.

Also, PWM is not always available, and it is sometimes more of a pain in the arse to implement in one-offs or proof-of-concept prototypes (especially if it will cost more in gas than in parts to get the components). Provided the resistor's power rating is great enough to dissipate the power generated by the current in use (P = V·I, or P = I²·R), a resistor is acceptable. (See The Custom Saber Shop for more details.)
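The power-rating check mentioned above is worth doing before you pick a part. A quick sketch using P = I²·R (the 350 mA / 10 Ω ballast figures below are illustrative assumptions, not from the thread):

```python
# Power dissipated in a series resistor: P = I^2 * R.
# The resistor's rated wattage must comfortably exceed this.

def resistor_power(current_a, resistance_ohm):
    """Return the power in watts dissipated by the resistor."""
    return current_a ** 2 * resistance_ohm

# 350 mA (a 1 W Luxeon's current) through a 10-ohm ballast resistor:
p = resistor_power(0.350, 10.0)
print(f"{p:.3f} W")  # 1.225 W
```

So a common 1/4 W resistor would cook; you'd want a 2 W (or larger) part, or several resistors sharing the load.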

USB is meant to supply 500 mA (actually 100 mA, with the option to negotiate up to 500 mA). In reality, each pair of adjacent USB ports is connected through a single 1 A fuse, so with the other port unused you can get 1 A, and possibly more for a short time. If you pull too much current you blow the fuse and have to replace it. In some computers there is something that looks more like a thermistor than a fuse; if it really is one, you can draw well over 1 A for a while. And in some there is no protection at all, just a 0 Ω SMT resistor that can probably stand well above 1 A.

No, you don't need a resistor specifically; you could use a current-limiting circuit instead, but a resistor is usually best unless you are using high-power LEDs.

It would be better for the LED to use a resistor, but it's fairly OK to skip it when you put them in series and/or the voltage isn't higher than 2.5 V.

I have connected LEDs in series for multiple voltages without resistors and gotten away with it, but you should use resistors --/\/\/--. The only exceptions are PWM, and 12 V or 5 V LEDs with an internal resistor installed at production.

Yes, the LEDs are 12 V, but I am going to use 9 V batteries for power.

If you have a 12 V battery and 12 V LEDs, then no. Also, if you are hooking up more than one LED, remember to hook them up in parallel (all LEDs' anode leads connected to the positive terminal of the battery and cathodes to negative, versus the cathode lead of the first LED to the anode lead of the second, and so on).

Forgot to add that if you hook them up in series (not parallel) then they won't get enough voltage (in series, 2 LEDs hooked up to a 12 V battery will get only 6 V each; in parallel, they will both get 12 V but will drain the battery twice as fast). Correct me if I'm wrong.
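For bare LEDs the series arithmetic actually works per-string: the forward voltages add up, and whatever supply voltage is left over is dropped across one series resistor. A hedged sketch with assumed example values (2 V forward drop, 20 mA target):

```python
# Series string: the LEDs' forward voltages add; the leftover supply
# voltage ("headroom") sets the series resistor via Ohm's law.

def series_string_resistor(supply_v, forward_v, n_leds, current_a):
    """Resistor (ohms) for n identical LEDs in one series string."""
    headroom = supply_v - n_leds * forward_v
    if headroom <= 0:
        raise ValueError("not enough supply voltage for this many LEDs in series")
    return headroom / current_a

# Two 2 V LEDs on a 12 V supply at 20 mA:
print(series_string_resistor(12.0, 2.0, 2, 0.020))  # 400.0
```

Five such LEDs would still fit on 12 V (2 V of headroom left); six would not, which is the real series limit rather than an even voltage split.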

That's correct for factory-made LED modules assembled with an internal resistor and designed for 12 V. There are also bare LEDs with a built-in resistor, but they are very rare. For plain LEDs you need a resistor.

It is a good idea to always use one, but it all depends on the intensity of the LED and how much voltage the battery has. Most common LEDs drop about 2 volts.

Yeah, you need a resistor... whether connecting in parallel or series, you need a resistor...

Yup. Even when current is low I would, just in case... and it is good practice. Basically, if (when) the voltage wobbles up, the LED draws more current. The problem is that LEDs are semiconductors: a small increase in voltage makes them draw a lot more current, dissipating the extra power as heat. As they get warmer they conduct even more readily and draw still more current, getting hotter and eventually burning out. Resistors prevent this runaway because they have a linear relationship between voltage and current: a spike in voltage causes only a proportional increase in current, keeping the LED relatively stable.

Similarly, LEDs should not share a resistor, because the current through one LED in the group will run higher, causing it to get hotter and burn out. There are a few calculators out there for LED current-limiting resistors, and failing that, the math is posted in quite a few places too.
Hope this helps,
Drew
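Drew's one-resistor-per-LED rule for parallel wiring can be sketched as a per-branch calculation; the supply then has to source the sum of the branch currents. Example values below (12 V supply, 2 V LEDs at 20 mA) are assumptions for illustration:

```python
# Parallel LEDs: each branch gets its OWN resistor sized by Ohm's law,
# and the supply current is the sum of all branch currents.

def parallel_branches(supply_v, forward_v, current_a, n_leds):
    """Return (ohms per branch resistor, total supply current in amps)."""
    r_each = (supply_v - forward_v) / current_a
    total_i = current_a * n_leds
    return r_each, total_i

# Four 2 V LEDs on 12 V at 20 mA each:
r, i_total = parallel_branches(12.0, 2.0, 0.020, 4)
print(r, i_total)  # 500 ohms per branch, 80 mA total draw
```

This is also why parallel strings drain a battery faster, as noted earlier in the thread: the branch currents add.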

You usually have to, but if the current is low enough you don't really need them. I don't, anyway.