Most LEDs are pretty bright with about 20 milliamps (that's .020 amps) flowing through them (check your data sheet for your specific LED). So... 12v - 3.3v = 8.7v, and 8.7v / .020 = 435 ohms. Now, you can light an LED with less current; you just have to experiment. It depends on the LED you are using. The lower the resistance, the higher the current will be. Going with a standard 1k ohm resistor will get you about 8.7 milliamps. The LED should work just fine with that. Good luck.
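A minimal sketch of that arithmetic in Python, using the 12v supply, 3.3v LED drop, and 20 mA target from this reply (the variable names are just for illustration):

```python
# Ohm's law resistor calculation for an LED (values from the reply above).
supply_v = 12.0     # supply voltage
led_drop_v = 3.3    # LED forward voltage drop (check your data sheet)
target_a = 0.020    # desired LED current: 20 mA

# Resistor value that limits the current to the target.
resistance = (supply_v - led_drop_v) / target_a
print(f"{resistance:.0f} ohms")   # 435 ohms

# Current you'd actually get with a standard 1k ohm resistor instead.
current_1k = (supply_v - led_drop_v) / 1000.0
print(f"{current_1k * 1000:.1f} mA")  # 8.7 mA
```

Swap in your own LED's forward voltage and current rating and the same two lines give you the answer for any supply.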
@fthies can you please tell me what formula you have used... 8.7/.020??
(Supply voltage - LED voltage drop) divided by desired current in amps (20 milliamperes = .020 amps) = resistance needed to limit the current through the LED. So for a 12-volt system and a 2-volt-drop LED the formula would be (12 - 2)/.020 = 10v / .020 = 500 ohms.
The power rating for the resistor is calculated with this formula: the voltage across the resistor (10v) multiplied by the current through it (.020a), which is 10v * .020a = .2 watts. A 1/4 watt resistor would get pretty hot at that, so I would err on the high side and go with a 1/2 watt resistor.
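Both formulas from this reply in one short Python sketch (12v supply, 2v-drop LED, 20 mA, as above; the 1/2-watt suggestion is the same rule of thumb):

```python
# Resistance and power-rating check for the 12v supply, 2v-drop LED example.
supply_v = 12.0
led_drop_v = 2.0
target_a = 0.020   # 20 mA

v_across = supply_v - led_drop_v    # 10v dropped across the resistor
resistance = v_across / target_a    # current-limiting resistor value
power_w = v_across * target_a       # power the resistor must dissipate

print(round(resistance), "ohms,", round(power_w, 2), "watts")  # 500 ohms, 0.2 watts
# 0.2w is 80% of a 1/4 watt resistor's rating, so err high: use a 1/2 watt part.
```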
led.linear1.org/1led.wiz will calculate for you and explain what you need to do.
Use a 1 K ohm. That's: brown, black, red. I like powers of 10. I like the brown right next to the black. I think it's a very attractive arrangement of colors. I think you'll be happy with it too.
Go here > led.linear1.org/1led.wiz You already have the source voltage (12v) and the diode voltage drop (3.3v). Just find out the current in mA and you'll get your resistor value!
What power / mA is the LED good for? The battery will "fry" that very quickly indeed, the current needs limiting, but to what?