If you undervolt an LED (by like 0.05 volts) you don't need a resistor, right?
It will not emit light ...
...but you are correct. If you run it at the LED's drop voltage or less, you do not need a resistor. It will be dimmer the lower the voltage you use. At as low as half a volt, it probably wouldn't light at all, as Choose pointed out.
Sorry for the somewhat noob question. Is an LED's voltage drop a fixed figure, or a min-max range?
I'm aware that there is a minimum voltage before an LED starts emitting and a maximum voltage it can safely take.
So the 2V (based on Choose's diagram) can be looked at as:
1) minimum voltage before it starts emitting?
2) or the recommended optimal voltage?
3) or the maximum voltage it can safely take?
The voltage drop of an LED is the voltage across the LED at its optimal current. This is specified in the datasheet of the LED being used.
Choose's diagram above assumes a 2V drop at 20mA. This is not typical, but it makes the math easier to explain.
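To make that math concrete, here's a minimal sketch of the usual series-resistor formula R = (Vsupply - Vforward) / I. The 2V drop and 20mA come from Choose's example; the 5V supply is just an assumed value for illustration:

```python
# Series resistor for an LED: R = (Vsupply - Vforward) / I
# 2V @ 20mA is from Choose's example; the 5V supply is an assumption.
def led_resistor(v_supply, v_forward, current):
    """Return the series resistance in ohms."""
    return (v_supply - v_forward) / current

print(led_resistor(5.0, 2.0, 0.020))  # → 150.0 ohms
```

Note that when Vsupply equals the forward voltage the formula gives 0 ohms, which is the "no resistor needed" case discussed above.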
Yeah, I assumed 2V, 20 mA and a "red" color for a 3mm or 5mm LED as an example =o)
I took these data from my memory... And if my memory is still good, the voltage for such an LED may vary from 1.8V to 2.1V @ 20 mA. The voltage is more or less than 2V depending on the color of the LED. Usually (still according to my memory) the voltage of an LED will "follow" the color spectrum:
IR: 1.6V
red: 1.8V to 2.1V
orange: 2.2V
yellow: 2.4V
green: 2.6V
blue: about 3V
white: about 3V
UV: I don't know, but it should be greater than 3V
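Those from-memory figures could be put into a small lookup to pick a resistor per color. This is just a sketch restating the list above; the 5V supply and 20mA current are assumed example values, and red uses 2.0V as the midpoint of the 1.8-2.1V range:

```python
# Typical forward voltages by color, per the from-memory list above.
FORWARD_V = {
    "IR": 1.6,
    "red": 2.0,      # listed as 1.8V to 2.1V; midpoint used here
    "orange": 2.2,
    "yellow": 2.4,
    "green": 2.6,
    "blue": 3.0,
    "white": 3.0,
}

def resistor_for(color, v_supply=5.0, current=0.020):
    """Series resistor in ohms: R = (Vsupply - Vf) / I."""
    return round((v_supply - FORWARD_V[color]) / current)

print(resistor_for("green"))  # (5.0 - 2.6) / 0.020 → 120 ohms
```

UV is left out of the table since the poster only says it should be greater than 3V.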