
Determining LED Voltage

So I've just salvaged a bunch of LEDs (some superbrights, some not) from various places (motherboards, computer cases, other electronics) and have no idea what voltage any of them are rated for. I can't go back and test the sources to see how much voltage the LEDs were being supplied (most of the sources are at the landfill by now), so I need a cheap, effective way of determining a rough voltage. I do realize the standard for many LEDs is 1.8, 2.2, 2.4, or 3.2 V, but that's still too much variance for me to risk burning out all these salvaged LEDs. Thanks, Daniel.


Thanks for the reply! If I use a 3 V power supply (two AAs) and my LED doesn't light up (disregarding the possibility that I have the polarity backwards), is it safe to assume the LED requires a higher voltage, like 3.2 V? Also, if I apply more than 4 V to an "average" LED in reverse polarity, will I burn it out? (Some of the LEDs I clipped have leads of the same length and no other markings that I can tell.)

Most LEDs have a flat edge, where the lip at the base looks shaved off. That flat is normally the cathode (-) side of the LED. As long as you limit the current, briefly reverse biasing the LED shouldn't hurt it.



LEDs are current-driven devices, so their exact forward voltage isn't that important. Pick a supply/resistor combination that yields about 20mA through a 2V "average" LED: you won't go too far over on anything, and everything should light given a >4V supply.
(So: (5V-2V)/20mA = 150 ohm, (9V-2V)/20mA = 350 ohm, (12V-2V)/20mA = 500 ohm.)
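For anyone who wants to plug in their own supply voltages, here's a quick sketch of that same Ohm's-law calculation. The 2 V forward-voltage and 20 mA figures are just the "average LED" assumptions from above; swap in your own values as needed.

```python
def series_resistor(supply_v, led_vf=2.0, led_ma=20.0):
    """Series resistor (in ohms) that drops the excess supply voltage
    at the target LED current: R = (Vsupply - Vf) / I."""
    return (supply_v - led_vf) / (led_ma / 1000.0)

# The three examples from the answer above:
for v in (5, 9, 12):
    print(f"{v} V supply -> {series_resistor(v):.0f} ohm")
```

In practice you'd round up to the nearest standard resistor value (e.g. 150, 360, 510 ohm in the E24 series), since a slightly lower current is always safe.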