
How to use LEDs for the first time? Answered

I'm making a display case as a gift and I wanted to add a couple of LED lights with a simple switch to turn on/off.  Doesn't need to be super bright or too fancy.  The case will be about 12" long x 5" deep x 5-6" high overall.  Can anyone direct me to a good site to learn the basics of working with LEDs?  Like how to wire them up and put in the switch?  Thank you!

Comments

seandogue

6 years ago

The only rules I can think of worth remembering are these.

Given a voltage source with output Vs, and an LED with a forward voltage of Vf and a maximum operating current of If (and presuming one wants to operate the LED at its maximum), one calculates the voltage-dropping resistor as follows:

R = (Vs - Vf)/If

If you want to keep the forward current below the maximum (which is usually a good idea, for two reasons), make the resistor larger than the calculated value.

Reason 1) Durability. Operating electronic components at their rated maximums is a good way to wear them out relatively quickly.

Reason 2) Pushing to the "edge" is a good way to make a mistake and end up exceeding the maximum rating, which results in immediate failure of the component.
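To make that concrete, here is a minimal Python sketch of the rule above. The helper name and the example numbers (5V supply, a typical red LED at 2V / 20mA) are my own assumptions, not from any particular datasheet; plug in your own values.

# Sketch of the rule above: R = (Vs - Vf) / If.
# All values are assumed examples -- substitute your own supply/LED specs.

def dropping_resistor(vs, vf, i_f):
    """Return the series resistor (ohms) for supply voltage vs (V),
    LED forward voltage vf (V), and forward current i_f (A)."""
    return (vs - vf) / i_f

vs = 5.0     # supply voltage (assumed)
vf = 2.0     # forward voltage, typical red LED (assumed)
i_f = 0.020  # 20 mA maximum forward current (assumed)

r_min = dropping_resistor(vs, vf, i_f)
print(f"Calculated resistor: {r_min:.0f} ohms")  # 150 ohms here

# Per the two reasons above, pick the next standard value *above* the
# calculation (e.g. 180 or 220 ohms) to keep the current under the max.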

iceng

Answer 6 years ago

+1

mrandle

6 years ago

Tons of stuff here on Instructables. I would take a trip to the dollar store and buy something cheap with LEDs to reverse engineer or play with first. I would also recommend LED strips; they are pretty much made for use in display cases, can be cut to length, and are pretty idiot-proof. I used them in one of my Instructables: https://www.instructables.com/id/Parking-Meter-Lamp/.

-max-

6 years ago

You can power LEDs with any constant-current source that supplies the correct current (10mA for standard 3mm-5mm LEDs, 40mA for bright white LEDs, >100mA for power LEDs, etc.).

Powering LEDs directly from a voltage source is a bit riskier, as even the smallest change in voltage will cause a large change in current! To fix this, a small resistor is needed to limit the current. To figure out what value, just do this:

Say you want to power a 5mm blue LED. It needs about 3.3V to shine brightly, at about 30mA. For a different LED, look at the I-V chart (http://www.elecfans.com/article/UploadPic/2009-5/2...) in the datasheet for your LEDs.

Knowing what current it needs, see what the corresponding voltage is, and assume the LED will always drop that voltage when in series with a resistor. Then calculate the voltage across the resistor (battery voltage minus LED drop) and use Ohm's law to find a resistor that will allow just the right amount of current to flow. Or you could cheat and have the internet do it for you!

So here is how to do it: say you have a 5V supply, but an LED that needs 3.3V @ 20mA for the most light. First subtract 3.3V from 5V, so we know the resistor has to drop 1.7V when 20mA of current flows through it. Since V = IR, crunch the numbers: 1.7V = 0.02A * R; R = 1.7 / 0.02 = 85 ohms.

This is the minimum resistance needed. However, it is best to round up to 100 ohms and sacrifice some current rather than round down and burn out the LED.

This is exactly what http://led.linear1.org/1led.wiz does. It also returns 100 ohms!
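If you would rather script the arithmetic than use the wizard, here is a small Python sketch of the same calculation. The list of standard resistor values is just an illustrative subset of the E12 series, not exhaustive.

# Same math as above: the resistor drops (Vs - Vf) at the desired current.
vs = 5.0   # supply voltage (V)
vf = 3.3   # LED forward voltage (V)
i = 0.020  # desired current (A), i.e. 20 mA

r_min = (vs - vf) / i  # 1.7 V / 0.02 A = 85 ohms
print(f"Minimum resistance: {r_min:.0f} ohms")

# Round *up* to the next standard value rather than down, sacrificing a
# little current instead of burning out the LED. (Subset of E12 values.)
standard_values = [82, 100, 120, 150, 180, 220]
r = next(v for v in standard_values if v >= r_min)
print(f"Use a {r} ohm resistor")  # 100 ohms -- matches the wizard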

iceng

6 years ago

Do you plan to plug it into a wall socket or change batteries?