
How do I calculate an LED power supply? Answered

 Okay, I don't know a lot about LEDs, but here's what I've done so far.
I bought some 12 V LED strips and used them to surround a display in my shop. I soldered all the strips together and attached my bench supply. The LEDs seem pretty forgiving, and I can run them at a reasonable brightness at anything between 8.3 and 12 volts.
At 8.3 volts they draw 0.065 amps, and at 12 volts they draw 0.65 amps.
I have no idea why they draw less current at a lower voltage, unless the ammeter on my PSU is faulty.
I now want to attach a power adapter. I have a 12 V 500 mA supply but am worried it won't cope.
I also have a 12 V 1.25 amp supply, but I'm guessing this may push up the voltage and burn out the LEDs.
Do I really need to buy a 12 V LED driver, or can I get away with a cheap non-switching adapter? And if so, do I need to go for something like 12 V 650 mA?



Best Answer 8 years ago

Voltage is like pressure: think of it as the steepness of a waterfall (the potential difference). The steeper the gradient, the faster the water flows (overall flow rate = amps).

More pressure = more overall flow, so at a higher voltage the strip draws more current. Put another way, the resistors in the strip pass more current at the higher voltage, so the LEDs glow more brightly (since they see more current).

That aside, if it draws 0.65 amps, that's 650 milliamps, which is too much for a 500 mA supply. If it draws 0.065 amps, that's 65 milliamps. Check your units :D

Conclusion: if it needs 650 mA, go with something capable of 1000 mA or more. Running a supply at or near its rated limit will shorten its lifespan. I'm not saying get a monster 20 amp power supply and run it at 5% of its capacity, but get something with a little headroom over the requirement; it will run more efficiently and last longer that way.
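The headroom rule above can be sketched as a quick calculation, using the 650 mA figure from the thread and an assumed ~50% headroom factor (the exact factor is a judgment call, not a spec):

```python
# Rough power-supply sizing check. The 0.65 A draw comes from the
# thread; the 1.5x headroom factor is an assumption, not a standard.

load_current_a = 0.65      # measured draw of the strips at 12 V
headroom_factor = 1.5      # assumed: keep the supply at <= ~2/3 capacity

min_supply_a = load_current_a * headroom_factor
print(f"Choose a 12 V supply rated for at least {min_supply_a:.2f} A")
```

By this rule a common 1 A (1000 mA) 12 V adapter covers the load comfortably, while the 500 mA one does not.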

Edit: I just saw your reply to Steve... I understand it's 650 mA now.

Restating: get one rated for at least an amp.

Your problem is simple; all you need to do is use this equation:
power = voltage times current (amps); if the current is in milliamps, divide by 1000 to get watts.
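Plugging in the figures from the thread (12 V and 650 mA), the equation above works out like this:

```python
# Power drawn by the strips: P = V * I.
# The current is in milliamps, so divide by 1000 to convert to amps.

voltage_v = 12
current_ma = 650   # measured draw at 12 V, from the thread

power_w = voltage_v * current_ma / 1000
print(f"{power_w} W")  # 7.8 W
```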

Try using Ohm's law: V = IR. Your 12 V supply = V, the current you want to limit it to (say 30 mA) = I, and R is what we want to find. So 12 V / 0.03 A = 400 ohms. I'm only 13, so please correct me if I'm wrong.
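One correction worth sketching: for a bare LED you'd subtract the LED's forward voltage drop from the supply voltage before dividing (the 12 V strips in the question already include their resistors, so this isn't needed there). The forward voltage here is an assumed typical value, not from the thread:

```python
# Series-resistor sizing for a single bare LED (a sketch only;
# 12 V LED strips already have resistors built in).
# R = (V_supply - V_forward) / I_target

v_supply = 12.0     # supply voltage
v_forward = 2.0     # assumed: typical forward drop for a red LED
i_target = 0.030    # 30 mA target current

r_ohms = (v_supply - v_forward) / i_target
print(f"Use roughly a {r_ohms:.0f} ohm resistor")
```

In practice you'd round up to the next standard value (e.g. 330 or 390 ohms depending on how much margin you want).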

Do you mean 0.65 AMPS at 12 V?

Your strips must have their own resistors inside, so provided your supply can deliver 12 V at AT LEAST 650 mA, you'll be fine. The output voltage may be a little more than 12 V on load, but I doubt it'll be too bad. Try it and measure it.


 But if I use a 12 V 1.25 amp PSU (the one I have is an old one from a scanner), will it supply a higher voltage since it will be drawing half the current? I don't want to blow my new display.

It depends on what kind of power supply it is. If it's UNregulated, the output WILL be a bit higher than you expect, but if it's half-decent (read: heavy), it won't be more than about 5-10% higher, which will be OK. Measure it; running the strips for a couple of seconds while you do a sanity check with the meter won't damage anything.

What do the makers give as a spec ? 

If you were to run the strips off a "12 V" car battery, the terminal voltage can be up to 13.6 volts, so they're not really expecting to see exactly 12.000 V.


 Thanks Steve, I'll do that. As you say, it won't take long to measure the voltage.