
Run a few 1W LEDs using a transistor from 3.7V? Answered

Hi, I want to control 5 x 1W LEDs using an LDR.
It will run from a 3.7V 4500mAh battery.
I already have the LDR circuit.

The problem I face is that ordinary transistors can't handle 5W of LEDs (5 x 1W); they just heat up.
I tried some high-power transistors, but they seem to deliver less voltage to the LEDs.

I need a transistor that can be used as a switch, works well at 3V, and can handle a maximum of 5W.

Anyone have any idea?


My idea was to build a simple "portable lamp" which turns on automatically when the power or the light goes off.

You guys are talking about much more advanced levels.

I have frequent power failures here, and I am running 5 x 1W LEDs directly from a 3.7V 2000mAh battery as an emergency lamp.

All the components are attached to a small heatsink.

It works continuously for about 1 hour, and I have been using it for 4 months.

None of the LEDs are dead (yet).

Here is a quick block diagram/schematic of a circuit I designed that will do what you want it to do:


* Power LED(s) when AC mains power is lost.

* Must be operational for 1 hour continuously.

* Must be portable


* ???? — give me a few more requirements like these. (Would you be willing to buy a couple of 2000mAh cells, different LEDs (like one 10W Cree XML2 LED), etc.? Without a few of these, I can't do much better than a simple block diagram, since there are so many different routes that can be taken.)


Assuming you drive the LEDs for 1 hour continuously at 5W, you will need 5W * 1h = 5Wh. Your battery stores about 8.14Wh, and assuming (guesstimating) a 40% loss to heat and inefficiency (60% efficiency), that leaves 4.88Wh of usable energy. A little low, so your LEDs may not quite last the full hour, but close enough. You definitely want the LED driver to be at least 60% efficient in your application.
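That energy budget can be checked with a short script. The cell capacity (2200mAh, implied by the 8.14Wh figure), nominal voltage, 5W load, and 60% efficiency are all assumptions taken from the paragraph above:

```python
# Rough runtime estimate for the LED lamp. All figures are the ones
# assumed in the discussion above: a 3.7 V / 2200 mAh cell (~8.14 Wh),
# a 5 W LED load, and ~60% overall efficiency.

def runtime_hours(capacity_mah, nominal_v, load_w, efficiency):
    """Return estimated runtime in hours for a given battery and load."""
    energy_wh = (capacity_mah / 1000) * nominal_v   # total stored energy
    usable_wh = energy_wh * efficiency              # after converter losses
    return usable_wh / load_w

print(round(runtime_hours(2200, 3.7, 5.0, 0.60), 2))  # ~0.98 h
```

As the answer says, that comes out just under the 1 hour target, so the efficiency figure matters.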

For the "lithium battery charger", just find a decent linear charger and charge the battery at 500mA-1000mA. Watch this if you want to learn more about choosing the right type: https://www.youtube.com/watch?v=A6mKd5_-abk

Digikey has a wide selection: http://www.digikey.com/product-search/en?pv412=31&pv412=3&pv412=36&pv412=6&FV=fff40027%2Cfff801b0%2C1140050&k=charger&mnonly=0&newproducts=0&ColumnSort=0&page=1&quantity=0&ptm=0&fid=0&pageSize=25

blackout-sensitive LED driver.png

You were not turning the transistor on properly. What circuit did you use? What is the forward voltage drop of the LEDs?

Thanks for the reply.

The LED spec says: DC forward current: 350mA

The circuit doesn't matter, because I am not asking for help with the LDR circuit.

I just want a transistor that will work at 3 to 3.7V as a switch for 5 x 1W LEDs!

I am attaching an example circuit, which will explain my question further.


That circuit will destroy your LEDs if you ever succeed in turning them on properly. You have several problems: first, you have no current limiting on the LEDs; second, you haven't got enough voltage to drive the LEDs; third, you haven't got a way of turning the transistor on properly.

If I assume these are 3.7V LEDs, when on they should be run at 350mA. You should put them in series, so with 5 you need 5 x 3.7 = 18.5 volts, plus a bit more, say another 5, as a minimum for current regulation. To current-limit them you then need a 5/0.35 ≈ 15 ohm, 2 watt resistor, which will get pretty hot.

If you insist on parallel operation, I recommend a minimum of 6V, leaving you with five 1W resistors, say 8.2 ohms each, wasting about 5W.
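The series-string sizing above can be sketched numerically. The 3.7V forward drop, 350mA current, and 5V of resistor headroom are the figures assumed in this answer:

```python
# Series-string resistor sizing, using the assumed figures from the
# answer above: five 3.7 V LEDs at 350 mA, with ~5 V of headroom left
# across the current-limiting resistor.

I = 0.350                     # LED current, A
v_led = 3.7                   # assumed forward drop per LED, V
n = 5
v_string = n * v_led          # 18.5 V across the series string
v_headroom = 5.0              # voltage dropped across the resistor
r = v_headroom / I            # ~14.3 ohms -> use a standard 15 ohm part
p = I**2 * r                  # 1.75 W -> use a 2 W resistor
print(f"supply ≈ {v_string + v_headroom} V, R ≈ {r:.1f} Ω, P ≈ {p:.2f} W")
```

That 1.75W of resistor dissipation is why the answer warns it "will get pretty hot".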

The transistor should be a nice low-resistance MOSFET, like the IRF540 or similar, but make sure your LDR circuit snaps the transistor fully on and off and doesn't push it into linear operation.

Thanks for the reply steveastrouk, I will keep these ideas in mind.

The LEDs work at min 3.0V, max 3.6V.

I used to run these LEDs (4 of them) directly from a 3.7V 5000mAh battery for more than 1 hour daily.

It works without any problem. I am using a good heatsink :-)

As for the transistor, I can connect any other switching circuit to it so that it will work.

I will use the resistors, but they will waste a lot of power....

It is malpractice to run LEDs like that. Since LEDs are heavily non-linear, they are *very* sensitive to voltage: the smallest change dramatically changes the current through them. Just half a volt can be the difference between nicely bright LEDs and the magic smoke being released!

Here is a IV curve for a diode: http://static.newworldencyclopedia.org/e/ec/Rectifier_vi_curve.GIF

Also, note that the maximum voltage on the battery is 4.2V, and the minimum voltage without damaging the battery is 3.0V.

A small resistor should be in series with each LED. Otherwise some LEDs will light up brighter than others, and some may not light at all, because of the tolerance between the LEDs: each one may need a slightly different voltage to reach the same brightness. If the tolerance is too loose, you also run the risk of frying the brighter LEDs.

Instead of just a transistor to turn them ON and OFF plus a bunch of lossy resistors, you could use a 0.35A constant-current sink or source (switch-mode type) connected to each LED individually. This ensures the LEDs carry the same current, so brightness differences will not be noticeable. However, it requires a lot of parts: five switch-mode regulators plus all the necessary inductors, capacitors, resistors, diodes and other supporting circuitry. It will be expensive but work well. If you want to go through all that trouble, use a proper switch-mode LED driver for each LED.

Otherwise, just use a series resistor for each LED and a BJT wired in the common-emitter configuration. To calculate the resistance, use Ohm's law, taking into account the voltage drops of the LED and the BJT: subtract 0.6V for the BJT drop and 3.6V for the LED drop.


(V_battery - V_led_drop - V_bjt_drop) = I*R

(4.2V - 3.6V - 0.6V) = 0.35A * R

solve for R:

0V / 0.35A = 0 ohms.

So although technically no resistor is required, add a 0.5 ohm resistor anyway to balance the LEDs. Working backwards to figure out the voltage on the LEDs, we see that:


(V_battery - V_led_drop - V_bjt_drop) = I * R

(V_battery - V_led_drop - 0.6V) = 0.35A * 0.5Ω

V_battery - V_led_drop - 0.6V = 0.175V

V_battery - V_led_drop = 0.175V + 0.6V = 0.775V

V_led_drop = V_battery - 0.775V

So if the battery is fully charged, V_battery = 4.2V, and the LED will always see ~0.78V less than the battery. Since the LEDs need at least 3.0V, that means the minimum battery voltage is about 3.78V, which is not very good: there will still be a lot of usable energy left in that 'dead' cell.
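The single-cell arithmetic above can be collected in one place. The 0.6V BJT drop, 0.5 ohm balancing resistor, 350mA current, and 3.0V minimum LED voltage are the figures assumed in this answer:

```python
# Dropout budget for one LED driven from a single lithium cell, using
# the assumed figures from the calculation above: 0.6 V BJT drop,
# 0.5 ohm balancing resistor, 350 mA LED current, 3.0 V minimum LED
# forward voltage.

I = 0.350
v_bjt = 0.6
r_bal = 0.5
overhead = v_bjt + I * r_bal            # 0.775 V lost above the LED
v_led_min = 3.0                         # LED barely conducts below this
v_batt_min = v_led_min + overhead       # minimum usable cell voltage
print(round(v_batt_min, 3))             # 3.775 -> most of the 3.0-4.2 V
                                        # discharge curve goes unused
```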

Conclusion: You need a higher voltage and larger resistances and/or a proper LED driver.

Whilst you're largely right, putting all the LEDs in series means you would only need ONE switcher.

It's not a good idea to drop so little across the resistor if you are intending to get "constant" current.

Well, the resistor is not there so much for constant current as to simply help balance the LEDs. The voltage should be higher, and the resistance higher as well, to help keep the LEDs running at the same brightness. The resistors act like really crude constant-current sinks/sources.

A slightly better method may involve using this constant current sink on each LED.


Also, if you get a driving voltage of 5V or higher, you can use a MOSFET, which has an ON-state resistance instead of a fixed voltage drop. The math is slightly different since it is a resistance rather than a voltage drop. I consider a MOSFET the better choice. I think a 3-4.2V-to-5V boost converter would work well, with a MOSFET as the switch. Boost converters usually cannot deliver very high currents, so the LEDs gain a little protection.

If you do plan to use a boost converter, though, you can wire all the LEDs in series and the current requirement will be lower. The voltage requirement will be 15-18V, and if the boost converter does not go above 18.0V, no resistor is needed. Better yet, get a booster with a settable current limit and set it to 350mA. Each LED will then carry exactly the same current, though not necessarily the same voltage.
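A back-of-envelope check of that series/boost option, using the 3.0-3.6V per-LED range from earlier in the thread; the 85% converter efficiency is an assumed round figure, not from the discussion:

```python
# Series/boost option: five LEDs in series at 3.0-3.6 V each need
# roughly 15-18 V, and at 350 mA the input current drawn from a single
# ~3.7 V cell is substantial. The 85% converter efficiency here is an
# assumption for illustration.

n, i_led = 5, 0.350
v_min, v_max = n * 3.0, n * 3.6        # 15.0 V to 18.0 V string voltage
p_out = v_max * i_led                  # ~6.3 W worst-case LED power
eff, v_in = 0.85, 3.7                  # assumed efficiency, cell voltage
i_in = p_out / (eff * v_in)            # ~2.0 A drawn from the cell
print(f"string: {v_min}-{v_max} V, Pout ≈ {p_out:.1f} W, Iin ≈ {i_in:.2f} A")
```

A ~2A continuous draw is well within what a healthy 18650-class cell can supply, but it shows why converter efficiency matters for the runtime budget.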

You can readily buy a current-controlled LED buck driver; if you go the switcher route, why mess about?

Using a MOSFET makes no difference to the final math. The MOSFET is being used in its linear mode, so it doesn't have an "on" resistance.

Boost converters can deliver whatever they're designed to. They are often slightly less efficient than buck converters, because the currents in everything are higher.

According to every MOSFET datasheet I have seen, there is an ON-state resistance for MOSFETs (at least for enhancement-mode N-channel types) even when the gate is fully driven. It is often negligible (around 100 milliohms); for example, the small TO-92 2N7000 has a "static on-state resistance" of ~1.5 ohms at a Vgs of 10V and a drain current of 500mA. Larger MOSFETs which handle higher currents, like the IRF540, have much lower ON-state resistance.
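The on-resistance point above can be illustrated with the conduction-loss formula P = I² × Rds(on). The 1.5 ohm figure is the 2N7000 value quoted above; the ~0.05 ohm value for an IRF540-class part is an assumed round figure for comparison:

```python
# Conduction loss in a fully-on MOSFET switching the five parallel
# 350 mA LEDs. Rds(on) = 1.5 ohm is the 2N7000 figure quoted in the
# thread; 0.05 ohm is an assumed round value for an IRF540-class part.

def conduction_loss(i_amps, rds_on):
    """Power dissipated in a fully-on MOSFET carrying i_amps."""
    return i_amps**2 * rds_on

i = 5 * 0.350                               # five parallel 350 mA LEDs
print(round(conduction_loss(i, 1.5), 3))    # 2N7000: ~4.6 W, far too hot
print(round(conduction_loss(i, 0.05), 3))   # IRF540-class: ~0.15 W
```

This is why the small TO-92 part "just heats up" at this current while a power MOSFET barely warms.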

And yes, a proper LED driver would definitely work; I am just giving as many possible solutions as I can, since I do not know many of the requirements and restrictions/limitations.

As you stated, and as I showed with the math above, with one lithium cell (which I consider a hard limit) voltage cannot be sacrificed: the dropout will likely be too great and the cell's charge will not be adequately used. In this case, I currently think the best solution is a voltage- and current-regulated boost converter with the LEDs in series. Of course, that assumes the circuit has to be powered by only one lithium cell.

I thought you were talking about the linear current-limit circuit you cited. If you are using them purely as a switch, then yes, we can consider them to have an "on" resistance.

Thank you for your reply friend.

It is very detailed and useful.

I will first try to increase the input voltage and apply your suggestions.

Thanks again.

The LEDs aren't turning on fully on that supply, since the transistor is dropping a volt or so on its own.

Putting LEDs in parallel is very bad practice; they may die without much notice. It's nothing to do with the heatsinking.