Introduction: Adding a Dimmer to a Power LED Driver

I bought a 50W power LED with the associated driver (constant current) and wanted to add a dimming function to it. This instructable describes the design choices that led to a working solution.

Nothing too complicated in this project, just one or two details to manage properly around the constant current driver and its associated "high" voltage.

The only real problem I faced is that I wanted to build a circuit that switches the LED current OFF and ON (a PWM signal) to dim the LED, while also powering that circuit from the constant current driver itself, which makes for a highly variable supply voltage.

The PWM circuit itself is based on a simple 555 IC building block (http://www.circuitsgallery.com/2013/02/pwm-led-dimmer.html), with a MOSFET on the low side in place of the directly driven LED.

Step 1: Measure the Driver Max Voltage

The driver converts mains power (220 V AC here) into a constant current at a variable voltage, so I first have to find the maximum voltage the supply can output. As a constant current source, the driver will raise its output voltage as the load resistance increases in order to maintain a 1350 mA current (U = RI, simple Ohm's law), so when R is at its maximum the voltage will also be at its maximum. Of course, maximum R is obtained with an open circuit.
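As a quick sanity check of that reasoning, here is a minimal sketch of an ideal constant current source clamped at the driver's maximum output voltage. The 1.35 A figure is the driver's rated current; the 42 V limit is the open-circuit value measured below; the load resistances are arbitrary illustrative values, not measurements.

```python
# Minimal sketch of an ideal constant-current driver clamped at its maximum
# (open-circuit) voltage. The load resistances are illustrative only.
I_OUT = 1.35      # driver output current, A
V_MAX = 42.0      # driver maximum (open-circuit) voltage, V

def driver_voltage(r_load):
    """U = R * I for a constant-current source, limited by the driver's maximum voltage."""
    return min(r_load * I_OUT, V_MAX)

for r_load in (10, 20, 30, 1e6):          # 1e6 ohm stands in for an open circuit
    print(f"R = {r_load:>9.0f} ohm -> U = {driver_voltage(r_load):4.1f} V")
```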

This means that the driver will push the voltage to its maximum whenever the PWM signal goes low, and that the supply voltage will swing between the nominal and maximum values at the PWM frequency.

I measured about 42 V DC when the driver was powered with no LED attached, and that is the value I'll use for the regulation.

Step 2: Regulate Voltage for the Circuit

Now that I know the maximum voltage, I can find a way to regulate the supply for the circuit. Actually, I just need it to stay under the 555's maximum supply voltage (16 V). 42 V is also way too high for a 78xx regulator.

The PWM circuit is really low power: with a CMOS 555 driving a MOSFET gate, it should draw less than 1 mA. A linear regulator should be fine even though the voltage drop is high. Since there are no sensitive electronics here and the PWM frequency/time constant doesn't need to be very accurate, the regulation doesn't need to be precise or particularly stable, so a crude solution will do.

Given these two points I settled on a 12 V zener diode, but just to stay on the safe side I paired it with an NPN series pass transistor. That should give an output voltage of about 12 - 0.6 = 11.4 V. I used a BD139 I had lying around, which has a maximum VCE/VCB of 80 V, more than enough for the 42 V - 12 V = 30 V drop it has to sustain. As for power dissipation, with the expected 1 mA circuit consumption the NPN will have to dissipate at most 30 V × 1 mA = 30 mW, so no heat sink is needed. In fact, I measured an actual consumption well below 1 mA (680 µA from what I recall).

With a 10k resistor, the zener diode will carry less than 3 mA, which is the current through the resistor (a drop of 30 V at most, across 10k), minus the current going into the transistor base... so it will dissipate at most 12 V × 3 mA = 36 mW. I think the diode is rated for 125 mW, so that should be fine.
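To keep the regulator numbers in one place, here is a short back-of-the-envelope check using the same values as above (the 0.6 V base-emitter drop and the 1 mA load are the estimates from the text, not measurements):

```python
# Back-of-the-envelope check of the zener + NPN pass regulator described above.
V_IN   = 42.0     # worst-case driver voltage, V
V_Z    = 12.0     # zener voltage, V
V_BE   = 0.6      # estimated base-emitter drop of the BD139, V
R_BIAS = 10e3     # zener bias resistor, ohm
I_LOAD = 1e-3     # estimated circuit consumption, A (measured closer to 680 uA)

v_out   = V_Z - V_BE                  # ~11.4 V at the emitter
p_pass  = (V_IN - v_out) * I_LOAD     # dissipated in the pass transistor, ~30 mW
i_bias  = (V_IN - V_Z) / R_BIAS       # current through the 10k resistor, upper bound on zener current
p_zener = V_Z * i_bias                # worst-case zener dissipation, ~36 mW

print(f"Vout ~ {v_out:.1f} V")
print(f"Pass transistor: ~{p_pass * 1e3:.1f} mW")
print(f"Bias current: ~{i_bias * 1e3:.1f} mA, zener: ~{p_zener * 1e3:.0f} mW")
```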

I wouldn't recommend this kind of crude regulation for sensitive components such as microcontrollers, or at higher currents, given the large voltage drop, but for this use it's just fine.

Step 3: PWM Circuit

The PWM circuit chosen is really simple and quick to build. It has one drawback though: the duty cycle doesn't go all the way from 0% to 100% (more like 1% to 99%).
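To get a feel for where that 1%–99% limit comes from, here is a rough estimate using the usual 555 astable approximations for a diode-steered potentiometer stage like the one linked above. The component values (100k pot, 10 nF timing capacitor, ~1k of residual resistance in each charge/discharge path) are assumptions for illustration, not values taken from my build, and the diode drops are ignored:

```python
# Rough duty-cycle and frequency estimate for a diode-steered 555 PWM stage.
# Component values are illustrative assumptions, not measured from this build.
C_T   = 10e-9     # timing capacitor, F
R_POT = 100e3     # potentiometer total resistance, ohm
R_END = 1e3       # effective minimum resistance in each charge/discharge path, ohm

def pwm(setting):
    """555 astable with steering diodes: charge through one side of the pot, discharge through the other."""
    r_charge    = R_END + setting * R_POT
    r_discharge = R_END + (1.0 - setting) * R_POT
    t_high = 0.693 * r_charge * C_T
    t_low  = 0.693 * r_discharge * C_T
    return t_high / (t_high + t_low), 1.0 / (t_high + t_low)

for setting in (0.0, 0.5, 1.0):
    duty, freq = pwm(setting)
    print(f"pot at {setting:.0%}: duty ~ {duty:5.1%}, f ~ {freq:4.0f} Hz")
```

With these assumed values the duty cycle bottoms out around 1% and tops out around 99%, while the frequency stays in the low kHz range, which matches the behaviour described above.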

Let's be clear: we are talking about a 50 W white LED that I plan to use for indoor lighting. Way overkill, if you ask me, so I won't mourn the 1% loss of power at the high end. On the other hand, a 1% minimum duty cycle on a 50 W LED is a lot of light for a supposedly "OFF" position. So I chose a potentiometer with an embedded switch, so the circuit can be turned off completely when the potentiometer is at the low end. This is visible on the second schematic (the switch).

Side note: in my other power LED projects I used the MOSFET with an NPN and a shunt resistor to make a basic constant current driver. That makes the MOSFET a variable resistor which may dissipate a lot of power, often requiring a heat sink. Here the constant current is provided by the driver on the high side, so the MOSFET is only used as a switch, either ON (22 mΩ resistance) or OFF (essentially infinite resistance). It therefore dissipates very little power (R × I² = 0.022 Ω × (1.35 A)² ≈ 40 mW when it's ON, plus the losses during the ON/OFF transitions) and does not require a heat sink.
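For reference, the conduction loss is just the on-resistance times the square of the LED current, using the 22 mΩ and 1.35 A figures from the text (switching losses during the transitions are not included):

```python
# Conduction loss of the MOSFET used as a plain switch.
R_DS_ON = 0.022   # on-resistance, ohm (from the text)
I_LED   = 1.35    # LED current, A (from the driver spec)

p_on = R_DS_ON * I_LED ** 2      # P = R * I^2, about 40 mW
print(f"Conduction loss: ~{p_on * 1e3:.0f} mW")
```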

Step 4: Breadboard Test

These pictures show the test I made on a breadboard. You can see the driver on the left and the LED (not yet glued to its heat sink; my tests only ran for a few seconds, don't try this at home ;) ).

This instructable is only about the design and the steps to get to this solution, so I will stop here.

I suppose I'll solder it on a proto board and try to find/make a suitable box but that's another story.

Hope you like it and find it useful.