
How can I get a suitable voltage from a 5V power supply? Answered

Hi all,

I've got a laser diode, with the following specifications:

"Power Output CW 500mW
Working Current: I < 350mA
Working Voltage: 2.2V"

And a laser diode driver:

"1. Constant current Stable voltage IC circuit, laser diode can be better protected without being damaged
2.Constant current output , Can be adjusted(0~580mA)
3.DC supply voltage input 3.0~4.2V.
4.Suitable for 808nm 100mw~500mW laser diode."

The combination of these two things will be switched on and off by an Arduino (probably using a relay, as the driver board has a switch connection). This may change, but I don't imagine it will matter.


I noticed that the 'DC supply voltage input' says 3.0-4.2V. My power supply will be 5V (I don't imagine that amps matter, but if they do, it has a max output of 34A). What would the best/safest way to do this be? I don't want to burn out my laser because they're comparatively expensive, and I'd like it to be able to run relatively continuously (probably multiple minutes of continuous usage).

(For those of you wondering, I'm one of the many people making a laser engraver/cutter from scratch)


I think the tricky part is going to be setting up your constant current regulator, and adjusting it to the current you want.

Note that your expensive laser diode is NOT part of the circuit while you are adjusting the regulator.

For adjusting the regulator, use some other load in place of the laser diode, one that will dissipate a similar amount of power at the desired current level.

Supposing the desired constant current is 200 mA = 0.2 A, a 10 ohm resistor with that much current flowing through it would give a voltage drop of 2.0 volts, since (0.2A)*(10 ohm) = 2.0 V, and that's approximately the same voltage your laser diode would have across it while it is lasing.
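As a sanity check on that arithmetic (the 0.2 A set-point is just the example value from above), here is a quick sketch:

```python
# Size a dummy-load resistor that mimics the laser diode at the set current.
I_set = 0.2    # desired constant current in amps (example value)
R_load = 10.0  # candidate test-load resistor in ohms

V_drop = I_set * R_load       # Ohm's law: 0.2 A * 10 ohm = 2.0 V
P_load = I_set**2 * R_load    # power the resistor must dissipate

print(f"V across load: {V_drop:.1f} V, power: {P_load:.2f} W")
```

Note the 0.4 W dissipation: a common 1/4 W resistor would overheat here, so use a 1/2 W (or larger) part, or parallel several smaller resistors.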

Also I suggest using a 1 ohm resistor, in series, as a current sensing resistor. That is to say, you put the probes of your voltmeter across this 1 ohm resistor, and use that as your method for measuring current. Every 100mV of voltage across this 1 ohm resistor corresponds to (you guessed it) 100mA of current through the resistor. E.g. if the desired constant current is 200 mA, then the 1 ohm current sense resistor should have a voltage of exactly 200 mV = 0.2 V across it.
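The millivolts-to-milliamps trick is easy to capture in code (the function name here is mine, for illustration):

```python
R_SENSE = 1.0  # ohms; current-sense resistor wired in series with the load

def current_from_sense(v_millivolts, r_ohms=R_SENSE):
    """Current in mA implied by the voltage measured across the sense resistor."""
    return v_millivolts / r_ohms  # with 1 ohm, mV and mA are numerically equal

print(current_from_sense(200))  # 200 mV across 1 ohm -> 200 mA
```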

Then, in series with the 1 ohm current sensing resistor, you put your 10 ohm load resistor and the constant current regulator. Then adjust the constant current regulator, however you do that, by turning a pot or whatever, until you get the current you want.

Then confirm the regulator is actually working by changing the size of the load. For example, if you short that 10 ohm resistor with another 10 ohm resistor (effectively changing it to a 5 ohm resistor), then the regulator should respond instantly in such a way as to keep the current the same. E.g. the voltage across those two parallel 10 ohm resistors will instantly drop to half the value that was previously across just one of them.
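The expected numbers for that regulation test can be sketched like this (a check of the arithmetic, not of the hardware):

```python
def parallel(*resistors):
    """Equivalent resistance of resistors connected in parallel."""
    return 1.0 / sum(1.0 / r for r in resistors)

I_set = 0.2                      # regulated current in amps (example value)
R_before = 10.0                  # single 10 ohm load
R_after = parallel(10.0, 10.0)   # second 10 ohm across it -> 5 ohms

V_before = I_set * R_before  # 2.0 V
V_after = I_set * R_after    # 1.0 V: voltage halves, current stays put
print(V_before, V_after)
```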

Upon request, I'll draw you a picture of this test setup, if that will make what I'm describing here clearer.

The point is, you want to get your constant current regulator adjusted to the right current first, before connecting your laser diode to it.

It doesn't answer my question, but thanks for this; it's probably something I would have failed to do. You probably just saved me money.

Can I just use a multimeter's 'ammeter' function while testing the unit, or will that introduce some error?

To be on the safe side, I'd aim for close to 300mA (I assume that 50mA under the recommended limit is enough? Or should I go for 200mA?). That means I would need (to make) a 7.5 ohm resistor ([2x10 ohm in parallel] in series with [4x10 ohm in parallel]). According to 'my' (Wolfram's) calculations, that should give 293.3mA at 2.2V.

Does that sound correct?
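That resistor-network arithmetic checks out; a quick script to confirm it:

```python
def parallel(*resistors):
    """Equivalent resistance of resistors connected in parallel."""
    return 1.0 / sum(1.0 / r for r in resistors)

# [2x 10 ohm in parallel] in series with [4x 10 ohm in parallel]
R = parallel(10, 10) + parallel(10, 10, 10, 10)  # 5.0 + 2.5 = 7.5 ohms
I = 2.2 / R  # current at the diode's 2.2 V working voltage

print(f"R = {R:.1f} ohm -> I = {I * 1000:.1f} mA")
```

Keep in mind that in the final circuit the regulator, not the resistance, sets the current; the network is only a stand-in test load.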

Multimeters measure current with a shunt resistor and introduce what's called a 'burden voltage.' Also remember ammeters go in series with the load. If you have a separate voltmeter, measure the voltage directly across the load (laser driver) instead of the power supply.

Regarding the size of the resistor for the test load, your math looks good to me. Although I think the important part of the test is confirming the current stays regulated, even when the load changes. I mean if the regulator can supply 0.3 A across a 10 ohm resistor (at 3.0 V), and also supply 0.3 A across a 5 ohm resistor at 1.5 V, I am guessing it will comfortably supply 0.3 A, at all the points in between 3.0 V and 1.5 V. So the same test I mentioned before, 10 ohms and then 5 ohms, should work too.

By the way, the reason I was sort of imagining the test as starting with some big resistor, and then quickly making it smaller, is to prove the regulator can quickly decrease the voltage it is supplying. I am imagining the actual laser diode will require slightly less voltage, as it warms up, to maintain the same level of current.

Ideally this is what the constant current regulator does: it watches the current, and it gives the load whatever voltage it needs (within some finite range, e.g. 0 to 5V) to keep the current constant.

The multimeter's ammeter function can be used to measure current. However, doing it that way, it is necessary to place the meter in series with the load, and disconnecting the meter interrupts the current, because the current must necessarily flow through the meter.

I was thinking a 1 ohm current sensing resistor would be more convenient. In fact I was thinking it would be convenient to leave it in place as a permanent part of the circuit. Also, if one side of that 1 ohm resistor was connected to ground, then you could maybe even measure the voltage across it using a spare Arduino input.
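For that Arduino-input idea, the conversion from an ADC reading to current is one line; here is a sketch of the math in Python (the 10-bit resolution and 5.0 V reference are assumptions, matching a classic Uno-style board):

```python
V_REF = 5.0     # assumed ADC reference voltage
ADC_MAX = 1023  # full-scale count of a 10-bit ADC
R_SENSE = 1.0   # ground-referenced current sense resistor, ohms

def adc_to_amps(raw_reading):
    """Current through the sense resistor implied by a raw ADC count."""
    volts = raw_reading * V_REF / ADC_MAX
    return volts / R_SENSE  # with 1 ohm, amps equal volts

# 300 mA -> 0.3 V across 1 ohm -> a raw reading of about 61 counts
print(round(adc_to_amps(61), 3))
```

One count is about 4.9 mV, i.e. roughly 5 mA per step with a 1 ohm sense resistor, so this is a coarse monitor rather than a precision measurement.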

By the way, leaving the current sense resistor in place changes things slightly. I mean, instead of just the laser diode, the regulator sees the laser diode plus the voltage drop across the current sense resistor, which could be as large as 0.3 V at 300mA, and which corresponds to an extra 0.3*0.3 = 0.09 W = 90 mW of power dissipation.

But here I am naively assuming the constant current regulator does not care about that extra 0.3 volts, or the extra 90 mW of loading.

Of course I was also naively assuming the regulator would not care about having its input at 5.0 V, which I guess is outside of its specified range of 3.0 to 4.2V. So everybody else replying to this query says to put a diode or two in series with it, to drop the voltage from 5 V to 5 minus one or two diode drops, and I guess that's good advice.

It would be more clear if I knew more about this constant current regulator. I mean you did not mention if it was linear, or switching, or what it is.

By the way, if you are interested in building your own constant current regulator, for use with a 5 volt supply, I think this instructable,


had some believable circuits for that. In particular I am thinking of the circuit shown in step 6, the one using a FET and a BJT. Actually, I found that one via a picture of a circuit posted from this page,


but it is convenient to see it in the context of the 'ible it is part of, an 'ible about linear-style, homemade, constant current regulators for driving LEDs.

Also step 8 of the same 'ible shows how to turn this circuit on, off, or PWM it, via a microcontroller input, and that might be meaningful to you since you said you were going to be using an Arduino to run other parts of the larger project.

Also BTW, linear constant current regulators are usually not a good idea for battery powered things, since they are only as efficient as the ratio Vout/Vin (thus (Vin-Vout)/Vin of the input power must be wasted as heat). However for plugged-in projects wasting power is not a super big deal.

Wasting power is wasteful, and may require a heat sink depending on how much power is wasted, but the trade-off is linear regulators tend to be relatively simple to build and have zero switching noise.
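That efficiency trade-off is easy to quantify; a quick sketch using this project's rough figures (5 V supply, about 2.2 V at the load, 300 mA):

```python
V_in, V_out, I = 5.0, 2.2, 0.3  # supply voltage, load voltage, regulated current

efficiency = V_out / V_in      # fraction of input power that reaches the load
P_wasted = (V_in - V_out) * I  # power dissipated as heat in the pass element

print(f"efficiency ~{efficiency:.0%}, wasted power ~{P_wasted:.2f} W")
```

Around 0.84 W of heat is small for a plugged-in bench project, but enough that the pass transistor of a homemade linear regulator may want a modest heat sink.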


Best Answer 4 years ago

While Jack's answer is really good for the adjustment of the current source, it does not help you with your voltage mismatch problem.

I see two options:

1) Use an adjustable voltage regulator (like the LM317), but you will need one with a low dropout voltage (input-output difference).

2) Simpler: use the voltage drop of a diode (or two). A standard 1N4001 silicon diode will have a voltage drop of about 0.7V. Just add two diodes between the 5V output and the current driver's input. That should bring the 5V down to roughly 3.6V. The voltage drop depends on the current flowing, but the input range of the driver should be big enough.

For testing, use some resistors as Jack said.
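Option 2 is simple enough to check on paper, using the 0.7 V rule-of-thumb drop stated above:

```python
V_SUPPLY = 5.0
V_DIODE = 0.7  # rule-of-thumb forward drop for a silicon rectifier diode
N_DIODES = 2

v_driver = V_SUPPLY - N_DIODES * V_DIODE  # voltage left for the driver input
print(f"driver input: {v_driver:.1f} V")  # 3.6 V
assert 3.0 <= v_driver <= 4.2             # inside the driver's specified window
```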

Hey, thanks for the answer!

How do I find out what drop a diode has? I have a few 1N4007S diodes, and I tried hooking up a load in series with one of them and measuring voltage across them, but the drop changed with voltage. Are 1N4007S diodes appropriate?

The only alternative I have (other than buying some 1N4001 diodes) is to use 1N5352B diodes, and I don't know what drop they have either. Would they work just as well?

The 1N4007S should be okay; it's a 1N4001 with a higher reverse voltage rating (not needed here, but it doesn't hurt, either). How to find the drop a diode has: by measuring, or by reading the data sheet. Enter the part number and 'datasheet' in your favoured search engine. If found, select the one from the manufacturer; if you don't know the manufacturer or it is not in the search results, choose any. So, for the 1N4007S, I found:


Look at page 2, figure 1: forward current versus forward voltage. You can read the curve in both directions: voltage to current or current to voltage. You'll need the latter. Select the wanted current (the one that will be regulated by the current driver) on the current (Y) axis. Go right to the curve and then down to the voltage (X) axis. For 200mA, you will find about 800mV; for 300mA, about 820mV. The 0.7V I gave you is just a rule of thumb for silicon diodes.

Turns out my multimeter has a setting for finding the forward voltage of a diode (it uses "voltage about 2.8V, current about 1mA", according to the manual). That came out at 547mV, which is in a much more reasonable range. Is there a sensible way to figure out how much my driver will use? What will happen if I give it too high a voltage?

Yep, the 547mV @ 1mA makes sense. It is not on the diagram of the data sheet, but you can extrapolate the curve a bit.

How much current will flow depends mainly on how high you set the current for the laser. Depending on the design, the driver itself may use some more mA. I would start with a 10 ohm resistor instead of the laser diode and two diodes between 5V and the driver. Worst case, only a few mA will flow and the voltage drop will be around 2 x 550mV = 1.1V, so there will be 3.9V for the driver -> safe. The other extreme: 580mA will flow, the voltage drop will be around 2 x 880mV = 1.76V, so the driver gets 3.24V - still safe.

What happens if the input voltage is too high depends on the design. The driver may be a very good design and just shut off. Or - and that is more likely - some parts will let the magic smoke out.
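Both extremes above can be tabulated quickly; a small sketch using the per-diode drops from this thread:

```python
def driver_input(v_supply, drop_per_diode, n_diodes=2):
    """Voltage left for the driver after the series voltage-dropping diodes."""
    return v_supply - n_diodes * drop_per_diode

# (scenario, per-diode drop): near-zero load current vs. the full 580 mA
cases = {
    "few mA (0.55 V/diode)": driver_input(5.0, 0.550),  # ~3.90 V
    "580 mA (0.88 V/diode)": driver_input(5.0, 0.880),  # ~3.24 V
}
for name, v in cases.items():
    print(f"{name}: {v:.2f} V, in 3.0-4.2 V spec: {3.0 <= v <= 4.2}")
```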

Edit: Apparently I'm bad at holding the multimeter probes in the right spot. The voltage didn't drop but they seem to only have a 60mV drop. Should I go for 1N4001, then?

In that case, you did something wrong. According to the datasheet, the minimal voltage drop is around 600mV. Of course there has to be some load so that a minimal current flows.


4 years ago

I would find a voltage regulator, and configure it to work as a constant-current device. The output can be followed with another stage to set the voltage limit. With those specs, you could probably use two LM317T regulators. Take a look at the schematic I created in MS paint.

The LM317T regulators work by doing whatever is possible (within reason and limitations) to the OUT pin to ensure the voltage between it and ADJ is 1.25V. Use Ohm's law to calculate the resistance: since R1 has to have 1.25V across it, the current that flows through it follows from V = I*R, i.e. 1.25V = I*R1. Just rearrange to solve for R1 after deciding your current. You could also use a rheostat or variable resistor for R1. As for the POT, you can use any potentiometer or resistor divider you choose. A pot will allow you to precisely set the maximum voltage.

There is one more thing to watch out for, though: each regulator has a roughly 2V dropout that you need to accommodate, so make sure the input voltage is 4V higher than the set output. Also, filter caps may be needed; 1uF caps will do fine.
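The R1 arithmetic for the LM317 current-source stage looks like this (300 mA is chosen here only to match the figures discussed earlier in the thread):

```python
V_REF = 1.25  # the LM317 maintains 1.25 V between its OUT and ADJ pins

def r1_for_current(i_amps):
    """Program resistor for an LM317 wired as a constant-current source."""
    return V_REF / i_amps

def power_in_r1(i_amps):
    """Power dissipated in the program resistor."""
    return V_REF * i_amps

I = 0.3
print(f"R1 = {r1_for_current(I):.2f} ohm")  # ~4.17 ohm
print(f"P(R1) = {power_in_r1(I):.3f} W")    # 0.375 W -> use a 1/2 W part
```

4.17 ohm is not a standard value; a 4.3 ohm E24 resistor gives about 290 mA, or you can combine resistors to hit the target more exactly.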

constant voltage-current source.png

Two silicon diodes in series should be just fine, assuming the load is connected.

The voltage drop across a 6A05 rectifier diode is about 1V, so put it in series with the output of the power supply, which will drop the voltage to about 4V. Just make sure the diode can handle the current draw of the driver and laser, which in the case of the 6A05 is 6 amps.

Of course, if you need more power, put two 6A05s in parallel, which could then handle up to 12 amps (though in practice parallel diodes don't share current perfectly, so leave some margin).