
# Laser Driver with Limited Parts? (Answered)

I have been working on a laser engraver for some time now and have just recently gotten the laser parts. I need to make a laser power supply. I currently have a 5-volt fixed regulator that turns 15 V at 380 mA into (obviously) 5 V at roughly 400 mA (the value on the label is not completely accurate). I know that diodes drop about 0.6 V each, so I have a string of about seven of them for testing, giving me 2 V at the same current. The problem I'm having is that I put the proper resistors into the circuit to give 200 mA to the diode, but every time I attach the laser to test, the current always reads 17 mA on my volt meter. Why is this? How can I build a proper laser driver with limited parts (a large supply of resistors and mixed caps, but no adjustable regulator)?


What are the diode's requirements? Laser diodes usually need a constant-current, spike-free supply; lasers are very easy to kill with spikes.

Here's where I bought it from - http://www.ebay.com/itm/Nebulastar-808nm-200mW-High-Power-Laser-Diode-5-6mm-TO-18-Package-/151131716132?pt=LH_DefaultDomain_0&hash=item233026fa24

Turns out I grabbed the wrong link, this is my diode - http://www.ebay.com/itm/Brand-New-808nm-300mW-Taiwan-Laser-Diode-DIY-Laser-Lab-/261075183935?pt=LH_DefaultDomain_0&hash=item3cc94b293f

Change the maths for the 317 circuit: R = 1.22/0.34, and R = 5/0.34 for the 7805 version.

Don't forget to work out the power in the resistor.
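
Those two formulas can be checked with a quick sketch. The 340 mA target and the nominal reference voltages (1.25 V for an LM317, the full 5 V output for a 7805) are assumptions based on the figures quoted in this thread:

```python
# Sizing the current-set resistor for the two suggested circuits.
# Assumptions: ~340 mA target current, 1.25 V nominal LM317 reference,
# 5 V nominal output for the 7805. These are round datasheet figures,
# not measured values.

I_TARGET = 0.34  # amps

for name, v_across_r in [("LM317", 1.25), ("7805", 5.0)]:
    r = v_across_r / I_TARGET      # Ohm's law: R = V / I
    p = v_across_r * I_TARGET      # power in the resistor: P = V * I
    print(f"{name}: R = {r:.1f} ohm, dissipating {p:.2f} W")
```

Note that the 7805 version burns about 1.7 W in its resistor, which is why a later reply recommends the 317 instead.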

Assuming you are reading the ammeter correctly: could you be trying to cut the voltage down from 5 volts to 2 volts because the schematic you are working from shows the laser diode with 2 volts across it? If that is what you are doing, then you are not understanding the circuit and basic Ohm's law. Let me explain how to place resistors to feed any LED or laser diode...

For example, if you attach an LED or laser diode directly to the 5-volt source, the current may exceed what the device is designed for: it might draw 100 milliamperes when the LED is only rated to handle, say, 29 milliamperes. Therefore, we add resistors in SERIES with the LED to reduce the current through it. With the proper resistors, the current and voltage at the LED will be fine: not too high, not too low.

So maybe you are using the string of diodes to knock the voltage down to 2 volts before the laser diode? That is the wrong approach. Just connect the 5 volts to your resistors in series with the laser diode. Keep your milliammeter in series with the circuit as well, set to a low current range.

You might want to repost your question with a diagram of your circuit and your meter placement so we can answer more accurately. My answer is a wild guess.
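
To make the series-resistor arithmetic concrete, here is a small sketch using the numbers from this thread (5 V supply, a laser diode quoted at 2 V, 200 mA target). The exact forward voltage of the real diode is an assumption:

```python
# Series current-limiting resistor: the resistor drops whatever the
# diode doesn't. Values from this thread: 5 V supply, ~2 V forward
# drop quoted for the laser diode, 200 mA target current.

V_SUPPLY = 5.0   # volts
V_DIODE = 2.0    # forward voltage across the laser diode (assumed)
I_TARGET = 0.2   # amps

v_resistor = V_SUPPLY - V_DIODE     # 3 V left across the resistor
r = v_resistor / I_TARGET           # R = V / I -> 15 ohm
p = v_resistor * I_TARGET           # P = V * I -> 0.6 W
print(f"R = {r:.0f} ohm, P = {p:.2f} W (pick at least a 1 W resistor)")
```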

I'm not completely understanding what you are saying. What I am getting from it is that resistors drop both voltage and current? I thought they only affected one or the other. The diode is rated for 2 V, so wouldn't a 5 V supply kill it? I do have a resistor to drop the current below 200 mA, because that's the laser's current limit.

Yes, a 5-volt source connected directly would kill it. That is the purpose of the resistors: they limit the current going through the LED or laser diode. I think you are using series diodes to get down to 2 volts and then also adding resistors to limit the current. That's overkill. All you need is a proper resistor to limit the current; the 2 volts across the LED or laser will take care of itself.

Here is a test you can do. Connect your resistor to the 5-volt supply and measure the current through it (do not connect the laser diode for this test). This just shows how many milliamperes flow through your resistor alone. The actual circuit will have only 3 volts across the resistor and 2 volts across the laser diode (using the 2-volt figure you mentioned).

You must know the current the laser can handle, and never exceed it. Calculate your expected milliamperes with Ohm's law, and the result gives you the resistor value. Keep in mind that the resistor must handle the wattage as well; resistors come in many wattages. Multiply the voltage across the resistor by the current through it to get the watts. If you calculate 1.25 watts, use a resistor rated slightly higher than that, or it will overheat and burn out.

I am pretty sure you just need to eliminate the series diodes from the circuit you built. Also, if the laser diode gets hot it should be attached to a heatsink, or it will overheat and die. If the laser diode is expensive, I suggest you start out by experimenting with LEDs for a few days. When you get the hang of that, move up to powering the laser. LEDs are very cheap, and you can burn a few out without hurting your pocket much.
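
The resistor-only test described above can be predicted on paper first. A sketch, assuming the 15-ohm value that follows from 3 V at 200 mA:

```python
# Predicting the two measurements: resistor alone across the full 5 V,
# versus the real circuit where the laser diode takes ~2 V of the 5 V.
# The 15-ohm value is a hypothetical example consistent with a 200 mA target.

R = 15.0  # ohms

i_resistor_only = 5.0 / R   # test current, resistor alone: ~0.333 A
i_in_circuit = 3.0 / R      # with the diode in series: 0.200 A

print(f"resistor-only test: {i_resistor_only * 1000:.0f} mA")
print(f"in circuit:         {i_in_circuit * 1000:.0f} mA")
```

The two readings differ because the diode clamps ~2 V for itself, so the resistor sees less voltage once the diode is in circuit.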

When calculating the required resistor, which voltage value do I use? Is it the 3 V across the resistor, the 2 V across the diode, or the full 5 V from the supply?

Use the voltage across the resistor. Multiply it by the current through the resistor to get the wattage the resistor will dissipate, then pick a resistor of higher wattage than that so it will not burn out.

I'm not trying to calculate the wattage; I need to know which voltage value to use when calculating the current with Ohm's law.

No, you HAVE to consider the wattage too, or you may have a burned out resistor and a dead laser.

I also understand a cap is needed to protect the laser from power spikes? How do I know what value cap to use?

No, you can't change one without the other; a resistor affects both. That's the point of Ohm's law.

What you need is a constant-current source capable of delivering 240 mA; forget the voltage, control the current. Put a 22-ohm resistor from your +5 V regulator output to the "GND" pin of the regulator. DO NOT CONNECT THE GND PIN TO 0 V. Connect it to the anode of your diode, and connect the cathode to 0 V.

Alternatively, get an LM317 and do this. Forget the pot, and replace the R with a 5.6-ohm resistor (0.5 W).
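
As a rough check of the two constant-current circuits above (the ~5 mA 7805 quiescent current and the 1.25 V LM317 reference are typical datasheet figures, not measured):

```python
# Approximate set currents for the two constant-current circuits.
#   7805 floating-ground trick: I ~= Vout / R_set + Iq  (Iq ~5 mA, typical)
#   LM317 current source:       I  = Vref / R_set       (Vref ~1.25 V nominal)
# Resistor values from this thread: 22 ohm (7805), 5.6 ohm (LM317).

i_7805 = 5.0 / 22.0 + 0.005   # ~0.232 A with the 22-ohm resistor
i_lm317 = 1.25 / 5.6          # ~0.223 A with the 5.6-ohm resistor

p_lm317_resistor = 1.25 * i_lm317  # ~0.28 W, hence the 0.5 W resistor rating

print(f"7805:  ~{i_7805 * 1000:.0f} mA")
print(f"LM317: ~{i_lm317 * 1000:.0f} mA, resistor dissipates {p_lm317_resistor:.2f} W")
```

Both land near the 240 mA target; the LM317 version wastes far less power in its set resistor because its reference is only 1.25 V instead of 5 V.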

So what you're saying is to take the 5 V regulator, attach the input to the + supply, the output through a 22-ohm resistor to the GND pin on the regulator, and the GND pin on the regulator to the + of the diode? When I do this (I didn't plug the laser in yet, to be safe), the output voltage still reads 15 volts. Shouldn't it be 5 V?

You didn't put a load on the output, did you? What do you THINK will happen? This is a constant CURRENT regulator: it is trying to drive 240 mA from its output, and to do that it has to apply "enough" voltage to make it happen. With nothing connected, you have an infinite resistance on the output, so to drive a finite current the output voltage would have to be infinite. It can't be; it's limited to the input voltage.
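
That open-circuit behaviour can be sketched as a simple clamp: the regulator raises its output until either the set current flows or it runs out of input voltage. The 240 mA and 15 V figures are the ones from this thread:

```python
# An ideal constant-current source must supply V = I * R_load. With no
# load, R_load is effectively infinite, so the output rails at the
# input voltage instead.

I_SET = 0.24   # amps, the target current
V_IN = 15.0    # volts, the supply feeding the regulator

def output_voltage(r_load_ohms: float) -> float:
    """Ideal constant-current source clamped at the input rail."""
    return min(I_SET * r_load_ohms, V_IN)

print(output_voltage(10.0))   # modest load: ~2.4 V is enough
print(output_voltage(1e6))    # open circuit (huge R): stuck at 15 V
```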

The trouble with the 7805 method, too, is that the power dissipated in the resistor is very large. Better to use a 317.

PS: try shorting the GND pin THROUGH your multimeter, on the 10 A range.

How do you measure mA current with a volt meter?

If you are using the mA setting on a multimeter, you must understand that those mA ranges have substantial series resistance in addition to whatever you are measuring, so the reading will be less than what you expect!

You don't say what your 15 V supply is, or the laser diode's voltage and current requirements.

Sorry, I just called it a volt meter; it is actually a multimeter. I said the power supply is 15 V at 380 mA; what more do you need to know about it?

Is the voltage regulator a 7800 series?

They can be wired as a driver.

What you want to do is a circuit something like this.

It will need to be tweaked to your laser.

Since I don't know exactly what you have, I can't be more precise, but this circuit will drive up to two amps at 5 volts, and if you tweak it right it will drive milliamps.