# What's up with Amps, voltage and current?

So I'm having trouble figuring out where amps come in. Typical batteries (AAA, AA, 9V, 3V) don't state their output amps, just the output voltage. But things like RC batteries and wall transformers have a specific amperage output.

So take my LED and 9V battery. I know I need to use a 150 ohm resistor. Since resistors limit current, is there still 9V going to my LED, and it's just the amps/current that's being lowered before going into the LED?

If resistors limit current (measured in amps), then is my LED still getting the supplied voltage, just at a lower amperage?

How do I know how many amps my projects are drawing? How will I know if my DC wall transformer rated 12V DC and 1 A isn't going to fry my circuit? How would I know if 1 amp is too much or not enough for my circuit?

Please help, none of this makes sense,

~electricloser

## Discussions

Well, you mentioned current and voltage so often and totally left out the missing link: resistance.

Key to all that is Ohm's law that links voltage, current and resistance:

U = R * I, or I = U / R, or R = U / I

U = voltage in volts

I = current in amperes

R = resistance in ohms

Your batteries and wall warts will supply a certain voltage: 1.5V or 9V for batteries, 12V etc. for power supplies. As your other question shows, those are nominal values; the real voltage can be higher (fresh battery, unloaded PSU) or lower (almost empty battery, badly designed PSU). For simplicity, let's go with the nominal value, just keep in mind that it is somewhat theoretical.

So, what does that mean for your 9V / 150Ohm example?

I = U / R = 9 V / 150 Ohm = 0.06 A = 60 mA
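The arithmetic above as a quick Python sketch (the variable names are mine, just for illustration):

```python
# Ohm's law: I = U / R
U = 9.0    # supply voltage in volts (nominal 9 V battery)
R = 150.0  # resistance in ohms

I = U / R  # current in amperes
print(I)         # ~0.06 A
print(I * 1000)  # ~60 mA
```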

So, you see, it does not matter if the supply can deliver more current than the load needs. The load will only take what it needs.

On the other hand, will you get unlimited current if you just short the terminals? Like in

12 V / 0 Ohm = infinite A?

Well, no. The current you will get depends on the supply. Any supply has an internal impedance (impedance is the fancy word for resistance once you get to AC). So you have an external resistor that you can get close to 0 Ohm, and an internal resistor that depends on the supply itself. Resistances in series simply add up.
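A small sketch of that idea: the internal resistance value below is an assumed example, not a datasheet figure for any real supply.

```python
# Short-circuit current is limited by the supply's internal resistance.
U = 9.0            # volts
R_internal = 2.0   # ohms, internal resistance of the supply (assumed value)
R_external = 0.0   # ohms, a dead short across the terminals

# Series resistances add, so the total never actually reaches zero:
I_short = U / (R_internal + R_external)
print(I_short)  # ~4.5 A: large, but finite, not infinite
```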

So, the short-circuit current depends on the type of power supply. 9V block batteries have tiny cells with high internal resistance; they can only supply a limited current. The bigger the battery, the higher the available current (AAA < AA < A < B < C < D << car battery ;-) ). For wall warts etc., it depends on the design.

If you want to learn more, search for Ohm's law, and try to understand the voltage divider.

OK, well, what about this: where does the LED come in?

You mentioned "So, what does that mean for your 9V / 150Ohm example? I=U/R 9 V / 150 Ohm = 0.06 A = 60 mA"

Ok, well, where would the LED voltage drop come in? I'm assuming that since a standard LED drops about 3V, I would redo my equation by subtracting the LED's voltage drop from the 9V, then put that in my equation. Is that right, or is it horribly wrong?

I = U / R = 6 V / 150 Ohm = 0.04 A = 40 mA. This seems logical. Please help me understand where this LED comes into play. Also, what if I had a whole circuit put together? How would I measure the whole circuit's resistance?
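The asker's approach, sketched out: subtract the LED's forward voltage first, then apply Ohm's law to the voltage left across the resistor. The 3 V drop is the asker's assumed value.

```python
# Subtract the LED's forward voltage, then apply Ohm's law to the rest.
U_supply = 9.0   # volts
U_led = 3.0      # volts, assumed forward voltage of the LED
R = 150.0        # ohms

I = (U_supply - U_led) / R  # amperes through resistor and LED
print(I * 1000)  # ~40 mA
```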

Thanks!

Oh, one more thing: since the circuit above is only pulling 40 mA, is the 1 A output going to hurt it? From what you said, I understand that pulling too much is not good. What about too little?

Well, the LED is a bit more tricky, as it is a semiconductor and has no fixed resistance. An LED has something called a forward voltage that depends on the current flowing through it.

So first, you decide how much current you want to flow through the LED. For a standard LED, 20 mA is a good starting point; for a low-current one, you can try 4 mA. For high-power LEDs you'll need more, of course. Check the data sheet.

It's the data sheet, too, where you can find diagrams showing the relation between current and forward voltage. If you don't have a data sheet, you can use some rules of thumb: red ~1.6V; yellow, green ~2.4V; blue, white ~3.4V.

So, what do you do with this forward voltage? You subtract it from the supply voltage, then size the resistor to take care of the rest of the voltage.

Let's try an example: 9V battery, green LED, 20 mA, 2.4V Uf.

The circuit is just the battery, the LED and a resistor.

Voltage at the resistor: 9 V - 2.4 V = 6.6 V

The current is the same everywhere in a series circuit, so we want 20 mA flowing through the resistor. How big does it have to be?

R = U/I = 6.6V / 0.02 A = 330 Ohm

How much power gets wasted at the resistor?

P = U * I = 6.6 V * 0.02 A = 0.132 W, so a 1/4 W resistor should do.
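The whole worked example in one short Python sketch, using the numbers above:

```python
# Worked example: 9 V battery, green LED (Uf ~ 2.4 V), 20 mA target
U_supply = 9.0
U_f = 2.4      # forward voltage, rule-of-thumb value for a green LED
I = 0.020      # desired LED current in amperes

U_r = U_supply - U_f   # voltage the resistor must drop (6.6 V)
R = U_r / I            # required resistance, ~330 ohms
P = U_r * I            # power dissipated in the resistor, ~0.132 W

print(R)  # ~330 Ohm
print(P)  # ~0.132 W -> a 1/4 W resistor is fine
```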

See LED_circuit for an explanation by native English speakers.

As I already wrote, a supply's capability to source a lot of current poses no problem (as long as you don't make a short). The mains wiring in your house can supply several amperes of current, but will power even the tiniest night light (consuming a few microamperes) without a problem.

Current can be measured just like voltage. Most "voltmeters" double as "ammeters".

What you ACTUALLY read on a wall transformer is the MAXIMUM output current. You must NOT draw more than that limit, but the supply is good for anything UP TO that maximum.

The sum of all the voltage drops around a circuit loop is ALWAYS zero (Kirchhoff's voltage law).

So, starting at the battery, you have Vbat = Vled + Iled * R

If you rearrange that, Iled = (Vbat - Vled) / R, so the current depends on the LED voltage.

BUT an LED doesn't obey Ohm's law, and to a reasonable approximation Vled is constant, at least while the thing is actually giving off light.
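That rearranged formula as a tiny helper function (my own naming, treating Vled as roughly constant):

```python
def led_current(v_bat, v_led, r):
    """Approximate LED current, assuming a fixed forward voltage.

    Iled = (Vbat - Vled) / R
    """
    return (v_bat - v_led) / r

# Same numbers as the worked example: 9 V battery, 2.4 V green LED, 330 Ohm
print(led_current(9.0, 2.4, 330.0) * 1000)  # ~20 mA
```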