To accurately measure the current flowing through a circuit using only a voltage measurement, we place a very low ohm resistance in series with the device under test (the thing you're powering).

We measure the voltage across the very low ohm resistance and then use Ohm's law,

V = I x R,

and a bit of algebra to solve for the current

I = V / R,

where R is the value of the very low ohm resistance and V is the voltage measured across that very low ohm resistance.

By very low ohm resistance, I mean on the order of 0.1 ohm to perhaps as high as 1 ohm, depending on the resistance of the device being powered. That is, the sensing resistance should be less than 1/100th of the resistance of the thing being powered, i.e., the device under test, so that its effect on the current is minimized. The reason we use this sensing resistor is that oftentimes we cannot measure the resistance of the device under test, so we still need a method for finding the current without knowing the resistance of the load.
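The shunt-resistor calculation above can be sketched in a few lines of Python. The resistance and voltage values here are purely illustrative; substitute your own measurements.

```python
# Estimating load current from the voltage drop across a sense (shunt) resistor.
# Both values below are hypothetical examples, not measurements from the thread.

R_SHUNT = 0.1      # shunt resistance in ohms (kept very low: ~0.1 to 1 ohm)
v_shunt = 0.025    # voltage measured across the shunt, in volts

current = v_shunt / R_SHUNT   # Ohm's law rearranged: I = V / R
print(f"Load current: {current:.3f} A")   # -> Load current: 0.250 A
```

Because the shunt is so small relative to the load, the voltage it "steals" from the device under test is negligible, which is exactly why the 1/100th rule of thumb above is used.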

best wishes

Hi,
Most simply, if your meter doesn't measure current but does measure volts and resistance, then your battery current is equal to the circuit's volts divided by its resistance.
Bob-o

#1-- is your multimeter also an ammeter? It must be able to measure amperage.

#2-- current use depends on the load. You can hook the same 12V battery up to a 12V light, or a 12V motor, and they won't draw the same amount of current. The device you're powering is the load.

Now, set up your multimeter/ammeter correctly. Many multimeters have separate jacks for different measurements. Some have a high-current jack (like 10 amps) and a low-current jack (1 amp). If there's any doubt about the amount of current, use the higher current setting first.

Place the ammeter in series with the load. The ammeter itself acts like a short circuit (i.e., a direct connection), so there must be a load, or you may fry the ammeter or ruin the battery. Also, see #2 above: if there's no load, there's nothing meaningful to measure.

And make sure the polarity of the ammeter is correct.

If you've done everything correctly, you should be able to get a reading.

Do you mean the maximum current the battery is able to supply? If so here is one way to do it:

1. Hook up the battery to a resistor. The size doesn't matter too much; maybe use 1k ohm or something.
2. Use your multimeter to measure the current through the circuit.
3. Use V=I*R to solve for the real resistance of the circuit.

From this you can find the internal resistance of the battery as R(real) - R(the one you used). Then use V=I(max)*R(battery) to solve for the maximum current. Let me know if you have any questions about this or if something doesn't work.
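The procedure above can be worked through numerically. A minimal Python sketch, using made-up readings (a 9 V battery, a 1k ohm resistor, and an assumed measured current), not values from the thread:

```python
# Estimating a battery's internal resistance and maximum current,
# following the three steps above. All numbers are illustrative.

V_BATTERY = 9.0       # battery's nominal (open-circuit) voltage, volts
R_LOAD = 1000.0       # known resistor hooked across the battery, ohms
i_measured = 0.0089   # current measured through the circuit, amps (assumed)

r_total = V_BATTERY / i_measured   # step 3: real circuit resistance, V = I*R
r_internal = r_total - R_LOAD      # R(real) - R(the one you used)
i_max = V_BATTERY / r_internal     # V = I(max)*R(battery), solved for I(max)

print(f"Internal resistance: {r_internal:.1f} ohm")
print(f"Maximum current:     {i_max:.2f} A")
```

With these example numbers the internal resistance comes out to roughly 11 ohms, giving a maximum (short-circuit) current of under an amp. Real batteries sag under load, so treat the result as a rough estimate.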
