# Reducing voltage in a circuit for the layman.

Let's say I am building something (this setup is completely imaginary, so I will use strange numbers): I have a battery box holding 4 AA batteries, each rated at 1.5 V with about 2.7 Ah of capacity (according to Wikipedia).

Now let's say I want to hook that up to an LED with a maximum forward rating of 4 volts at 20 mA. I know that I need a resistor, and I know that V = IR. What I can't wrap my head around is this: how do I rearrange that equation so it tells me how many ohms of resistance I need? Because according to that equation, if I increase the resistance, I increase the voltage, since we're talking about the same source. Do I need to use two V = IR equations, solve one, and then plug that value into the other?

I just can't get it... maybe I have had too many beers since graduating, or it's been too long since I used algebra? I need a layman's way of understanding this... (exasperated sigh).

20 mA is 20 divided by 1,000 = 0.02 A...

3 LEDs: 2.4 V × 3 = 7.2 V

Take the 12 V car supply and subtract the 7.2 V total across the LEDs:

12 − 7.2 = 4.8 V

4.8 V / 0.02 A = 240 Ω

So I need a 240 ohm resistor... but do I need a certain type, or a certain wattage?
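The arithmetic above can be checked with a short Python sketch (variable names are mine, not from the thread). It also computes the power the resistor dissipates, which answers the wattage question: at these numbers it is well under the common 1/4 W rating.

```python
# Series resistor for a string of LEDs: R = (V_supply - n * V_led) / I
supply_v = 12.0        # car supply
led_v = 2.4            # forward voltage per LED
n_leds = 3
current_a = 20 / 1000  # 20 mA expressed in amps

drop_v = supply_v - n_leds * led_v  # voltage the resistor must drop (4.8 V)
r_ohms = drop_v / current_a         # about 240 ohms
power_w = drop_v * current_a        # about 0.096 W, so a 1/4 W part has headroom

print(r_ohms, power_w)
```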

Take your battery voltage (9 volts) and subtract your LED voltage (3 volts), which leaves you with 6 volts. Divide the remaining voltage by the LED current (0.02 A). So you would have 9 − 3 = 6, then 6 / 0.02 = 300. You would need a 300 ohm resistor.
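That two-step recipe (subtract, then divide) fits in a tiny helper function; this is just a sketch, with the 9 V and 3 V figures taken from the answer above:

```python
def led_resistor(supply_v, led_v, current_a):
    """Ohms needed to drop the leftover voltage at the target current."""
    return (supply_v - led_v) / current_a

print(led_resistor(9, 3, 0.02))  # about 300 ohms
```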

I=V/R

If you've got 4 × 1.5 V in series, you've got 6 V, and you want 20 mA.

Take out the 4 V forward drop across the LED, and you'll have 2 V across the resistor.

V = 2 V, I = 0.02 A

R = 2 / 0.02 = 100 Ω
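Written out for the original 4×AA setup, the key insight is that the V in V = IR is the voltage across the resistor, not the whole supply. A minimal sketch (my own variable names):

```python
# Step 1: find the voltage the resistor has to drop.
supply_v = 4 * 1.5       # four AA cells in series = 6 V
led_forward_v = 4.0      # LED forward voltage from the question
current_a = 0.02         # 20 mA target

resistor_v = supply_v - led_forward_v  # 2 V left over for the resistor

# Step 2: rearrange V = I*R into R = V/I for the resistor alone.
r_ohms = resistor_v / current_a        # about 100 ohms

print(r_ohms)
```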
