7427 Views, 35 Replies

# Battery power

I am attempting to calculate my amp needs for my garage in order to set up a 12 V battery system. Using the formula Watts / Volts = Amps drawn, I have hit a question. If I'm using a 12 V battery system and converting to 110 V, should I be using 12 V or 110 V in the formula?

Example: 100 watt light bulb / 110 volts = 0.91 amps. Nice!

But if I use the 12 V: 100 watt light / 12 volts = 8.3 amps.

I believe I would be drawing 8.3 amps from the battery to power this light.

Plus the converter draw...

Please tell me which is correct.

## Discussions

Correct - you'd be drawing 8.3 amps from the battery, plus the converter's own draw.

Thank you! This is helping me! Let me ask another: let's put a 15 amp outlet circuit in my garage. How do I calculate the draw in amps from the battery? For calculation, figure 15 amps for 1 hr. The 110 V light took 8.3 amps from the battery... Will the circuit take 120 amps? (8 times the draw)

Look at it this way: the watts into the converter are the same as the watts out.

for example:

1 amp at 12 volts in would result in ~0.1 amps out at 110 volts.

for 15 amps at 110v:

15 times 110 = 1650 watts

1650 watts / 12 volts (battery) = 137.5 amps

If you were to draw 15 amps at 110 V AC, you would be drawing 137.5 amps from the battery.
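This "watts in = watts out" rule can be sketched in a few lines of Python (my sketch, not from the thread; the `inverter_efficiency` knob is a hypothetical stand-in for converter losses):

```python
# Sketch of the "watts in = watts out" rule above.
# Assumes an ideal converter; `inverter_efficiency` is a hypothetical
# knob for real-world losses (a real unit also has idle draw).

def battery_amps(load_watts, battery_volts=12.0, inverter_efficiency=1.0):
    """Current drawn from the battery to supply a given load in watts."""
    return load_watts / (battery_volts * inverter_efficiency)

print(battery_amps(100))       # 100 W bulb -> ~8.33 A from a 12 V battery
print(battery_amps(15 * 110))  # 15 A at 110 V = 1650 W -> 137.5 A
```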

Once again... thank you! This makes sense: figure the watts, then convert watts to amps at the battery voltage. I'm coming up with almost 1000 amps needed for my needs... 3000 amps needed for my wants... Guess I have to work on those wants...

May I push this another step? I'm looking at batteries and seeing that a standard 12 V 'Marine' deep cycle battery has 80 Ah (as I figure it, 1900+ amps). I've seen the specialty batteries for storage @ 2250, but 6 V. Both are equal in price. Needing 2 of the 6 V to make 12 V, even with the specialty lasting twice as long, it seems to be a wash... Am I missing something here? I'm assuming amp use and amp replacement are equal across any voltage (i.e. 6 V, 12 V, or 48 V).

Does it matter if the voltage of the generator is lower than the battery rating? I do assume if it's higher there are consequences... 6 V generator to a 12 V battery... With that, is an amp truly an amp? If the generator is 6 V and produces an amp, would it be 1 amp to a 12 V battery or 0.5 amps? I've come a long way today!

First off: there's no way that you're going to be able to drain 1000 amps from almost any battery, and definitely not 3000 amps. Even if you could draw 1000 amps from a battery, it'd last minutes at most and be really, really hot. Let's say you have a 100 AH battery: drawing 1000 amps would make it last 6 minutes.

The 1000 amps is a week's usage... I probably should have said that. Lights on and off daily, vacuum on weekends, radio while hanging out there... I also plan on batteries, not just a battery. My battery of choice right now is 225 AH, and I hope to start with two. This should be 4500 amps (wired to increase voltage to 12 V) and only using 1000 in a week. Is this impractical? If it is, I'm really missing this...

You may want to try to keep your units consistent - you're really mixing apples and oranges at the moment.

Saying that "1000 Amps is a weeks usage" doesn't really make any sense. That'd be like saying that your gas tank holds 40 miles per gallon...

If you want to calculate how much *energy* you require for a week's usage, you would typically write it in kilowatt-hours, aka kWh - those are the units you see on your energy bill as well. (If you're a physicist, you might want to use Joules, where 1 Joule = 1 Watt second.)

For example, if you need to have one 100 Watt bulb burning constantly for a week, that would be 100 Watts x 24 hours/day x 7 days/week x 1 week = 16.8 kWh.
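That arithmetic, as a quick sketch:

```python
# One 100 W bulb burning constantly for a week, as above.
watts = 100
hours = 24 * 7                     # one week
energy_kwh = watts * hours / 1000  # Wh -> kWh
print(energy_kwh)                  # 16.8
```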

Now, if you were to use a 110V, 100W lightbulb, it would draw 0.91 A. If you were to use a 100W bulb intended for 12 V, it would draw 8.33 A.

But both bulbs would use the same amount of power. And if both bulbs had the same efficiency, they would both give off the same amount of light. It's just that 1 Amp carries a lot more power at 110 Volts than at 12 Volts.

If you're using AC, you could use a transformer to increase the voltage but decrease the current, or vice versa. You're essentially just "translating" one set of voltage and current to another, without too many losses. The same principle holds for AC to DC conversion or vice versa.

In a scientist's ideal world, all batteries would be labeled with how much energy they can store in Joules. In practice, batteries put out a relatively fixed voltage, so their capacity is listed in Ah or mAh, rather than Wh or kWh. However, you can't compare the Ah capacity of two batteries, unless they have the same voltage (again, one Amp at a higher voltage drop carries more power than one Amp at a lower voltage). Also, the Ah capacity of a battery has nothing to do with how *fast* you can discharge the battery - only with how much energy it can hold.
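One way to make that comparison concrete is to convert each Ah rating to watt-hours first. A minimal sketch, using the capacities mentioned in this thread (the 80 Ah 12 V marine battery and 225 Ah 6 V storage batteries):

```python
# Ah ratings are only comparable at the same voltage, so convert to
# watt-hours (energy) first. Capacities as discussed in the thread.

def capacity_wh(volts, amp_hours):
    return volts * amp_hours

print(capacity_wh(12, 80))   # marine battery: 960 Wh
print(capacity_wh(6, 225))   # one 6 V battery: 1350 Wh
print(capacity_wh(12, 225))  # two 6 V in series (12 V, 225 Ah): 2700 Wh
```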

Patrik, thank you for the response! Let me ask you something. I now have 2 6 V solar lights in the garage that are doing their particular job (stopping me from walking into things) very well. This is what's leading me to turning the garage 'green'. I thought of going to a 12 V system and upping the power, figuring I would be increasing what my 6 V lights are doing right now, and purchasing new 12 V lighting. I am not certain this is what I really should do for the long-term goal, which is to 'grid connect'. If I ever reach my goal, then it seems more practical to set up with efficient 110 V lighting.

So with this in mind, and trying to determine what I might use in a week of amperage out in the garage, I came up with 1000 amps. Nothing would be used constantly, just that much would be used in a week's time. Knowing this, I hoped I could determine what kind of battery storage I would need. I came up with the 2 6 V 225 AH batteries. Costs are within my budget and I thought they would do the job. At 225 AH, figuring I cannot use more than 50% of those available amps per hour, I need to adjust either my battery bank or my usage. Question 1: Am I still shooting too high? Is 50% too great a number?

This led me to the next step, in my mind: how many amps can I realistically replace each week to make up my usage and recharge the batteries? I am looking at both wind and solar and calculating gains vs. costs right now. I have to say I'm leaning towards wind... I'm gathering parts now to build a test turbine and see if I'm figuring everything right. With no guidelines (that I could find) to go by, I thought this would be the best way to start: figure a week's worth of usage, find a source to draw from that would work, find a way to replace the energy used. I now realize there's more to this, yet I think I'm on the right path. Any comments and, please, suggestions are not only welcome, but wanted.

Again, saying that you would use "1000 Amps in a week" does not make any sense. It's just as nonsensical as saying "I travel a total of 1000 mph in a week" - the units just don't match up. Amps tell you how much current is flowing right now, whereas what you want to estimate would also need to include the amount of time and the voltage. Do you mean Amps x hours, at 12 Volts, for example?

Could you post how you arrived at that one-week estimate - maybe we can unravel where you're going wrong...

Okay, Now I'm ready!

12v battery bank, converted to 110v current.

Weekdays;

Lights: 2- 20 watt motion detector spots running 1 hr constant, twice a day

3.33 amps morning & evening = 7 amps daily x 7 days = 49 amps

(I currently have 2- 6 V solar powered spots in there now and they work great! They may stay...)

Weekend

1- 15 amp appliance figured at a 45 minute actual use cycle = 103 amps

Say I'm ambitious and do this for 4 hrs = 412 amps

Do this Sat & Sunday: 412 x 2 = 824 amps

Go 1 step further and say it's a gray day and I need lighting:

4- 20 amp lights = 6.7 amps per hour + my appliance 103 amps = 110 amps per hour

I take 50 amps (lights) + 825 amps (appliance) + 60 amps (switched lights)

I get 935 amps used in 1 week

I tried to add 10% converter loss and rounded all numbers up for math's sake. I also rounded to 1000 amps to make up for unknown usage or losses.

Trying to stay under 50% of the battery capacity per hour, which I assume is correct usage... I see my highest usage is 110 amps in any 1 hour.

Using a 225AH battery @ 12volts, I figure I just make it.

Now that I know (think I do) what my usage would be, then I should be able to determine how much amperage I must generate in a week to make up for my usage.

Does that clear it up?

How close am I?

I do realize that another 12v 225AH battery would be better, but will this work with one?

For the lights, it looks like you're actually calculating 49 Amp-hours @ 12 V.

As I mentioned before, you're probably better off using standard units like kWh. In your case, 2 x 20 W x 2 h/day x 7 days = 560 Wh.

I'm having a hard time translating this part though, which takes up the majority of your energy usage:

"1- 15 amp appliance figured at 45 minute actual use cycle = 103 amps" - I assume that's 15 Amps at 110 Volts, or 1650 Watts? That would take 137.5 A at 12 V - so 103 Ah at 12 V (or 1.24 kWh) for 45 minutes of use; 412 Ah at 12 V for 4 sessions of 45 minutes; 824 Ah at 12 V for two days of 4 x 45 minutes of use (9.9 kWh).

Then for the switched lights - I assume that's a typo, and you mean 4 x 20 Watt lights? Also, there is no such thing as "Amps per hour" - that makes as much sense as saying that a jet flies at 2 Mach per hour. 80 W of light at 12 V = 6.67 A. Times two days of 4 hours use = 53 Ah at 12 V (640 Wh).

Here's how I would calculate this:

2 x 20 W lights x 2h/day x 7 days = 560 Wh

1650 W appliance x 0.75 h/session x 4 sessions x 2 days = 9900 Wh

4 x 20 W lights x 4 h x 2 days = 640 Wh

total energy usage per week: 11,100 Wh, or about 11 kWh

That means you would need at least a total of 1000 Ah in 12V batteries, or 2000 Ah in 6V batteries.
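The tally above, as a sketch (same figures; note the raw number is 925 Ah, so the 1000 Ah figure includes some margin):

```python
# Weekly energy budget from the figures above, in watt-hours.
lights_daily   = 2 * 20 * 2 * 7       # 2 x 20 W, 2 h/day, 7 days -> 560
appliance      = 1650 * 0.75 * 4 * 2  # 1650 W, 4 x 45 min, 2 days -> 9900
lights_weekend = 4 * 20 * 4 * 2       # 4 x 20 W, 4 h, 2 days -> 640

total_wh = lights_daily + appliance + lights_weekend
print(int(total_wh))   # 11100 Wh, about 11 kWh
print(total_wh / 12)   # 925.0 Ah at a nominal 12 V (before any margin)
```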

Also, don't make assumptions about how fast you can discharge a battery! The maximum discharge rate varies with the type of battery. For portable batteries, this is often expressed as a C-rate. E.g. a 1000 mAh battery would provide 1000 mA for one hour if discharged at a 1C rate. The same battery discharged at 0.5C would provide 500 mA for two hours. At 2C, the 1000 mAh battery would deliver 2000 mA for 30 minutes. Sealed Lead Acid (SLA) batteries often have a much lower maximum discharge rate - 0.2C or less. For automobile and marine batteries, the max discharge rate is often given in Amps instead.

I think I'm getting close here... at least closer. At the very least I'd better be, for I'm probably frustrating the daylights out of you. Thank you for hanging in there! I thought a 225 Ah battery, rated at 20 hrs, meant it had 4500 amps, or could draw 225 amps each hour for 20 hours. Since my highest load is 110 amps in 1 hour's time, that meant I could run the 225 AH battery (20 hr rate) 40 hrs before it was totally drained. What I believe I'm now seeing is that if I drew 110 amps for 2 hrs the battery would be dead? Thus the 1000 AH battery to fill the wants I listed.
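The runtime arithmetic in this exchange can be sketched as follows, assuming an ideal battery (real lead-acid packs deliver less at high currents - see Peukert's law - which is why the 20 hr rating matters):

```python
# Ideal-battery runtime: hours = capacity / current.
# Real lead-acid batteries deliver less at high discharge rates.

def run_time_hours(capacity_ah, current_a):
    return capacity_ah / current_a

print(run_time_hours(1.0, 1.0))  # 1 Ah pack at 1C: 1.0 hour
print(run_time_hours(1.0, 2.0))  # at 2C: 0.5 hours
print(run_time_hours(225, 110))  # 225 Ah at 110 A: ~2 hours, not 40
```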

I'm not sure what you mean by "225 Ah rated at 20 hours", but I'll explain AH better.

Lets say you have a 100Ah battery.

This means you could drain 50 Amps for 2 hours (2 x 50 = 100)

This means you could drain 25 Amps for 4 hours (4 x 25 = 100)

You could drain 1 Amp for 100 hours (1 x 100 = 100)

See what I'm getting at?

Now I'm past what a battery will do (thanks to you all), I've reached the point of determining the generator capability to replace what I use.

For wind experimentation I purchased a Johnson 220VDC / 5.5 amp motor. I assume this means 5.5amps @ 220 VDC.

This being the case I need to now figure the amps @ 12VDC.

Now figuring Volts x Amps = Wattage

220 x 5.5 = 1210

I figure this is a 1210 watt generator.

I assume dividing that by 12VDC is wrong...

100 amps... na, can't be right...

So I thought 220 divided by 12 = 18.34.

Now I either use that to divide into the wattage or the amperage.

Wattage: 66 watts, divided by 12 = 5.5. Really cool!! But probably still not right...

Divided into the amperage = 0.3 amps... This sounds real...

What is the correct math???

No, you had it right the first time. 5.5 amps @ 220 VDC would be 1210 Watts. If you could transform that to 12 V without significant losses, that would indeed be 100 A @ 12 V.

Now, mind you, a 1.2kW motor is quite a whopper! Given your energy requirements of 11kWh per week, you only need an average of 11kWh/7/24 = 66W of power on average. Wind power is very inconstant, so you'll need to multiply that by a significant amount, because your wind turbine won't be generating at full power all the time. There's probably resources where you can look up how variable the wind is in your area, and what multiplication factor to use. I haven't built wind turbines myself, but 18X - which is what you have here - seems a bit much.
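The averaging argument, written out (using the ~11 kWh/week figure from earlier in the thread):

```python
# Average power needed to supply ~11 kWh per week.
weekly_kwh = 11
avg_watts = weekly_kwh * 1000 / (7 * 24)
print(round(avg_watts))         # ~65 W (the post rounds to 66 W)

# Ratio of the 1210 W motor rating to that average:
print(round(1210 / avg_watts))  # ~18x
```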

The other issue is the size of the wind turbine. Should be fairly straightforward to look up the average size for a 1.2kW peak power wind turbine.

Chances are you can't afford to have such a huge turbine in your back yard. You may want to go with a mix of wind and solar, which would also help average out the fluctuations in both.

Even more important would be to see if you can reduce your assumptions on energy requirements! Replace those 20W (CFL?) bulbs with some 5W LEDs. Or see if you can buy a more economical version of that 1.6kW appliance you need to run 6 hours a week - you'll probably save a bundle on a smaller wind turbine / solar panel / battery rack...

Thank you Patrik! Now for the curve! I've built this and have it up and spinning. This is the real dilemma. I agree that if I was turning the rpms (approx. 7,000) to obtain the full 220 VDC the math is correct (thank you for confirming that), but if I'm only turning the rpms (approx. 400) to obtain 12 VDC (which I am), then I figure I'm only reaching a fraction of the amps... This brings me to what I believe is only 0.3 amps... I am shooting for a final product, as you say, of about a 70 watt generator (happy to see how close I was). I don't see this little project turbine doing it, but I have proven to myself I can do this! Given this information, is 0.3 amps right?

Well... it's not quite that simple.

Power is not just dependent on speed, but also on torque, i.e. how much force it takes to rotate the generator. So, if we ignore any conversion losses:

Power = Torque x Speed = Voltage x Current

Voltage tends to be somewhat linearly related to rpm, although at a higher current you may get some losses due to internal resistance of the generator as well. And depending on how much current you're drawing from your generator, you'll need to apply more force to turn the generator at a specific rpm.

How much speed versus torque you get from your wind generator depends on the turbine design: number of blades, angle of attack, etc. You can also trade these factors off against each other by using a gear box between the turbine and the generator.

Either way, you should not assume that you'll get a nice 12V out of your generator. Wind speed, and thus the output voltage and amount of power generated, will vary drastically over the course of a week.
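The power balance described above, ignoring losses, is P = torque x angular speed = volts x amps. A small sketch with purely hypothetical numbers for the operating point:

```python
import math

# Ideal generator: mechanical power in = electrical power out.

def mech_power(torque_nm, rpm):
    # torque (N*m) x angular speed (rad/s) -> watts
    return torque_nm * rpm * 2 * math.pi / 60

# Hypothetical operating point: 0.8 N*m of torque at 400 rpm
p_in = mech_power(0.8, 400)
print(round(p_in, 1))       # ~33.5 W
print(round(p_in / 12, 2))  # at 12 V, at most ~2.79 A (lossless)
```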

Right now I figure to try a 5 to 1 gearing and see how it spins. Since I have not placed a draw on it as of yet, I assume this is where the added torque comes into effect. I realize that there are losses involved with length of wire, generator & charge controller; though I haven't a clue how much it will be, I assume this will add torque. Right now I'm attempting to see if I'm even close in my calculations without the losses involved. I know, mounted 6' off the ground in a fairly sheltered area, I can produce 6-10 VDC in a 3-8 mph wind. I considered that good for what I have right now. I don't know the relative amperage... That's where I came up with 0.3 at 12 VDC. With that amperage / voltage being so small, I'm having difficulty coming up with a draw that will be less than or equal to what's on hand. So, my thought is to do my gearing under my current theory, increase what I believe to be the wattage, and hook a draw of equal or less to it and either be surprised or put out the fire...

Here's my theory. Yeah, it's going to drive you nuts, but you've been real good to me and I feel the need to share it. I can produce 6-10 VDC in my winds, in a difficult spot (understand, this is not the location of its final destination). I assume this is only 0.2 amps, using an average speed and producing 8 VDC. 5 to 1 gearing would potentially bring this to 35 VDC and 1 amp. This equals a 35 watt generator. Converted to 12 VDC, that could equal 2.9 amps. 2.9 amps @ 12 VDC is good to me!

As I check my wind speeds over the cycle of months, I've determined a 6 hr per 4 day week as a minimum average, with a 10 hr per 5 day week as a maximum. I am lucky enough to have a weather station that tracks this within 3 miles of my home, and I'm in a better location! That could give me 835 watts to 1740 watts in a week's period. Considering my high estimate goal is 2000 watts, I'm getting pretty close.

I also realize that I will have weeks where there are no wind benefits, but if this little generator can do what I think it will, I see 3 of these up and spinning way before I invest in solar or a 'bought' wind turbine. I feel I'm close, and I'm not hearing you say I'm way off... just telling me I'm not calculating all the losses involved. I understand and feel pretty good. With only $27 invested & the next step costing me $0, I don't see a reason not to continue (I have plenty of hose & water...). If the gearing theory works... and I do produce the 35 VDC... I do have something I can hook up to put a draw slightly less than the production. Really hope it doesn't stop it dead in its tracks...

I think you're getting kinda close. To be quite honest, I didn't read your whole post, otherwise I'd get confused.

Here some figures for ya:

Examples:

Let's say you want to light a 60 watt lightbulb for 5 hours. 60 watts / 110 volts = ~0.55 amps (I rounded for simplicity). This means the lightbulb takes 0.55 amps at 110 volts. Now, we are running this off of a 12 volt battery, so we take 110 volts / 12 volts ~ 9.2, and multiply the 0.55 amps by 9.2, which equals about 5 amps. So, we are now at the point that we can run a 60 watt lightbulb off a 12 volt battery (going through the converter) and it will suck about 5 amps from the battery. Now, if we leave it on for 5 hours, then we multiply 5 hours x 5 amps to get 25 amp hours. So, in conclusion, if you only wanted a 60 watt lightbulb on for 5 hours, then you need a 25 amp hour battery. Also, amp hour is abbreviated AH.

Another point of reference, just in case you are wondering: 1000 mAh = 1 AH.

another example:

You want to run a washing machine for an hour. It takes 5 amps at 110 volts. We multiply 5 amps by 9.2 again (for more exact calculations, use the ratio of 55:6, same as 110:12; the decimal is 9.1666...) and you get 46 amps. So the washing machine takes 46 amps from the 12 volt battery. Since we are running it for 1 hour, we multiply 46 by 1, which equals 46 AH. So running the washing machine for 1 hour drains 46 AH from the battery.
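Both worked examples follow the same pattern; a sketch, assuming a lossless converter and a 12 V battery:

```python
# AC load -> battery-side amp-hours, per the two examples above.
# Assumes a lossless converter and a 12 V battery.

def battery_amp_hours(load_amps_ac, hours, ac_volts=110, batt_volts=12):
    battery_amps = load_amps_ac * ac_volts / batt_volts
    return battery_amps * hours

print(round(battery_amp_hours(60 / 110, 5)))  # 60 W bulb, 5 h -> 25 Ah
print(round(battery_amp_hours(5, 1)))         # 5 A washer, 1 h -> 46 Ah
```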

Another note: make sure your converter can handle the watts - make sure not to exceed its rating at any one time.

Please read my response to Patrik... I'm hoping I'm getting closer to this. My confusion seems to be directly related to the battery's available amperage and this 20hr AH rating.

For lighting, I personally think LEDs are the best ever. They're very efficient and last 100,000+ hours (and they actually do last that). The problems are that they are expensive and don't really put out much light. I would look into them, even if you decide you don't want them.

Sorry, I forgot voltages. Most use about 3 volts DC; just hook 4 in series and call it 12 volts. About light quality: a standard "white" LED is actually a blue LED with yellow phosphor, resulting in bad light quality. If you can find an "RGB" LED, just hook all the terminals together in parallel and the light will look much better.

but red, green, and yellow each have different voltage operations.

Not always. You can find them all the same, even if shorter wavelengths are usually higher voltage.

Assuming you wire them in parallel, you get 225 x 2, or 450 AH. That's also 27,000 amp-minutes, or 1,620,000 amp-seconds. At 12 volts, that's 1.944 x 10^7 watt-seconds. Of course, if you were to discharge the batteries too fast, they would overheat and die. Also, how would you charge the batteries back up?

Wait, hang on. If you then converted the 12 volts to 110, the watt-second figure is still good, but about 10% will be lost in the inverter.
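Those unit conversions, step by step (450 Ah figure from the parallel-wiring post above):

```python
# The same capacity walked down the unit chain used above.
amp_hours = 450                  # two 225 Ah batteries in parallel
amp_minutes = amp_hours * 60     # 27,000
amp_seconds = amp_minutes * 60   # 1,620,000
watt_seconds = amp_seconds * 12  # 19,440,000 J at 12 V (1.944e7)

print(amp_minutes, amp_seconds, watt_seconds)
print(watt_seconds * 0.9)        # after ~10% inverter loss: ~1.75e7
```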

Okay! Now the curiosity is: how fast is too fast? If the battery is rated at 225 AH (20 hr rate), then I assume 50% of that is safe. Or can I look at the full 225 AH as safe? Next, I would be wiring in series to achieve the 12 volts, keeping me at the 225 AH. Correct? If that is right, then I would think the usage of 225 AH would be safe... Thank you for the inverter info! I've read some very confusing info on this. Is 10% a reasonable factor to use? Charging the batteries is the second side of all this. Currently I'm analyzing wind speeds in my area to see if a wind turbine is feasible. Solar is the other option. I believe I have a reasonable site for wind and a good site for solar. Other options are welcome!!

Assuming both batteries are 12 volts 225 AH: wiring them in series (positive to negative) would give you 24 volts 225 AH; wiring them in parallel (positive to positive) will give 12 volts 450 AH. I'm not sure what a safe discharge speed is, sorry. Depending on the inverter, 10% should be reasonable.


I would go for wind power if at all possible. It should be much cheaper per amount of power than solar if you get everything right.

Hey, why on earth do you want a battery powered garage? It seems like a really roundabout and possibly unreliable way to do things. Why not make an extension of a circuit, or get an electrician in to add a new circuit on the breakers? My box has loads of circuits divided up into rooms and still has four spare breakers...

I have solar powered lights out there right now. The goal is to keep the garage as far off the grid as possible. I succeed there, then why not move the idea into the house!!

Oh right, that's a cool idea. However, batteries won't really get you that far because they'll be dependent upon the grid for charging... If you have wind, then get a turbine; if you have a stream on your land, a water wheel...

Both are correct. They are both the same amount of power, 100 W. It draws 8.3 amps from the battery.

Thank you! Still a bit confused though... I'm probably thinking this out too much, but I really want to understand this. All power starts at the battery. Once converted to 110 V, I'll use 0.91 amps. From the battery it will take 8.3 amps to get me to the 110 V draw. All in all, have I used 9.21 amps to power this light? Right now I'm hoping for just the 8.3 amps...