Electrical – Inverter Power draw from 12V battery


Given a 12V, 100A battery with a 1000W inverter, how many amps would a generic standard European 220V, 500W appliance draw per hour from the battery itself?

I know wattage equals volts times amps, but in this system I'm wondering which of the voltages I should use in my calculations. Will it draw 2.27 A (500/220) or maybe 41 A (500/12) from the battery itself, or does it use some different, less obvious formula?

(I'm trying to work out how many off-grid hours I'd get with such a setup. Once I understand this calculation I can run the numbers for the real appliances I plan on using, and maybe switch to a 24V battery/inverter system for thinner wires, but those are details until I know how the math itself works.)

Best Answer

You've got a few problems/misconceptions. First, there's no such thing as a 100-amp battery; however, 100 amp-hours would be typical of a car-battery-sized thing. An amp-hour is an amp for an hour.

Second, you can't get 100 amp-hours out of a 100Ah lead-acid battery, at least not for very many cycles. That would be 100% depth of discharge, and that is very destructive to lead-acid batteries particularly. That's something you just have to know about lead-acid; if you use more than about 30% of nameplate capacity, you will be doing damage, shortening battery life and reducing capacity. That is a downside of lead-acid. The upside is the price.

If you want a battery that does exactly what it says on the tin, try nickel-iron.

And with a solar setup, you typically want a 3-day battery capacity to bridge you across cloudy/stormy days. It's OK if that dips somewhat into the "don't cycle every day" level, but that means an average day should only dip a lead-acid by 15-20%.
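
If you want to put rough numbers on that, here's a quick Python sketch. The 30% daily depth-of-discharge and the ~20% average daily dip are the rules of thumb above; the 480 Wh daily load is just a placeholder (it happens to match the fridge example further down):

```python
# Back-of-the-envelope lead-acid sizing, using the rules of thumb above.
NOMINAL_VOLTAGE = 12      # V
CAPACITY_AH = 100         # nameplate amp-hours
DAILY_DOD_LIMIT = 0.30    # don't cycle lead-acid much past ~30% per day

usable_wh_per_day = NOMINAL_VOLTAGE * CAPACITY_AH * DAILY_DOD_LIMIT
print(f"Usable per day: {usable_wh_per_day:.0f} Wh "
      f"({usable_wh_per_day / NOMINAL_VOLTAGE:.0f} Ah)")          # 360 Wh, 30 Ah

# Sizing so an average day only dips the bank ~20%, which leaves headroom
# for a 3-day cloudy stretch. The daily load here is a placeholder figure.
daily_load_wh = 480
bank_wh = daily_load_wh / 0.20
print(f"Bank for ~20% daily dip: {bank_wh:.0f} Wh "
      f"({bank_wh / NOMINAL_VOLTAGE:.0f} Ah at {NOMINAL_VOLTAGE} V)")  # 2400 Wh, 200 Ah
```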


Another thing you'll need to know is that inverters take a significant amount of power simply to be spun up. (Inverters don't literally spin, but they draw standby power as if they did).

Say you want to power a refrigerator. Very easy load: 2000W startup surge, 100 watts operating (20% of the time) and 0 watts idle. The startup surge is relevant to sizing the inverter; it's not a burden on the battery. So you get out your sharp pencil and say 100 watts x 0.20 = 20 watts average, x 24 hours = 480 watt-hours (40 amp-hours at 12V) per day. Easy peasy, right?

Hold on. You don't know when the fridge will want to cycle. Let's just armwave that an inverter takes 2% of its max load as vampire loss simply to be spun up. That 2000W inverter therefore takes 40 watts around the clock (better inverters take less). Oh, snap. That's another 960 watt-hours, 80 amp-hours per day, on top of the fridge. So we've just tripled the size of our pack! Whoops!
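
Here's that arithmetic as a short Python sketch; the 2% standby figure is the armwave above, not a spec for any particular inverter:

```python
# Fridge load plus inverter vampire loss, per the armwave numbers above.
INVERTER_MAX_W = 2000
VAMPIRE_FRACTION = 0.02        # assumed: ~2% of max load just to stay "spun up"
BATTERY_V = 12

fridge_avg_w = 100 * 0.20      # 100 W compressor running ~20% of the time
fridge_wh_day = fridge_avg_w * 24                 # 480 Wh/day (40 Ah at 12 V)

vampire_w = INVERTER_MAX_W * VAMPIRE_FRACTION     # 40 W, around the clock
vampire_wh_day = vampire_w * 24                   # 960 Wh/day (80 Ah at 12 V)

total_wh_day = fridge_wh_day + vampire_wh_day     # 1440 Wh/day
print(f"Fridge {fridge_wh_day:.0f} Wh + inverter idle {vampire_wh_day:.0f} Wh "
      f"= {total_wh_day:.0f} Wh/day ({total_wh_day / BATTERY_V:.0f} Ah at {BATTERY_V} V)")
```

That 120 Ah/day against the 40 Ah fridge-only estimate is the "tripling" above.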

Golly. Engineering is hard. So now you need to be a little more cunning. For instance, shift as many loads as possible to low-voltage DC (lights, easy; internet router, easy; TV, possible with many TVs; so at least you have Netflix). For the fridge, they do make 12V fridges, but you could also "hack" the fridge with an extra thermal sensor that spins up the inverter only when the fridge needs it.


Anyway, to get back to your question: wattage is the quantity that remains consistent across voltages (not counting conversion losses, i.e. that inverter).

Watts = Volts x Amps -- and Watts doesn't change. So if volts change, amps must too.

That means when you change volts, you must recalculate amps, or vice versa.

Your 500W appliance is a 500W appliance at any voltage. So at 220V it is 2.27A and at 12V it is 41A.

Toss in another 10% every time you change voltages. So at 12V, figure on 550W, or 46A.
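
Put as code, with that rough 10% per conversion added on the battery side:

```python
# Same 500 W appliance, amps on each side of the inverter.
APPLIANCE_W = 500
MAINS_V = 220
BATTERY_V = 12
CONVERSION_LOSS = 0.10     # rough rule of thumb: ~10% per voltage conversion

amps_at_mains = APPLIANCE_W / MAINS_V                  # ~2.27 A on the 220 V side
battery_w = APPLIANCE_W * (1 + CONVERSION_LOSS)        # ~550 W drawn from the battery
amps_from_battery = battery_w / BATTERY_V              # ~46 A on the 12 V side

print(f"{amps_at_mains:.2f} A at {MAINS_V} V, "
      f"{amps_from_battery:.0f} A from the {BATTERY_V} V battery")
```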


One more big nit to pick. You are running a "generic standard 500W appliance". I know that's an academic number, but academically...

You do not run "generic standard appliances".

You re-evaluate each load, considering carefully the wattage it draws vs. what the most efficient appliances draw. If you want to keep the inefficient appliance, how much will it cost to provision the extra power? Compare that to the cost of upgrading.

Say you have an old Westinghouse fridge that draws 300W. The very best efficient fridges draw 50W. That saves 250W, but the new fridge costs $600. Well, here's a question for you. How much will 250W of additional solar/battery capacity cost you? Is it more than $600? Yes. Yes it is. Goodbye Westinghouse, grats on your new fridge!
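
Here's that break-even check as a sketch; the dollars-per-watt figure for extra solar/battery capacity is made up here, so plug in a real quote:

```python
# Upgrade-vs-provision comparison for a hypothetical old fridge.
OLD_FRIDGE_W = 300
NEW_FRIDGE_W = 50
NEW_FRIDGE_COST = 600        # $ for the efficient replacement
CAPACITY_COST_PER_W = 5      # assumed $ per watt of extra solar/battery capacity

watts_saved = OLD_FRIDGE_W - NEW_FRIDGE_W                 # 250 W
cost_to_feed_old = watts_saved * CAPACITY_COST_PER_W      # what keeping it really costs
if cost_to_feed_old > NEW_FRIDGE_COST:
    print(f"Upgrade: ${cost_to_feed_old} of extra capacity vs a ${NEW_FRIDGE_COST} fridge")
else:
    print("Keep the old fridge; extra capacity is cheaper than replacing it")
```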

Likewise:

  • A 1-connection washing machine that heats its own water? Nope, you use a 2-connection washing machine and a solar-thermal system, and wash your whites on sunny days. Or stop wearing white :)
  • Plain old beige-box gaming PC with an 850W power supply in it? Nosirree, it'll cost $12,000 in additional solar-battery to power that, and that'll pay for a new Mac Pro 3 times over. So you shop for a PC optimized for energy use and doing what you need, e.g. Mac Mini + external VPU that you only switch on while gaming.
  • A 55 watt T5 fluorescent light you already have, vs a 44 watt LED light for $50: 11 watts saved, except it's only on 10 minutes a day, so about 0.08 watts on average. Not worth the $50; don't upgrade.

Every load gets evaluated like that.