Given a 12 V, 100 A battery with a 1000 W inverter, how many amps would a generic standard European 220 V, 500 W appliance draw per hour from the battery itself?
I know wattage equals volts times amps, but in this system I'm wondering which of the voltages I should use in my calculations. Will it draw 2.27 A (500/220) or maybe 41 A (500/12) from the battery itself, or does it use some different, less obvious formula?
(I'm trying to calculate how many off-grid hours I would get with such a setup. Once I know how this calculation works, I can run the numbers for the real appliances I plan on using, and maybe switch to a 24 V battery/inverter system for thinner wires. But those are details until I know how the math itself works.)
Best Answer
You've got a few problems/misconceptions. First, there's no such thing as a 100-amp battery; 100 amp-hours, however, would be typical of a car-battery-sized unit. An amp-hour is one amp for one hour.
Second, you can't get 100 amp-hours out of a 100 Ah lead-acid battery, at least not for very many cycles. That would be 100% depth of discharge, which is particularly destructive to lead-acid batteries. That's something you just have to know about lead-acid: if you regularly use more than about 30% of nameplate capacity, you will be doing damage, shortening battery life and reducing capacity. That's the downside of lead-acid; the upside is the price.
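To put numbers on that, take the ~30% figure at face value and work out what your particular battery actually gives you per cycle:

$$ E_{\text{usable}} = 12\ \text{V} \times 100\ \text{Ah} \times 0.30 = 360\ \text{Wh} $$

Only about 360 Wh of the 1200 Wh on the nameplate is really yours to spend each cycle.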
If you want a battery that does exactly what it says on the tin, try nickel-iron.
And with a solar setup, you typically want three days of battery capacity to bridge you across cloudy/stormy days. It's OK if that dips somewhat into the "don't cycle every day" level, but it means an average day should only dip a lead-acid battery by 15-20%.
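Against the same 1200 Wh nameplate, that daily budget works out to:

$$ 1200\ \text{Wh} \times (0.15 \text{ to } 0.20) = 180 \text{ to } 240\ \text{Wh per day} $$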
Another thing you'll need to know is that inverters take a significant amount of power simply to be spun up. (Inverters don't literally spin, but they draw standby power as if they did.)
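To see why that matters, suppose the standby draw is 10 W; that's a figure I'm assuming purely for illustration, so check your inverter's datasheet. Left on around the clock:

$$ 10\ \text{W} \times 24\ \text{h} = 240\ \text{Wh per day} $$

which on its own would eat the entire daily budget computed above.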
Golly. Engineering is hard. So now you need to be a little more cunning. For instance, shift as many loads as possible to low-voltage DC (lights: easy; internet router: easy; TV: possible with many models; so at least you have Netflix). For the fridge, they do make 12 V fridges, but you could also "hack" the fridge with an extra thermal sensor that spins up the inverter only when the fridge needs it.
Anyway, to get back to your question: wattage is the quantity that stays consistent across voltages (not counting conversion losses, i.e. that inverter).
Watts = Volts × Amps, and the watts don't change. So if the volts change, the amps must change too: whenever you change voltage, recalculate the amps (or vice versa).
Your 500 W appliance is a 500 W appliance at any voltage. So at 220 V it draws 2.27 A, and at 12 V it draws about 41 A.
Toss in another 10% every time you change voltages. So at 12 V, figure on 550 W, or about 46 A.
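Tying it back to your original question, here's a minimal sketch of the whole calculation in Python, using the numbers from this answer (the variable names are my own):

```python
# Rough off-grid runtime estimate: a 12 V battery feeding a 220 V load
# through an inverter. Battery and load specs are from the question;
# the 30% depth of discharge and 10% conversion loss are the figures above.

BATTERY_VOLTS = 12.0   # V, battery bus voltage
BATTERY_AH = 100.0     # Ah, nameplate capacity
MAX_DOD = 0.30         # use only ~30% of a lead-acid battery's capacity
LOAD_WATTS = 500.0     # W; the same 500 W at any voltage

ac_amps = LOAD_WATTS / 220.0                      # ~2.27 A on the 220 V side
battery_watts = LOAD_WATTS * 1.10                 # 550 W after the 10% loss
battery_amps = battery_watts / BATTERY_VOLTS      # ~46 A on the 12 V side

usable_wh = BATTERY_VOLTS * BATTERY_AH * MAX_DOD  # 360 Wh actually spendable
runtime_hours = usable_wh / battery_watts         # ~0.65 h

print(f"220 V side: {ac_amps:.2f} A; 12 V side: {battery_amps:.0f} A")
print(f"Runtime within {MAX_DOD:.0%} depth of discharge: {runtime_hours:.2f} h")
```

So figure on roughly 40 minutes of that 500 W appliance per safe cycle, and less once the inverter's standby draw is counted.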
One more big nit to pick. You are running a "generic standard 500 W appliance". I know that's an academic number, but academically...
You do not run "generic standard appliances".
You re-evaluate each load, carefully weighing the wattage it draws against what the most efficient appliances draw. If you want to keep the inefficient appliance, how much will it cost to provision the extra power? Compare that to the cost of upgrading.
Say you have an old Westinghouse fridge that draws 300 W, while the very best efficient fridges draw 50 W. Upgrading saves 250 W but costs $600. Well, here's a question for you: how much will 250 W of additional solar/battery capacity cost you? Is it more than $600? Yes. Yes it is. Goodbye, Westinghouse; congrats on your new fridge!
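Here's that comparison as a sketch in code; the $4-per-watt all-in cost for extra solar-plus-battery capacity is a number I've assumed purely for illustration, not a quote:

```python
# Upgrade the appliance, or provision more power?
# Prices are illustrative assumptions, not quotes.

watts_saved = 300 - 50        # W: old fridge vs. best efficient fridge
new_fridge_cost = 600.0       # $: price of the efficient fridge
capacity_cost_per_watt = 4.0  # $/W: assumed all-in solar + battery cost

provisioning_cost = watts_saved * capacity_cost_per_watt  # $1000
if provisioning_cost > new_fridge_cost:
    print("Buy the new fridge.")  # $1000 > $600: goodbye, Westinghouse
else:
    print("Keep the old fridge and add capacity.")
```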
Likewise, every load gets evaluated like that.