Electrical – Why is the amperage lower for higher wattage devices

electrical

Can anyone explain how electrical amperage relates to device wattage? From what I understand, amperage is the amount of electrical current running through the circuit (if I imagine the AC current as a sine wave, then the amplitude is the distance from the baseline to the peak of the sine wave, though this gets a bit more complex with RMS amplitude).
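To check my understanding of the RMS part, here's a quick Python sketch (assuming a pure sine wave, and using the UK's nominal 240 V RMS; the same √2 relation applies to current):

```python
import math

# Mains values are quoted as RMS; for a pure sine wave the
# instantaneous peak is the RMS value times sqrt(2).
V_RMS = 240.0  # UK nominal supply voltage, RMS

v_peak = V_RMS * math.sqrt(2)
print(f"{V_RMS:.0f} V RMS -> peak of about {v_peak:.0f} V")  # ~339 V
```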

As far as I understand wattage, that's the rate at which a device consumes energy. So for example my oven is a 12 kilowatt oven, which means it'll consume 12 kilowatt-hours of energy every hour it runs flat out? Is that right?
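Putting units on that (a quick sketch of my reasoning; kW is a rate of energy use, and kWh is that rate sustained for an hour):

```python
# Power (kW) is a rate; energy (kWh) is that rate sustained over time.
power_kw = 12.0  # the oven's nameplate rating
hours = 1.0      # time spent at full power

energy_kwh = power_kw * hours
print(f"{power_kw:.0f} kW for {hours:.0f} h = {energy_kwh:.0f} kWh of energy")
```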

So my question is: why is my oven on a 30 amp circuit when it's 12 kW, while my 8 kW shower is on a 45 amp circuit?

Oh yes, I'm in the UK by the way, so it's 240 volts into the house.

Best Answer

(Both of those circuits are 240 V. In the U.S. that would mean two hot wires on a double-pole breaker, since U.S. power is 120 V per leg; it used to be 110 V, then 115. In the UK, 240 V arrives as a single phase: one live conductor and a neutral.)

The oven's nameplate wattage is more than the circuit can actually deliver: 30 A × 240 V = 7,200 W. Any more than that for an extended period of time and the breaker will trip. You aren't getting a continuous 12,000 W out of that circuit unless your breaker has failed, and then you'd be at risk of an electrical fire.
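To put numbers on it, the current each load would draw follows from I = P / V. A quick sketch (it treats the loads as purely resistive, which heating elements essentially are):

```python
V_SUPPLY = 240.0  # UK nominal mains voltage (RMS)

appliances = {
    "oven (nameplate)": 12_000.0,  # watts
    "shower": 8_000.0,
    "30 A circuit limit": 30 * V_SUPPLY,
}

for name, watts in appliances.items():
    amps = watts / V_SUPPLY  # I = P / V for a resistive load
    print(f"{name}: {watts / 1000:.1f} kW -> {amps:.1f} A")
```

The oven's full nameplate rating would need 50 A, well over what a 30 A circuit can carry continuously, while the shower's 33.3 A fits comfortably on a 45 A circuit.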

Since the oven heats air and its thermostat cycles the elements on and off, its average draw is far below the nameplate figure. An electric shower, by contrast, heats flowing water continuously at its full rated power, and water requires a great deal of energy to raise in temperature. Hence the smaller wire to the oven and the larger circuit for the shower.
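For a feel of why heating water takes so much continuous power, here's a back-of-the-envelope sketch using water's specific heat; the flow rate and temperature rise are assumed figures, roughly typical for an electric shower:

```python
# Back-of-the-envelope: power needed to heat flowing water.
SPECIFIC_HEAT_WATER = 4186.0  # J per kg per degree C

flow_kg_per_s = 0.07  # ~4.2 L/min shower flow (assumed)
temp_rise_c = 27.0    # e.g. 10 C inlet to 37 C at the head (assumed)

power_w = flow_kg_per_s * SPECIFIC_HEAT_WATER * temp_rise_c
print(f"Continuous power needed: {power_w / 1000:.1f} kW")  # ~7.9 kW
```

Even a modest shower flow works out to roughly 8 kW, and unlike the oven there is no thermostat cycling: the shower draws that power for as long as the water runs.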